Comparative effectiveness of membrane technologies and disinfection methods for virus elimination in water: A review

Chao Chen, Lihui Guo, Yu Yang, Kumiko Oguma, Li-An Hou


The pandemic of the 2019 novel coronavirus disease (COVID-19) has brought viruses into the public eye. Since viruses can pose a threat to human health even at low concentrations, the search for efficient virus removal methods has been a research hotspot in recent years. Herein, a total of 1060 research papers were collected from the Web of Science database to identify technological trends and the current state of research. Based on this analysis, this review elaborates on the state of the art of membrane filtration and disinfection technologies for the treatment of virus-containing wastewater and drinking water. The results show that membrane and disinfection methods achieve a broad range of virus removal efficiencies (0.5-7 log reduction values (LRVs) and 0.09-8 LRVs, respectively), attributable to the varied interactions between membranes or disinfectants and viruses, whose capsid proteins and nucleic acids differ in susceptibility. Moreover, this review discusses the related challenges and the potential of membrane and disinfection technologies for customized virus removal in order to prevent the dissemination of waterborne diseases.
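The log reduction value (LRV) used above is simply the base-10 logarithm of the influent-to-effluent concentration ratio. A minimal sketch of the arithmetic (the function name and the sample concentrations are illustrative, not taken from the review):

```python
import math

def log_reduction_value(c_in: float, c_out: float) -> float:
    """LRV = log10(influent concentration / effluent concentration)."""
    return math.log10(c_in / c_out)

# Reducing a hypothetical 1e6 virus particles/L to 1e2 particles/L
# corresponds to a 4-log removal.
print(log_reduction_value(1e6, 1e2))  # → 4.0
```

On this scale, the 0.5-7 LRV range reported for membranes spans removal of roughly 68% to 99.99999% of viruses.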

Keywords: Disinfection; Drinking water treatment; Membrane; Virus removal; Wastewater treatment.

Relationships between regulated DBPs and emerging DBPs of health concern in U.S. drinking water

Stuart W. Krasner, Ai Jia, Chih-Fen T. Lee, Raha Shirkhani, Joshua M. Allen, Susan D. Richardson, Michael J. Plewa


A survey was conducted at eight U.S. drinking water plants that spanned a wide range of water qualities and treatment/disinfection practices. Plants that treated heavily wastewater-impacted source waters had lower trihalomethane (THM) to dihaloacetonitrile ratios due to the presence of more organic nitrogen and haloacetonitrile (HAN) precursors. As the bromide to total organic carbon ratio increased, there was more bromine incorporation into DBPs. This has been shown in other studies for THMs and selected emerging DBPs (HANs), whereas this study examined bromine incorporation for a wider group of emerging DBPs (haloacetaldehydes, halonitromethanes). Moreover, bromine incorporation into the emerging DBPs was, in general, similar to that of the THMs. Associations observed in epidemiology studies between adverse health effects and brominated THMs may be due to the formation of brominated emerging DBPs of health concern. Plants with longer free chlorine contact times before ammonia addition to form chloramines had less iodinated DBP formation in chloraminated distribution systems, as there was more oxidation of the iodide to iodate (a sink for the iodide) by the chlorine. This has been shown in many bench-scale studies (primarily for iodinated THMs), but seldom in full-scale studies (where this study also showed the impact on total organic iodine). Collectively, the THMs, haloacetic acids, and emerging DBPs accounted for a significant portion of the TOCl, TOBr, and TOI; however, ∼50% of the TOCl and TOBr is still unknown. The correlation of the sum of detected DBPs with the TOCl and TOBr suggests that they can be used as reliable surrogates.

Occurrence of nitrosamines and their precursors in North American drinking waters.

Krasner, S. W.; Roback, S.; (…); Bukhari, Z.

2020 | AWWA Water Science

Eight N-nitrosamines were measured at 37 water plants in the United States and Canada. Five tobacco-specific nitrosamines (TSNAs) were measured in selected waters. N-Nitrosodimethylamine (NDMA) was preferentially formed in chloraminated systems (maximum detention time: median 4.4 ng/L). A small amount was detected in some chlorinated systems (90th percentile <2.0 ng/L). After ozone (before chloramines), NDMA was sometimes detected (90th percentile 2.9 ng/L), suggesting that the ozone reacted with precursors to form NDMA. The chloramine plants that temporarily switched to chlorine typically produced less NDMA (Plant 29 reduced NDMA formation, on average, from 34 to 4 ng/L). More NDMA was produced during spring runoff, when there were elevated levels of ammonia and NDMA precursors in the source water. More NDMA was also formed when higher doses of poly(diallyldimethylammonium chloride) (polyDADMAC) were used. N-Nitrosomorpholine was found to be a contaminant and not a disinfection byproduct; it did not increase during chloramination. TSNAs were produced during spring runoff; source water ammonia impacted the chlor(am)ine chemistry. © 2020 American Water Works Association

Associations between private well water and community water supply arsenic concentrations in the conterminous United States

Authors: Spaur, Maya; Lombard, Melissa A.; Ayotte, Joseph D.; et al.


Geogenic arsenic contamination typically occurs in groundwater as opposed to surface water supplies. Groundwater is a major source for many community water systems (CWSs) in the United States (US). Although the US Environmental Protection Agency sets the maximum contaminant level (MCL enforceable since 2006: 10 μg/L) for arsenic in CWSs, private wells are not federally regulated. We evaluated county-level associations between modeled values of the probability of private well arsenic exceeding 10 μg/L and CWS arsenic concentrations for 2231 counties in the conterminous US, using time invariant private well arsenic estimates and CWS arsenic estimates for two time periods. Nationwide, county-level CWS arsenic concentrations increased by 8.4 μg/L per 100% increase in the probability of private well arsenic exceeding 10 μg/L for 2006-2008 (the initial compliance monitoring period after MCL implementation), and by 7.3 μg/L for 2009-2011 (the second monitoring period following MCL implementation) (1.1 μg/L mean decline over time). Regional differences in this temporal decline suggest that interventions to implement the MCL were more pronounced in regions served primarily by groundwater. The strong association between private well and CWS arsenic in Rural, American Indian, and Semi Urban, Hispanic counties suggests that future research and regulatory support are needed to reduce water arsenic exposures in these vulnerable subpopulations. This comparison of arsenic exposure values from major private and public drinking water sources nationwide is critical to future assessments of drinking water arsenic exposure and health outcomes.
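The county-level association reported above can be read as a linear slope: μg/L of CWS arsenic per unit change in the probability that private well arsenic exceeds 10 μg/L. A hedged sketch of that arithmetic (the slopes are from the abstract; the 25-percentage-point difference between counties is a hypothetical example):

```python
def predicted_cws_arsenic_diff(prob_diff: float, slope: float) -> float:
    """Predicted difference in CWS arsenic (ug/L) between counties.

    prob_diff: difference in the probability that private well arsenic
               exceeds 10 ug/L (1.0 = 100 percentage points).
    slope:     ug/L per 100% probability increase (8.4 for 2006-2008,
               7.3 for 2009-2011, per the abstract).
    """
    return prob_diff * slope

# A hypothetical 25-percentage-point difference between two counties,
# using the 2006-2008 slope:
print(predicted_cws_arsenic_diff(0.25, 8.4))  # ≈ 2.1 ug/L
```

The 1.1 μg/L mean decline over time corresponds to the drop in slope (8.4 to 7.3) applied across the counties studied.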

What Is Groundwater? How to Manage and Protect Groundwater Resources

Authors: Lachassagne, Patrick

Abstract: Among the water resources on earth, groundwater is a resource hidden in the rocks of the earth’s crust. For various reasons, notably the fact that this water is not directly visible, but also as a consequence of education and longstanding traditions, the properties and physical laws governing groundwater are not well known outside the circle of hydrogeologists, the scientists specializing in the survey, management, and protection of groundwater resources. This resource has many advantages, notably when compared to surface water, and is thus widely used worldwide for many purposes: agriculture, tap water, industry, bottling, etc. In fact, this resource is available year-round, even during the dry season and in arid countries, and is well protected from surface contamination. It needs, however, to be appropriately managed and protected to ensure its sustainability (quantity and quality). Thus, this study intends to provide the basics of groundwater science, “hydrogeology.” It is illustrated by examples taken from the Evian Natural Mineral Water, which is groundwater, and the way it is managed and protected. Groundwater is a sustainable water resource belonging to the earth’s water cycle, which flows thanks to the natural energy provided by the sun. The main physical processes of the groundwater cycle are the infiltration of rainwater into the soil, its slow flow within the pervious rocks of the earth’s crust, called “aquifers,” and finally its natural outflow at springs and into rivers. It can also be reached with man-made wells and pumped. Groundwater contains dissolved minerals that are mostly the result of interactions between the water and the aquifers’ rocks.


Effect of concentration on virus removal for ultrafiltration membrane in drinking water production

Authors: Jacquet, N.; Wurtzer, S.; Darracq, G.; et al.

The removal of pathogenic microorganisms such as viruses during drinking water production was evaluated by ultrafiltration. Two enteric viruses (ADV 41 and CV-B5) were compared to the MS2 bacteriophage, widely used in the literature and by membrane producers as an enteric virus surrogate. The effect of the feed concentration of viruses on ultrafiltration efficiency was assessed. For the three viruses, low retentions of about 1 log were observed at the lowest concentrations. At higher concentrations, an increase in removal up to 3.0 log for CV-B5 and MS2 phage and 3.5 log for ADV 41 was observed. These results highlight the potential overestimation of UF efficiency in laboratory experiments performed at high concentrations, compared to the low concentrations found in the environmental resources used for drinking water production. Virus removals with Evian water and real groundwater were compared, and groundwater achieved similar or slightly higher removals for the three viruses. Finally, the impact of membrane ageing after chlorine exposure was checked. It was observed that membrane degradation, visible as a water permeability increase with exposure dose, did not affect the removal of viruses at low feed concentrations.

Tracking reduction of water lead levels in two homes during the Flint Federal Emergency.

By: Mantha, A.; Tang, M.; Pieper, K. J.; et al.

Water Research X, Volume 7, 100047, Published 2020



A Federal Emergency was declared in Flint, MI, on January 16, 2016, 18 months after a switch to Flint River source water without phosphate corrosion control. Remedial actions to resolve the corresponding lead-in-water crisis included reconnection to the original Lake Huron source water with orthophosphate, implementing enhanced corrosion control by dosing extra orthophosphate, a “Flush for Flint” program to help clean out loose leaded sediment from service lines and premise plumbing, and eventually lead service line replacement. Independent sampling over a period of 37 months (January 2016-February 2019) was conducted by the United States Environmental Protection Agency and Virginia Tech to evaluate possible human exposure via normal-flow (~2-3 L/min) sampling at the cold kitchen tap, and to examine the status of loose deposits from the service line and the premise plumbing via high-velocity flushing (~12-13 L/min) from the hose bib. The sampling results indicated that high lead in water persisted for more than a year in two Flint homes due to a large reservoir of lead deposits. The effects of this large reservoir of loose lead deposits persisted until the lead service line was completely removed in these two anomalous homes. As water conservation efforts are implemented in many areas of the country, problems with mobile lead reservoirs in service lines are likely to pose a human health risk. All rights reserved, Elsevier.

(Re)theorizing the Politics of Bottled Water: Water Insecurity in the Context of Weak Regulatory Regimes

Raul Pacheco-Vega – Public Administration Division, Centro de Investigación y Docencia Económicas (CIDE), Sede Región Centro, Aguascalientes 20313 Ciudad de México 01210, Mexico

DOI: 10.3390/w11040658


Water insecurity in developing country contexts has frequently led individuals and entire communities to shift their consumption patterns towards bottled water. Bottled water is sometimes touted as a mechanism to enact the human right to water through distribution across drought-stricken or infrastructure-compromised communities. However, the global bottled water industry is a major multi-billion-dollar business. How did we reach a point where the commodification of a human right became not only commonly accepted but even promoted? In this paper, I argue that a discussion of the politics of bottled water necessitates a re-theorization of what constitutes “the political” and how politics affects policy decisions regarding the governance of bottled water. In this article I examine bottled water as a political phenomenon that occurs not in a vacuum but in a poorly regulated context. I explore the role of weakened regulatory regimes and regulatory capture in the emergence, consolidation and, ultimately, supremacy of bottled water over network-distributed tap water delivered by a public utility. My argument uses a combined framework that interweaves notions of “the political”, ideas on regulatory capture, the concept of “the public”, branding, and regulation theory to retheorize how we conceptualize the politics of bottled water. © 2019 by the authors. Licensee MDPI, Basel, Switzerland.

Development and application of relevance and reliability criteria for water treatment removal efficiencies of chemicals of emerging concern

Authors: Fischer, A; Wezel, AP; Hollender, J; Cornelissen, E; Hofman, R; van der Hoek, JP

WATER RESEARCH

DOI: 10.1016/j.watres.2019.05.088


With the growth in production and use of chemicals and the fact that many end up in the aquatic environment, there is an increasing need for advanced water treatment technologies that can remove chemicals of emerging concern (CECs) from water. The current lack of a homogeneous approach for testing advanced water treatment technologies hampers the interpretation and evaluation of CEC removal efficiency data, and hinders informed decision making by stakeholders with regard to which treatment technology could satisfy their specific needs. Here a data evaluation framework is proposed to improve the use of current knowledge in the field of advanced water treatment technologies for drinking water and wastewater, consisting of a set of 9 relevance criteria and 51 reliability criteria. The two criteria sets underpin a thorough, unbiased and standardised method to select studies to evaluate and compare the CEC removal efficiency of advanced water treatment technologies in a scientifically sound way. The relevance criteria set was applied to 244 papers on removal efficiency, of which only 20% fulfilled the criteria. The reliability criteria were applied to the remaining papers. In general, these criteria were fulfilled with regard to information on the target compound, the water matrix and the treatment process conditions. However, there was a lack of information on data interpretation and statistics. In conclusion, a minority of the evaluated papers are suited for comparison across techniques, compounds and water matrices. There is a clear need for more uniform reporting of water treatment studies for CEC removal. In the future this will benefit the selection of appropriate technologies. (C) 2019 The Authors. Published by Elsevier Ltd.


A Perspective on the History of Environmental Regulations-Successes and Challenges in Reclaiming Polluted Waters

By: Bhatti, MI (Bhatti, M. Ilyas)


Edited by: Scott, GF; Hamilton, W

Pages: 160-165

Published: 2019

Document Type:Proceedings Paper


Conference: World Environmental and Water Resources Congress / 19th Annual Congress of the Environmental-and-Water-Resources-Institute (EWRI) / EWRI History and Heritage Symposium

Location: Pittsburgh, PA

Date: MAY 19-23, 2019

Sponsor(s): Environm & Water Resources Inst; Amer Soc Civil Engineers


There is a perennial political debate over whether environmental regulations inhibit economic progress. However, if we look closely at the situation prior to the enactment of major congressional actions in the early 1970s, one can understand that the pollution and contamination of our environment had reached an epidemic level that propelled the environmental movement of the 1960s. Rachel Carson, in her book Silent Spring, captured the impact of the indiscriminate use of pesticides such as DDT. Her book served as a wake-up call for the nation to come together to fight for the protection of our environmental resources before pollution completely destroyed the water we drink and the air we breathe. The author focuses on the environmental movement of the 1960s in the United States, which resulted in the passage of a number of important pieces of legislation such as the Clean Water Act and the Safe Drinking Water Act. The author examines the success of the early planning efforts undertaken under Section 208 of the Clean Water Act in developing wastewater management plans throughout the United States, with specific focus on wastewater treatment and disposal activities in Massachusetts. This paper also examines the deterioration of our water and wastewater infrastructure after the successes of the 1980s, and argues that more robust policy and leadership are needed at the national and local levels to check this trend. Not only is public health endangered, but economic gains could also be threatened.

Efficacy of Flushing and Chlorination in Removing Microorganisms from a Pilot Drinking Water Distribution System

By: van Bel, Nikki; Hornstra, Luc M.; van der Veen, Anita; Medema, Gertjan


Volume: 11

Issue: 5

Article Number: 903

DOI: 10.3390/w11050903

Published: MAY 2019

Document Type:Article


To ensure delivery of microbiologically safe drinking water, the physical integrity of the distribution system is an important control measure. During repair works or an incident the drinking water pipe is open and microbiologically contaminated water or soil may enter. Before taking the pipe back into service it must be cleaned. The efficacy of flushing and shock chlorination was tested using a model pipe-loop system with a natural or cultured biofilm to which a microbial contamination (Escherichia coli, Clostridium perfringens spores and phiX174) was added. On average, flushing removed 1.5-2.7 log microorganisms from the water, but not the biofilm. In addition, sand added to the system was not completely removed. Flushing velocity (0.3 or 1.5 m/s) did not affect the efficacy. Shock chlorination (10 mg/L, 1-24 h) was very effective against E. coli and phiX174, but C. perfringens spores were partly resistant. Chlorination was slightly more effective in pipes with a natural compared to a cultured biofilm. Flushing alone is thus not sufficient after high risk repair works or incidents, and shock chlorination should be considered to remove microorganisms to ensure microbiologically safe drinking water. Prevention via hygienic working procedures, localizing and isolating the contamination source and issuing boil water advisories remain important, especially during confirmed contamination events.


Epidemics caused by contamination of drinking water supplied by public water supply systems in terms of current legislation

Vít Vlček
Central Institute for Supervising and Testing in Agriculture, Brno, Czech Republic


Objectives. This paper describes and comments on contemporary legislation concerning the prevention of epidemics caused by contaminated drinking water from public water supplies in the Czech Republic. Methods. Suggestions are made for removing existing legislative shortcomings, clarifying the wording of existing laws, and expanding sanctions and penalties for health injury caused by providers and operators of public drinking water supplies. Results. The author reflects on improving legislation concerning the compensation of victims of contaminated water, with reference to the aftermath of a local epidemic in the Dejvice District of Prague. Conclusion. The issues raised should be addressed, since better legislation can significantly contribute to limiting water-borne epidemics and their consequences.


Seasonal Variation of Water Quality in Unregulated Domestic Wells

Mel and Enid Zuckerman College of Public Health, University of Arizona, 1295 N. Martin Ave., Tucson, AZ 85724, USA
Friends of the Santa Cruz River, P.O. Box 4275, Tubac, AZ 85646, USA
Department of Soil, Water & Environmental Science, University of Arizona, Tucson, AZ 85721-0038, USA
Department of Chemistry and Biochemistry, University of Arizona, Tucson, AZ 85721-0041, USA
Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2019, 16(9), 1569;
Received: 12 April 2019 / Revised: 30 April 2019 / Accepted: 1 May 2019 / Published: 5 May 2019
(This article belongs to the Special Issue Drinking Water and Health Risks)


In the United States (U.S.), up to 14% of the population depend on private wells as their primary drinking water source. The U.S. government does not regulate contaminants in private wells. The goals of this study were to investigate the quality of drinking water from unregulated private wells within one mile (1.6 kilometers) of an effluent-dominated river in the arid Southwest, determine differences in contaminant levels between wet and dry seasons, and identify contributions from human sources by specifically measuring man-made organic contaminants (perfluorooctanoic acid (PFOA), perfluorooctane sulfonate (PFOS), and sucralose). Samples were collected during two dry seasons and two wet seasons over the course of two years and analyzed for microbial (Escherichia coli), inorganic (arsenic, cadmium, chromium, copper, lead, mercury, nitrate), and synthetic organic (PFOA, PFOS, and sucralose) contaminants. Arsenic, nitrate, and Escherichia coli concentrations exceeded their regulatory levels of 0.01 mg/L, 10 mg/L, and 1 colony forming unit (CFU)/100 mL, respectively. The measured concentrations of PFOA and PFOS exceeded their respective Public Health Advisory levels. Arsenic, PFOA, PFOS, and sucralose were significantly higher during the dry seasons, whereas E. coli was higher during the wet seasons. While some contaminants were correlated (e.g., As and Hg ρ = 0.87; PFOA and PFOS ρ = 0.45), the lack of correlation between different contaminant types indicates that they may arise from different sources. Multi-faceted interventions are needed to reduce exposure to drinking water above health-based guidelines.
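The ρ values quoted for the contaminant pairs are Spearman rank correlations, which measure monotonic association between two sets of measurements. A minimal pure-Python sketch of the computation (the arrays are made-up illustrative values, not the study's data, and the helper assumes no tied values):

```python
def rankdata(xs):
    """1-based ranks of the values in xs; assumes no ties for simplicity."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    for r, i in enumerate(order, start=1):
        ranks[i] = float(r)
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# A perfectly monotonic pair of hypothetical concentration series
# gives rho = 1.0, the upper bound of the statistic.
print(spearman_rho([1.0, 2.5, 3.1, 7.0], [0.2, 0.4, 0.9, 1.5]))
```

In practice a library routine such as SciPy's `spearmanr` would be used, since it also handles ties and reports a p-value.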


By: Damian, Laura; Patachia, Silvia; Scarneciu, Ioan


Volume: 18

Issue: 5

Pages: 1089-1095

Published: MAY 2019

Document Type:Article


This study presents the influence of the storage containers’ material, and of the use and type of stirring, on drinking water quality. The kinetics of drinking water quality alteration under stationary conditions and under magnetic and ultrasonic stirring were monitored over a two-week period. The microbiological parameters (total number of germs developed at 37 degrees C and 22 degrees C, lactose-positive and lactose-negative bacteria, coliform bacteria, and Escherichia coli), as well as the physico-chemical ones (turbidity and chlorine content), were determined on a daily basis, indicating different degrees of alteration of the drinking water as a function of storage period and regime. It was found that glass does not stimulate microbial growth, while polyethylene containers represent a high risk factor from the bacterial growth point of view. Mechanical stirring as well as sonication are able to significantly reduce the formation of biofilm on the walls of the storage tanks, regardless of the material from which the containers are made. Sonication proved inefficient for water stored in polyethylene containers, due to the increase in temperature and, consequently, in planktonic bacterial activity.


Understanding nitrate contamination based on the relationship between changes in groundwater levels and changes in water quality with precipitation fluctuations

By: Kawagoshi, Yasunori; Suenaga, Yuichi; Nguyen Linh Chi; Hama, Takehide; Ito, Hiroaki; Luong Van Duc


Volume: 657

Pages: 146-153

DOI: 10.1016/j.scitotenv.2018.12.041

Published: MAR 20 2019

Document Type:Article


There are growing concerns about nitrate contamination in Kumamoto City, where >700,000 people completely depend on groundwater as a source of drinking water. We found that some groundwater samples showed considerably different nitrate concentrations although their sampling locations were close to one another, and we speculated that this phenomenon was due to differences in subsurface geological properties. In order to verify this hypothesis, we carried out temporally intensive long-term monitoring of the groundwater levels and water quality at three closely located sampling wells, and the results revealed that the changes in water level and water quality differed at each well. The water level at well T1, where nitrate concentrations ranged from 12 to 26 mg N/L, showed a significantly sensitive and unique response to heavy rain, which indicated that the subsurface at this site might be highly permeable; this would have allowed the influent water to easily reach the groundwater aquifer over a short period. However, wells T2 and T3, which were located within 0.6 and 1.9 km of well T1, respectively, had nitrate concentrations that were lower than that in well T1 (4.5-8.0 mg N/L) and showed only gradual responses to heavy rain. These observations suggest that the highly permeable subsurface properties in the vicinity of well T1 contributed to the more serious nitrate contamination there than at wells T2 and T3. This study demonstrates the importance of temporally intensive, long-term monitoring for capturing changes in groundwater level and water quality with precipitation fluctuations, and we showed how this approach can lead to a better understanding of the nitrate contamination situation. (C) 2018 Elsevier B.V. All rights reserved.

Review of perchlorate occurrence in large public drinking water systems in the United States of America

By: Luis, Steven J.; Miesner, Elizabeth A.; Enslin, Clarissa L.; Heidecorn, Keith


Volume: 19

Issue: 3

Pages: 681-694

DOI: 10.2166/ws.2018.135

Published: MAY 2019

Document Type:Review


When deciding whether or not to regulate a chemical, regulatory bodies often evaluate the degree to which the public may be exposed by evaluating the chemical’s occurrence in food and drinking water. As part of its decision-making process, the United States Environmental Protection Agency (USEPA) evaluated the occurrence of perchlorate in public drinking water by sampling public water systems (PWSs) as part of the first implementation of the Unregulated Contaminant Monitoring Rule (UCMR 1) between 2001 and 2005. The objective of this paper is to evaluate the current representativeness of the UCMR 1 dataset. To achieve this objective, publicly available sources were searched to obtain updated perchlorate data for the majority of large PWSs with perchlorate detections under UCMR 1. Comparison of the updated and UCMR 1 perchlorate datasets shows that the UCMR 1 dataset is no longer representative because the extent and degree of occurrence has decreased since implementation of UCMR 1. Given this finding, it seems appropriate for regulatory bodies engaged in decision-making processes over several years to periodically re-evaluate the conditions that prompted the regulatory effort, thereby ensuring that rules and regulations address actual conditions of concern.

Finished Water Storage and Quality Concerns

By Kelly A. Reynolds, MSPH, PhD

The municipal drinking-water distribution system is a complex delivery network designed to meet the potable water needs of entire communities. Much information has been published on concerns about distribution system integrity and the ability to provide safe, consistent water to consumers. Infrastructure improvement needs, rapid response to main breaks and leaks, biofilm control, prevention of intrusion events, dead legs, and pressure losses are just some of the prevalent water quality delivery issues. Less common are discussions of safe water storage prior to delivery. Although industry standards and guidelines exist, maintaining water quality over prolonged storage presents additional challenges and uncertainties for end users.

Comparative case study of legislative attempts to require private well testing in New Jersey and Maine

Flanagan, Sara V.; Zheng, Yan

ENVIRONMENTAL SCIENCE & POLICY, 85 40-46; SI 10.1016/j.envsci.2018.03.022 JUL 2018

Abstract: At present one of the greatest barriers to reducing exposure to naturally occurring arsenic from unregulated private well water is a lack of well testing. The New Jersey Private Well Testing Act (PWTA) has since 2002 required testing during real estate transactions. Due to limitations in relying on individual well owners to take protective actions, such state-wide testing regulations have been shown to make a significant contribution towards exposure reduction. This study examines the New Jersey PWTA as a case of testing requirements successfully adopted into law, and failed attempts to pass equivalent requirements in Maine for comparison. Although New Jersey’s long history of drinking water quality problems due to population density, an industrial past, and vulnerable aquifers was the root of the PWTA and earlier local testing ordinances, several high-profile events immediately prior focused public and legislator attention and mobilized environmental advocacy groups to gain political support statewide. Viewed through Kingdon’s Multiple Streams framework, the PWTA was the result of problem, policy, and politics streams successfully aligned during a significant and unique political window of opportunity. In Maine, where naturally occurring arsenic, not industrial contamination, is the primary concern, private sector opposition and a conservative administration resistant to government involvement in “private” well water, all played a role in blocking legislative attempts to require testing. A modest education and outreach bill without testing mandates passed in 2017 after compromise among stakeholders. For policy to be an effective tool to achieve universal well water screening, a philosophical evolution on the role of government in private water may be necessary.

Occurrence of Legionella spp. in Water-Main Biofilms from Two Drinking Water Distribution Systems

Waak, Michael B.; LaPara, Timothy M.; Halle, Cynthia; Hozalski, Raymond M.

ENVIRONMENTAL SCIENCE & TECHNOLOGY, 52 (14):7630-7639; 10.1021/acs.est.8b01170 JUL 17 2018

Abstract: The maintenance of a chlorine or chloramine residual to suppress waterborne pathogens in drinking water distribution systems is common practice in the United States but less common in Europe. In this study, we investigated the occurrence of Bacteria and Legionella spp. in water-main biofilms and tap water from a chloraminated distribution system in the United States and a system in Norway with no residual using real-time quantitative polymerase chain reaction (qPCR). Despite generally higher temperatures and assimilable organic carbon levels in the chloraminated system, total Bacteria and Legionella spp. were significantly lower in water-main biofilms and tap water of that system (p < 0.05). Legionella spp. were not detected in the biofilms of the chloraminated system (0 of 35 samples) but were frequently detected in biofilms from the no-residual system (10 of 23 samples; maximum concentration = 7.8 × 10^4 gene copies/cm^2). This investigation suggests water-main biofilms may serve as a source of Legionella for tap water and premise plumbing systems, and residual chloramine may aid in reducing their abundance.


Detection of Pathogenic and Non-pathogenic Bacteria in Drinking Water and Associated Biofilms on the Crow Reservation, Montana, USA

Richards, Crystal L.; Broadaway, Susan C.; Eggers, Margaret J.; Doyle, John; Pyle, Barry H.; Camper, Anne K.; Ford, Timothy E.

MICROBIAL ECOLOGY, 76 (1):52-63; SI 10.1007/s00248-015-0595-6 JUL 2018

Abstract: Private residences in rural areas with water systems that are not adequately regulated, monitored, and updated could have drinking water that poses a health risk. To investigate water quality on the Crow Reservation in Montana, water and biofilm samples were collected from 57 public buildings and private residences served by either treated municipal or individual groundwater well systems. Bacteriological quality was assessed including detection of fecal coliform bacteria and heterotrophic plate count (HPC) as well as three potentially pathogenic bacterial genera, Mycobacterium, Legionella, and Helicobacter. All three target genera were detected in drinking water systems on the Crow Reservation. Species detected included the opportunistic and frank pathogens Mycobacterium avium, Mycobacterium gordonae, Mycobacterium flavescens, Legionella pneumophila, and Helicobacter pylori. Additionally, there was an association between HPC bacteria and the presence of Mycobacterium and Legionella but not the presence of Helicobacter. This research has shown that groundwater and municipal drinking water systems on the Crow Reservation can harbor potential bacterial pathogens.


The Case for Universal Screening of Private Well Water Quality in the U.S. and Testing Requirements to Achieve It: Evidence from Arsenic

Zheng, Yan; Flanagan, S. V.

Environmental Health Perspectives, 125 (8):085002; 10.1289/EHP629 2017


BACKGROUND:The 1974 Safe Drinking Water Act (SDWA) regulates >170,000 public water systems to protect health, but not >13 million private wells. State and local government requirements for private well water testing are rare and inconsistent; the responsibility to ensure water safety remains with individual households. Over the last two decades, geogenic arsenic has emerged as a significant public health concern due to high prevalence in many rural American communities.

OBJECTIVES:We build the case for universal screening of private well water quality around arsenic, the most toxic and widespread of common private water contaminants. We argue that achieving universal screening will require policy intervention, and that testing should be made easy, accessible, and in many cases free to all private well households in the United States, considering the invisible, tasteless, odorless, and thus silent nature of arsenic.

DISCUSSION:Our research has identified behavioral, situational and financial barriers to households managing their own well water safety, resulting in far from universal screening despite traditional public health outreach efforts. We observe significant socioeconomic disparities in arsenic testing and treatment when private water is unregulated. Testing requirements can be a partial answer to these challenges.

CONCLUSIONS:Universal screening, achieved through local testing requirements complemented by greater community engagement targeting biologically and socioeconomically vulnerable groups, would reduce population arsenic exposure greater than any promotional efforts to date. Universal screening of private well water will identify the dangers hidden in America’s drinking water supply and redirect attention to ensure safe water among affected households.

Drinking water microbiome assembly induced by water stagnation

Ling, Fangqiong; Whitaker, Rachel; LeChevallier, Mark W.; Liu, Wen-Tso

ISME JOURNAL, 12 (6):1520-1531; 10.1038/S41396-018-0101-5 JUN 2018


What happens to tap water when you are away from home? Day-to-day water stagnation in building plumbing can potentially result in water quality deterioration (e.g., lead release or pathogen proliferation), which is a major public health concern. However, little is known about the microbial ecosystem processes in plumbing systems, hindering the development of biological monitoring strategies. Here, we track tap water microbiome assembly in situ, showing that bacterial community composition changes rapidly from the city supply following ~6-day stagnation, along with an increase in cell count from 10³ cells/mL to upwards of 7.8 × 10⁵ cells/mL. Remarkably, bacterial community assembly was highly reproducible in this built environment system (median Spearman correlation between temporal replicates = 0.78). Using an island biogeography model, we show that neutral processes arising from the microbial communities in the city water supply (i.e., migration and demographic stochasticity) explained the island community composition in proximal pipes (Goodness-of-fit = 0.48), yet declined as water approached the faucet (Goodness-of-fit = 0.21). We developed a size-effect model to simulate this process, which indicated that pipe diameter drove these changes by mediating the kinetics of hypochlorite decay and cell detachment, affecting selection, migration, and demographic stochasticity. Our study challenges current water quality monitoring practices worldwide, which ignore biological growth in plumbing, and suggests the island biogeography model as a useful framework to evaluate building water system quality.

Economic Assessment of Waterborne Outbreak of Cryptosporidiosis

Chyzheuskaya, A.; Cormican, M.; Raghavendra Srivinas; O’Donovan, D.; Prendergast, M.; O’Donoghue, C.; Morris, D.

Emerging Infectious Diseases, 23 (10):1650-1656; 10.3201/eid2310.152037 2017


In 2007, a waterborne outbreak of Cryptosporidium hominis infection occurred in western Ireland, resulting in 242 laboratory-confirmed cases and an uncertain number of unconfirmed cases. A boil water notice was in place for 158 days and affected 120,432 persons residing in the area, as well as businesses, visitors, and commuters. This was the largest outbreak of cryptosporidiosis in Ireland. The purpose of this study was to evaluate the cost of this outbreak. We adopted a societal perspective in estimating costs associated with the outbreak. The economic cost estimate was based on totaling the direct and indirect costs incurred by public and private agencies. The cost of the outbreak was estimated based on 2007 figures. We estimate that the cost of the outbreak was >€19 million (≈€120,000/day of the outbreak). The US dollar equivalent based on today’s exchange rates would be $22.44 million (≈$142,000/day of the outbreak). This study highlights the economic need for a safe drinking water supply.
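The headline figures above can be sanity-checked with simple arithmetic; the snippet below is an illustrative back-of-envelope calculation using only the totals reported in the abstract, not the study's underlying cost model.

```python
# Back-of-envelope check of the per-day outbreak cost reported above
# (illustrative only; the study's societal cost model is far more detailed).
total_cost_eur = 19_000_000   # reported lower bound: >EUR 19 million
boil_notice_days = 158        # duration of the boil water notice

cost_per_day = total_cost_eur / boil_notice_days
print(f"~EUR {cost_per_day:,.0f} per outbreak day")
```

Dividing the totals reproduces the abstract's ≈€120,000/day figure.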

Distribution System Operational Deficiencies Coincide with Reported Legionnaires’ Disease Clusters in Flint, Michigan

Rhoads, William J.; Garner, Emily; Ji, Pan; Zhu, Ni; Parks, Jeffrey; Schwake, David Otto; Pruden, Amy; Edwards, Marc A.

ENVIRONMENTAL SCIENCE & TECHNOLOGY, 51 (20):11986-11995; 10.1021/acs.est.7b01589 OCT 17 2017

Abstract: We hypothesize that the increase in reported Legionnaires’ disease from June 2014 to November 2015 in Genesee County, MI (where Flint is located) was directly linked to the switch to corrosive Flint River water from noncorrosive Detroit water from April 2014 to October 2015. To address the lack of epidemiological data linking the drinking water supplies to disease incidence, we gathered physicochemical and biological water quality data from 2010 to 2016 to evaluate characteristics of the Flint River water that were potentially conducive to Legionella growth. The treated Flint River water was 8.6 times more corrosive than Detroit water in short-term testing, releasing more iron, which is a key Legionella nutrient, while also directly causing disinfectant to decay more rapidly. The Flint River water source was also 0.8–6.7 °C warmer in summer months than Detroit water and exceeded the minimum Legionella growth temperature of 20 °C more frequently (on average 63 days per year for Detroit versus 157 for the Flint River). The corrosive water also led to 1.3–2.2 times more water main breaks in 2014–2015 compared to 2010–2013; such disruptions have been associated with outbreaks in other locales. Importantly, Legionella spp. and Legionella pneumophila decreased after switching back to Detroit water, in terms of both gene markers and culturability, when August and October 2015 were compared to November 2016.

Assessing the origin of bacteria in tap water and distribution system in an unchlorinated drinking water system by SourceTracker using microbial community fingerprints

Liu, Gang; Zhang, Ya; van der Mark, E.; Magic-Knezev, A.; Pinto, A.; van den Bogert, B.; Liu, Wen-Tso; van der Meer, W.; Medema, G.

Water Research, 138:86-96; 10.1016/j.watres.2018.03.043 2018


The general consensus is that the abundance of tap water bacteria is greatly influenced by water purification and distribution. Those bacteria that are released from biofilm in the distribution system are especially considered as the major potential risk for drinking water bio-safety. For the first time, this full-scale study has captured and identified the proportional contribution of the source water, treated water, and distribution system in shaping the tap water bacterial community based on their microbial community fingerprints using the Bayesian “SourceTracker” method. The bacterial community profiles and diversity analyses illustrated that the water purification process shaped the community of planktonic and suspended particle-associated bacteria in treated water. The bacterial communities associated with suspended particles, loose deposits, and biofilm were similar to each other, while the community of tap water planktonic bacteria varied across different locations in the distribution system. The microbial source tracking results showed that there was no detectable contribution of source water to the bacterial community in the tap water and distribution system. The planktonic bacteria in the treated water were the major contributor to planktonic bacteria in the tap water (17.7–54.1%). The particle-associated bacterial community in the treated water seeded the bacterial community associated with loose deposits (24.9–32.7%) and biofilm (37.8–43.8%) in the distribution system. In turn, the loose deposits and biofilm showed a significant influence on tap water planktonic and particle-associated bacteria, which were location dependent and influenced by hydraulic changes. This was revealed by the increased contribution of loose deposits to tap water planktonic bacteria (from 2.5% to 38.0%) and an increased contribution of biofilm to tap water particle-associated bacteria (from 5.9% to 19.7%) caused by possible hydraulic disturbance from proximal to distal regions.
Therefore, our findings indicate that the tap water bacteria could possibly be managed by selecting and operating the purification process properly and cleaning the distribution system effectively.

Enrichment of free-living amoebae in biofilms developed at upper water levels in drinking water storage towers: An inter- and intra-seasonal study

Taravaud, Alexandre; Ali, Myriam; Lafosse, Bernard; Nicolas, Valerie; Feliers, Cedric; Thibert, Sylvie; Levi, Yves; Loiseau, Philippe M.; Pomel, Sebastien

SCIENCE OF THE TOTAL ENVIRONMENT, 633:157-166; 10.1016/j.scitotenv.2018.03.178 AUG 15 2018


Free-living amoebae (FLA) are ubiquitous organisms present in various natural and artificial environments, such as drinking water storage towers (DWST). Some FLA, such as Acanthamoeba sp., Naegleria fowleri, and Balamuthia mandrillaris, can cause severe infections at the ocular or cerebral level in addition to being potential reservoirs of other pathogens. In this work, the abundance and diversity of FLA were evaluated in two sampling campaigns: one performed over five seasons in three DWST at three different levels (surface, middle, and bottom) in water and biofilm using microscopy and PCR, and one based on kinetic analysis, by phase-contrast and confocal microscopy, of biofilm samples collected every two weeks during a 3-month period at the surface and at the bottom of a DWST. In the seasonal study, FLA were detected in the water of each DWST at densities of ~20 to 25 amoebae L−1. A seasonal variation in amoeba distribution was observed in water samples, with maximal densities in summer at ~30 amoebae L−1 and minimal densities in winter at ~16 amoebae L−1. FLA belonging to the genus Acanthamoeba were detected in two spring sampling campaigns, suggesting a possible seasonal appearance of this potentially pathogenic amoeba. Interestingly, a 1-log increase in amoeba density was observed in biofilm samples collected at the surface of all DWST compared to the middle and the bottom, where FLA were at 0.1–0.2 amoebae/cm². In the kinetics study, an increase in amoeba density, total cell density, and biofilm thickness was observed as a function of time at the surface of the DWST, but not at the bottom. To our knowledge, this study describes for the first time a markedly higher FLA density in biofilms collected at upper water levels in DWST, constituting a potential source of pathogenic micro-organisms.

A Systematic Review of the Time Series Studies Addressing the Endemic Risk of Acute Gastroenteritis According to Drinking Water Operation Conditions in Urban Areas of Developed Countries

Pascal Beaudeau
Santé Publique France, 14 rue du Val-d’Osne, 94415 Saint-Maurice CEDEX, France
Received: 28 February 2018 / Revised: 20 April 2018 / Accepted: 24 April 2018 / Published: 26 April 2018


Time series studies (TSS) can be viewed as an inexpensive way to tackle the non-epidemic health risk from fecal pathogens in tap water in urban areas. Following the PRISMA recommendations, I reviewed TSS addressing the endemic risk of acute gastroenteritis according to drinking water operation conditions in urban areas of developed countries. Eighteen studies were included, covering 17 urban sites (seven in North America and 10 in Europe) with study populations ranging from 50,000 to 9 million people. Most studies used general practitioner consultations or visits to hospitals for acute gastroenteritis (AGE) as health outcomes. In 11 of the 17 sites, a significant and plausible association was found between turbidity (or particle count) in finished water and the AGE indicator. When provided and significant, the interquartile excess relative risk estimates ranged from 3% to 13%. When examined, water temperature, river flow, and produced flow were strongly associated with the AGE indicator. The potential of TSS for the study of the health risk from fecal pathogens in tap water is limited by the lack of specificity of turbidity and its site-sensitive value as an exposure proxy. Nevertheless, at the DWS level, TSS could help water operators identify the operational conditions most at risk, particularly if other water operation indicators, in addition to turbidity, are considered as possible relevant proxies for exposure.

Do estrogenic compounds in drinking water migrating from plastic pipe distribution system pose adverse effects to human? An analysis of scientific literature

Liu, Z., et al., Environmental Science and Pollution Research, 24(2):2126-2134, January 2017

With the widespread application of plastic pipes in drinking water distribution systems, the effects of various leachable organic chemicals have been investigated and their occurrence in drinking water supplies is monitored. Most studies focus on the odor problems these substances may cause. This study investigates the potential endocrine disrupting effects of the migrating compound 2,4-di-tert-butylphenol (2,4-d-t-BP). The summarized results show that the migration of 2,4-d-t-BP from plastic pipes could result in chronic exposure, and that migration levels varied greatly among different plastic pipe materials and manufacturing brands. Based on estrogen equivalent (EEQ), the migrating levels of the leachable compound 2,4-d-t-BP in most plastic pipes were relatively low. However, the EEQ levels in drinking water migrating from four out of 15 pipes may pose significant adverse effects. With the increasingly strict requirements on regulation of drinking water quality, these results indicate that some drinking water transported in plastic pipes may not be safe for human consumption due to the occurrence of 2,4-d-t-BP. Moreover, 2,4-d-t-BP is not the only estrogenic compound migrating from plastic pipes; other compounds such as 2-tert-butylphenol (2-t-BP), 4-tert-butylphenol (4-t-BP), and others may also be leachable.

To Buy or not to Buy? Perceptions of Bottled Drinking Water in Australia and New Zealand

Ragusa, A.T., and Crampton, A., Human Ecology, 44(5):565-576, October 2016

In the midst of popular and scientific debates about its desirability, safety and environmental sustainability, bottled water is forecast to become the most consumed packaged beverage globally (Feliciano 2014) and fastest growth sector in Australia (Johnson 2007). Manufacturers attribute increasing sales to convenience and health benefits rather than intensive advertising/marketing campaigns. Our sociological investigation of drinking water perceptions generally, and bottled water specifically, using data from 192 face-to-face interviews with Australians and New Zealanders, revealed 77 % thought about the quality of their drinking water; 64 % noted specific adverse issues, and 82 % reported concerns with their tap water. However, although 64 % drink bottled water, just 28 % believe it is better than tap water and 63 % consider it a waste of money. Only 21 % drink it for ‘convenience’ and consumption patterns vary significantly by gender, with men and younger generations purchasing the most bottled water. Qualitative analysis refutes stereotypes associating bottled water with a status symbol or lifestyle choice; participants largely mistrust water companies; just 13 % describe bottled water as a ‘trusted’ product, even when consumed for its taste or convenience, and 13 % label it a ‘bad’ plastic product detrimental to the environment or public health, thus lending support for institutional and policy trends banning bottled water.

Characterization of a Drinking Water Distribution Pipeline Terminally Colonized by Naegleria fowleri

Morgan, M.J., Environmental Science & Technology, 50(6):2890-2898, March 2016

Free-living amoebae, such as Naegleria fowleri, Acanthamoeba spp., and Vermamoeba spp., have been identified as organisms of concern due to their role as hosts for pathogenic bacteria and as agents of human disease. In particular, N. fowleri is known to cause the disease primary amoebic meningoencephalitis (PAM) and can be found in drinking water systems in many countries. Understanding the temporal dynamics in relation to environmental and biological factors is vital for developing management tools for mitigating the risks of PAM. Characterizing drinking water systems in Western Australia with a combination of physical, chemical, and biological measurements over the course of a year showed a close association of N. fowleri with free chlorine and distance from treatment. This information can be used to help design optimal management strategies for the control of N. fowleri in drinking-water-distribution systems.

Use of a Cumulative Exposure Index to Estimate the Impact of Tap Water Lead Concentration on Blood Lead Levels in 1- to 5-Year-Old Children (Montréal, Canada)

Ngueta, G., Environmental Health Perspectives, March 2016

Drinking water is recognized as a source of lead (Pb) exposure. However, questions remain about the impact of chronic exposure to lead-contaminated water on internal dose. Our goal was to estimate the relation between a cumulative water Pb exposure index (CWLEI) and blood Pb levels (BPb) in children 1–5 years of age. Between 10 September 2009 and 27 March 2010, individual characteristics and water consumption data were obtained from 298 children. Venous blood samples were collected (one per child) and a total of five 1-L samples of water per home were drawn from the kitchen tap. A second round of water collection was performed between 22 June 2011 and 6 September 2011 on a subsample of houses. Pb analyses used inductively coupled plasma mass spectrometry. Multiple linear regressions were used to estimate the association between CWLEI and BPb. Each 1-unit increase in CWLEI multiplies the expected value of BPb by 1.10 (95% CI: 1.06, 1.15) after adjustment for confounders. Mean BPb was significantly higher in children in the upper third and fourth quartiles of CWLEI (0.7–1.9 and ≥ 1.9 μg/kg of body weight) compared with the first (< 0.2 μg/kg) after adjusting for confounders (19%; 95% CI: 0, 42% and 39%; 95% CI: 15, 67%, respectively). The trend analysis yielded a p-value < 0.0001 after adjusting for confounders, suggesting a dose–response relationship between percentiles of CWLEI and BPb. In children 1–5 years of age, BPb was significantly associated with water lead concentration, with an increase starting at a cumulative lead exposure of ≥ 0.7 μg Pb/kg of body weight. In this age group, an increase of 1 μg/L in water lead would result in an increase of 35% of BPb after 150 days of exposure.
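The reported effect is multiplicative, so it compounds across CWLEI units; a minimal sketch of that log-linear relationship, using an assumed baseline blood lead value purely for illustration:

```python
# Log-linear (multiplicative) dose-response from the abstract: each 1-unit
# rise in the cumulative water lead exposure index (CWLEI) multiplies
# expected blood lead (BPb) by 1.10 (95% CI: 1.06, 1.15).
# baseline_bpb is an assumed illustrative value, not a study result.
def expected_bpb(baseline_bpb: float, cwlei: float, ratio: float = 1.10) -> float:
    """Expected BPb after a given cumulative water lead exposure index."""
    return baseline_bpb * ratio ** cwlei

print(round(expected_bpb(1.0, 1), 3))  # +10% per unit of CWLEI
print(round(expected_bpb(1.0, 3), 3))  # effects compound across units
```

The compounding is why even a modest per-unit ratio implies substantially higher blood lead at the upper CWLEI quartiles.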


Rapid bacteriophage MS2 transport in an oxic sandy aquifer in cold climate: Field experiments and modeling

Kvitsand, H.M.L., Water Resources Research, 51:9725-9745, December 2015

Virus removal during rapid transport in an unconfined, low-temperature (6°C) sand and gravel aquifer was investigated at a riverbank field site, 25 km south of Trondheim in central Norway. The data from bacteriophage MS2 inactivation and transport experiments were applied in a two-site kinetic transport model using HYDRUS-1D to evaluate the mechanisms of virus removal and whether these mechanisms were sufficient to protect the groundwater supplies. The results demonstrated that inactivation contributed negligibly to overall removal and that irreversible MS2 attachment to aquifer grains coated with iron precipitates played a dominant role in the removal of MS2; 4.1 log units of MS2 were removed by attachment during 38 m of travel distance and less than 2 days of residence time. Although the total removal was high, pathways capable of allowing virus migration at rapid velocities were present in the aquifer. The risk of rapid transport of viable viruses should be recognized, particularly for water supplies without permanent disinfection.
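The removal figures are expressed as log reduction values; as an illustrative sketch (standard LRV definition, example concentrations assumed), the reported 4.1 log units over 38 m work out to roughly 0.11 log per metre of travel:

```python
import math

# Standard log reduction value (LRV) arithmetic. The 4.1-log and 38-m
# figures come from the abstract; the concentrations below are assumed
# example values, not measurements from the study.
def log_reduction(influent: float, effluent: float) -> float:
    """LRV = log10(influent / effluent concentration)."""
    return math.log10(influent / effluent)

print(round(4.1 / 38, 3))                            # log units removed per metre
print(round(log_reduction(1e6, 1e6 * 10**-4.1), 1))  # recovers 4.1 by construction
```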


Extent and Impacts of Unplanned Wastewater Reuse in US Rivers

Rice, J., American Water Works Association Journal, 107.11:571-581, November 2015

A recently developed watershed-scale hydraulic model (De-facto Reuse Incidence in our Nation’s Consumptive Supply [DRINCS]) was applied to estimate the municipal wastewater treatment plant (WWTP) contribution to downstream water treatment plant (WTP) influent flow. Based on DRINCS and geocoded data for 14,651 WWTPs and 1,320 WTPs, the occurrence of treated municipal wastewater in drinking water supplies is geographically widespread, and its magnitude depends largely on the flow condition and size of the source river. Under average streamflow conditions in this study, the median contribution of wastewater flow to drinking water supplies was approximately 1% and increased to as much as 100% under low-flow conditions (modeled by Q95). Wastewater contributions to nutrient and emerging contaminant loading were estimated and geospatially compared with the findings of the US Environmental Protection Agency’s Unregulated Contaminant Monitoring Rule and Long Term 2 Enhanced Surface Water Treatment Rule. In turn, this analysis offers important insights into the treatment challenges facing treatment facilities across the United States.
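The flow dependence described above follows from a simple mass-balance ratio; the toy function below (assumed numbers, not DRINCS output) shows why the wastewater fraction of a drinking water intake climbs as river flow drops:

```python
# De facto reuse as a simple mass-balance fraction: upstream WWTP
# discharge divided by river flow at the downstream intake, capped at
# 100%. The discharge/flow values are illustrative assumptions only.
def wastewater_fraction(wwtp_discharge_m3s: float, river_flow_m3s: float) -> float:
    """Fraction of intake flow that is treated wastewater."""
    return min(1.0, wwtp_discharge_m3s / river_flow_m3s)

print(round(wastewater_fraction(1.0, 100.0), 3))  # small fraction at average flow
print(round(wastewater_fraction(1.0, 1.2), 3))    # dominant fraction at low flow
```

The same fixed discharge that is a ~1% contribution at average streamflow can approach 100% of the intake when the river falls to its low-flow (Q95) condition.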

Fountain Autopsy to Determine Lead Occurrence in Drinking Water

McIlwain, B., et al., Journal of Environmental Engineering, November 2015

Exposure to lead in drinking water poses a risk for various adverse health effects, and significant efforts have been made to monitor and eliminate lead exposure in drinking water. This study focused on the localization of lead exposure from 71 drinking water fountains in nonresidential buildings in order to determine the source of elevated lead and understand the effects of fountains associated with lead concentration in drinking water. Drinking water fountains containing lead-lined cooling tanks and brass fittings were found to release lead concentrations in excess of 10 μg/L, and fountains with low or infrequent usage and those with cooling tanks produced the highest concentrations (in excess of 20 μg/L) of lead. One particular fountain model found at several locations throughout the institution was associated with some of the highest lead concentrations measured throughout the study. This fountain was recalled in the United States, but not in Canada. This article adds to existing research demonstrating that drinking water fountains are a potentially significant and underappreciated source of lead exposure in nonresidential buildings.

Investigation of Cost and Energy Optimization of Drinking Water Distribution Systems

Cherci, C., Environmental Science & Technology, 49.22:13724-13732, October 2015

Holistic management of water and energy resources through energy and water quality management systems (EWQMSs) has traditionally aimed at energy cost reduction with limited or no emphasis on energy efficiency or greenhouse gas minimization. This study expanded the existing EWQMS framework and determined the impact of different management strategies for energy cost and energy consumption (e.g., carbon footprint) reduction on system performance at two drinking water utilities in California (United States). The results showed that optimizing for cost led to cost reductions of 4% (Utility B, summer) to 48% (Utility A, winter). The energy optimization strategy successfully found the lowest-energy operation and achieved energy usage reductions of 3% (Utility B, summer) to 10% (Utility A, winter). The findings of this study revealed a potential trade-off between cost optimization (dollars) and energy use (kilowatt-hours), particularly in the summer: optimizing the system for minimum energy use incurred cost increases of 64% and 184% compared with the cost-optimization scenario. Water age simulations through hydraulic modeling did not reveal any adverse effects on water quality in the distribution system or in tanks from pump schedule optimization targeting either cost or energy minimization.

PLOS ONE: A Systematic Review of Waterborne Disease Outbreaks Associated with Small Non-Community Drinking Water Systems in Canada and the United States

Pons, W., et al., PLoS ONE, October 2015

Reports of outbreaks in Canada and the United States (U.S.) indicate that approximately 50% of all waterborne disease outbreaks occur in small non-community drinking water systems (SDWSs). Summarizing these investigations to identify the factors and conditions contributing to outbreaks is needed in order to help prevent future outbreaks. The objectives of this study were to: 1) identify published reports of waterborne disease outbreaks involving SDWSs in Canada and the U.S. since 1970; 2) summarize reported factors contributing to outbreaks, including water system characteristics and events surrounding the outbreaks; and 3) identify terminology used to describe SDWSs in outbreak reports. Three electronic databases and grey literature sources were searched for outbreak reports involving SDWSs throughout Canada and the U.S. from 1970 to 2014. Two reviewers independently screened and extracted data related to water system characteristics and outbreak events. The data were analyzed descriptively with ‘outbreak’ as the unit of analysis. From a total of 1,995 citations, we identified 50 relevant articles reporting 293 unique outbreaks. Failure of an existing water treatment system (22.7%) and lack of water treatment (20.2%) were the leading causes of waterborne outbreaks in SDWSs. A seasonal trend was observed, with 51% of outbreaks occurring in summer months (p<0.001). There was large variation in terminology used to describe SDWSs, and a large number of variables were not reported, including water source and whether water treatment was used (missing in 31% and 66% of reports, respectively). More consistent reporting and descriptions of SDWSs in future outbreak reports are needed to understand the epidemiology of these outbreaks and to inform the development of targeted interventions for SDWSs. Additional monitoring of water systems that are used on a seasonal or infrequent basis would be worthwhile to inform future protection efforts.

An evaluation of the readability of drinking water quality reports: a national assessment

Siddhartha, R., Journal of Water and Health, 13.3:645-653, September 2015

The United States Environmental Protection Agency mandates that community water systems (or water utilities) provide annual consumer confidence reports (CCRs), or water quality reports, to their consumers. These reports encapsulate information regarding sources of water, detected contaminants, regulatory compliance, and educational material. These reports have excellent potential for providing the public with accurate information on the safety of tap water, but there is a lack of research on the degree to which the information can be understood by a large proportion of the population. This study evaluated the readability of a nationally representative sample of 30 CCRs, released between 2011 and 2013. Readability (or ‘comprehension difficulty’) was evaluated using Flesch-Kincaid readability tests. The analysis revealed that CCRs were written at the 11th-14th grade level, which is well above the recommended 6th-7th grade level for public health communications. The reading ease of the CCRs was found to be equivalent to that of the Harvard Law Review journal. These findings reveal a wide gap between current water quality reports and their effectiveness in being understandable to US residents. Suggestions for reorienting language and scientific information in CCRs to be easily comprehensible to the public are offered.
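For readers unfamiliar with the metric, the Flesch-Kincaid grade level used in the study is a fixed formula over word, sentence, and syllable counts; the sketch below applies the standard formula to assumed counts (a real scorer would also need a syllable counter):

```python
# Standard Flesch-Kincaid grade-level formula. The counts passed in are
# assumed for illustration, not taken from any actual CCR in the study.
def fk_grade(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid grade level from raw text counts."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Long sentences plus polysyllabic words push the grade level well above
# the 6th-7th grade target recommended for public health communications.
print(round(fk_grade(words=200, sentences=8, syllables=360), 1))  # 15.4
```

With 25 words per sentence and 1.8 syllables per word, the formula yields a grade level around 15, consistent with the 11th-14th+ grade range the study reports for dense CCR prose.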

National Cost Implications of Potential Long-Term LCR Requirement

Slabaugh, R.M., American Water Works Association Journal, 107.8:389-400, August 2015

Concerns that the current Lead and Copper Rule (LCR) may not adequately protect public health have prompted the US Environmental Protection Agency (USEPA) to consider restructuring existing monitoring requirements by targeting a redefined pool of high-risk sites or altering the sampling protocol. Analysis of historical lead and copper monitoring data from 18 public water systems (PWSs) verified that a significant percentage of PWSs with lead service lines are likely to be affected by potential Long-Term Lead and Copper Rule (LT-LCR) revisions. Data were used to facilitate a national cost-of-compliance estimate for additional implementation of corrosion control treatment (CCT) necessary to comply with the LT-LCR and potential unintended consequences associated with those treatment changes. Cost estimates presented here can be used by USEPA to shape the upcoming rule and also by PWSs to assess potential costs associated with optimizing CCT for LT-LCR compliance.

Water Distribution System Deficiencies and Gastrointestinal Illness: A Systematic Review and Meta-Analysis

Ercumen, A., Environmental Health Perspectives, 122.7:651-660, July 2014

Water distribution systems are vulnerable to performance deficiencies that can cause (re)contamination of treated water and plausibly lead to increased risk of gastrointestinal illness (GII) in consumers. It is well established that large system disruptions in piped water networks can cause GII outbreaks. We hypothesized that routine network problems can also contribute to background levels of waterborne illness and conducted a systematic review and meta-analysis to assess the impact of distribution system deficiencies on endemic GII. We reviewed published studies that compared direct tap water consumption to consumption of tap water re-treated at the point of use (POU) and studies of specific system deficiencies such as breach of physical or hydraulic pipe integrity and lack of disinfectant residual. In settings with network malfunction, consumers of tap water versus POU-treated water had increased GII [incidence density ratio (IDR) = 1.34; 95% CI: 1.00, 1.79]. The subset of nonblinded studies showed a significant association between GII and tap water versus POU-treated water consumption (IDR = 1.52; 95% CI: 1.05, 2.20), but there was no association based on studies that blinded participants to their POU water treatment status (IDR = 0.98; 95% CI: 0.90, 1.08). Among studies focusing on specific network deficiencies, GII was associated with temporary water outages (relative risk = 3.26; 95% CI: 1.48, 7.19) as well as chronic outages in intermittently operated distribution systems (odds ratio = 1.61; 95% CI: 1.26, 2.07). It was concluded that tap water consumption is associated with GII in malfunctioning distribution networks. System deficiencies such as water outages are also associated with increased GII, suggesting a potential health risk for consumers served by piped water networks.

Leaching of bisphenol A and F from new and old epoxy coatings: laboratory and field studies

Bruchet, A., et al., Water Science and Technology: Water Supply, 14.3:383-389, June 2014

Laboratory tests were carried out with three types of new epoxy resins to assess the release of bisphenol A and F (BPA and BPF) and potential halogenated phenolic by-products. Tests were carried out over a duration of 6 months in the presence and absence of disinfectants (chlorine and chlorine dioxide) at realistic doses and contact times. None of the three systems exhibited Fickian-type diffusion for BPA. Leaching was quite low for two epoxies, while the third showed a trend of increasing leaching during the first 5 months of immersion. BPA was only observed in the absence of disinfectant, while no BPF was observed under any condition. 2,4,6-trichlorophenol (TCP), a BPA chlorination by-product, was sporadically observed in the chlorinated water during the first months of contact. Following discontinuation of the disinfectants, TCP release was significantly enhanced from coatings that had previously been exposed to chlorinated water. Laboratory leaching tests also indicated rapid oxidation of epoxies by chlorine and chlorine dioxide. Analysis of 27 epoxy-coated drinking water storage tanks did not reveal any BPA, BPF or TCP. On the other hand, a large-scale examination of about 200 pipe sections rehabilitated with epoxies during the 1990s led to a high frequency of BPA and BPF detection, sometimes with maximum values around 1 μg/L.
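Whether a coating exhibits Fickian-type diffusion is typically judged by the time exponent of cumulative release: for Fickian behavior, cumulative mass released scales as t^0.5. A minimal sketch of that check, using hypothetical release data rather than the study's measurements:

```python
import math

def release_exponent(times, masses):
    """Fit n in M_t = k * t**n by ordinary least squares on the
    log-log scale; n near 0.5 indicates Fickian-type release."""
    xs = [math.log(t) for t in times]
    ys = [math.log(m) for m in masses]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

# Hypothetical cumulative-release series constructed to follow t**0.5
# (days vs. ug released); real leaching data would be noisy.
t = [1, 4, 9, 16, 25]
m = [2.0 * ti ** 0.5 for ti in t]
n_fit = release_exponent(t, m)
```

An exponent well away from 0.5 (as the study reports for all three epoxy systems) points to a release mechanism other than simple Fickian diffusion, e.g., surface-reaction- or relaxation-controlled leaching.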

Drinking water biofilm cohesiveness changes under chlorination or hydrodynamic stress

Mathieu, L., et al., Water Research, May 2014

Attempts at removal of drinking water biofilms rely on various preventive and curative strategies such as nutrient reduction in drinking water, disinfection or water flushing, which have demonstrated limited efficiency. The main reason for these failures is the cohesiveness of the biofilm driven by the physico-chemical properties of its exopolymeric matrix (EPS). Effective cleaning procedures should break up the matrix and/or change the elastic properties of bacterial biofilms. The aim of this study was to evaluate the change in the cohesive strength of two-month-old drinking water biofilms under increasing hydrodynamic shear stress τw (from ∼0.2 to ∼10 Pa) and shock chlorination (applied concentration at T0: 10 mg Cl2/L; 60 min contact time). Biofilm erosion (cell loss per unit surface area) and cohesiveness (changes in the detachment shear stress and cluster volumes measured by atomic force microscopy (AFM)) were studied. When the hydrodynamic constraint was rapidly increased, biofilm removal was found to depend on a dual process of erosion and coalescence of the biofilm clusters. Indeed, 56% of the biofilm cells were removed, with a concomitant decrease in the number of the 50–300 μm3 clusters and an increase in the number of the smaller (i.e., <50 μm3) ones. Moreover, AFM evidenced the strengthening of the biofilm structure along with the doubling of the number of contact points, NC, per cluster volume unit following the hydrodynamic disturbance. This suggests that the compactness of the biofilm exopolymers increases with hydrodynamic stress. Shock chlorination removed cells (−75%) from the biofilm while reducing the volume of biofilm clusters. Oxidation stress resulted in a decrease in the cohesive strength profile of the remaining drinking water biofilms, linked to a reduction in the number of contact points within the biofilm network structure, in particular for the largest biofilm cluster volumes (>200 μm3).
Changes in the cohesive strength of drinking water biofilms subsequent to cleaning/disinfection operations call into question the effectiveness of cleaning-in-place procedures. The combined alternating use of oxidation and shear stress sequences needs to be investigated as it could be an important adjunct to improving biofilm removal/reduction procedures.


Opportunistic pathogens in roof-captured rainwater samples, determined using quantitative PCR

Ahmed, W., Water Research, April 2014

In this study, quantitative PCR (qPCR) was used for the detection of four opportunistic bacterial pathogens in water samples collected from 72 rainwater tanks in Southeast Queensland, Australia. Tank water samples were also tested for fecal indicator bacteria (Escherichia coli and Enterococcus spp.) using culture-based methods. Among the 72 tank water samples tested, 74% and 94% of samples contained E. coli and Enterococcus spp., respectively, and the numbers of E. coli and Enterococcus spp. in tank water samples ranged from 0.3 to 3.7 log₁₀ colony forming units (CFU) per 100 mL of water. In all, 29%, 15%, 13%, and 6% of tank water samples contained Aeromonas hydrophila, Staphylococcus aureus, Pseudomonas aeruginosa and Legionella pneumophila, respectively. The genomic units (GU) of opportunistic pathogens in tank water samples ranged from 1.5 to 4.6 log₁₀ GU per 100 mL of water. A significant correlation was found between E. coli and Enterococcus spp. numbers in the pooled tank water sample data (Spearman's rs = 0.50; P < 0.001). In contrast, fecal indicator bacteria numbers did not correlate with the presence/absence of the opportunistic pathogens tested in this study. Based on the results of this study, it would be prudent to undertake a Quantitative Microbial Risk Assessment (QMRA) of opportunistic pathogens to determine associated health risks for potable and nonpotable uses of tank water.
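The Spearman rank correlation reported for E. coli and Enterococcus spp. is simply the Pearson correlation of the two variables' ranks, with tied observations assigned average ranks. The sketch below implements that calculation on hypothetical paired log₁₀ CFU values, not the study's data:

```python
def ranks(xs):
    """Assign ranks 1..n, giving tied values their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # extend j over the run of tied values
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied run
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rs = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical paired log10 CFU/100 mL values for two indicators;
# a perfectly monotone relationship gives rs = 1.0.
rs = spearman([0.5, 1.2, 2.0, 2.8, 3.5], [0.3, 1.0, 1.8, 2.5, 3.7])  # → 1.0
```

A moderate rs around 0.5, as in the study, would arise when the paired counts agree in rank order only part of the time.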


Clues to the Future of the Park Doctrine

Burroughs, A.D., and Rin, D., Food and Drug Law Institute, November/December 2012

This article examines three recent cases brought under the controversial Park doctrine in search of clues to the doctrine's future. The responsible corporate officer (RCO) doctrine, also known as the Park doctrine, allows for criminal prosecution of individuals, typically high-ranking corporate executives of pharmaceutical companies, for violations of the Food, Drug and Cosmetic Act (FDCA), even absent any proof of the individual defendant's knowledge of or participation in the violation. It is relevant to drinking water because the FDCA applies to bottled water, but not to tap water.

Lead (Pb) in Tap Water and in Blood: Implications for Lead Exposure in the United States

Triantafyllidou, S. and Edwards, M., Critical Reviews in Environmental Science and Technology, June 2012

Lead is widely recognized as one of the most pervasive environmental health threats in the United States, and there is increased concern over adverse health impacts at levels of exposure once considered safe. Lead contamination of tap water was once a major cause of lead exposure in the United States and, as other sources have been addressed, the relative contribution of lead in water to lead in blood is expected to become increasingly important. Moreover, prior research suggests that lead in water may be more important as a source than is presently believed. The authors describe sources of lead in tap water, chemical forms of the lead, and relevant U.S. regulations/guidelines, while considering their implications for human exposure. Research that examined associations between water lead levels and blood lead levels is critically reviewed, and some of the challenges in making such associations, even if lead in water is the dominant source of lead in blood, are highlighted. Better protecting populations at risk from this and from other lead sources is necessary, if the United States is to achieve its goal of eliminating elevated blood lead levels in children by 2020.

Water Quality Control in Premise Plumbing

Reynolds, K.A., Water Conditioning and Purification, February 2007

The quality of water at the end use is impacted by numerous and varied factors including source water type and quality, age of the distribution system, climatic events and even consumer use patterns. Therefore, providing high-quality drinking water at the tap requires a multi-barrier approach aimed at source water protection, source water treatment and reliable distribution. Each of these steps is monitored and controlled by municipal water treatment standards and guidelines; however, what happens to the water quality beyond the service connection at individual sites is not as well understood. New reports of water quality deterioration in the plumbing of residential or commercial buildings, known as premise plumbing, pose a question: Just what is present in our pipes?