Flanagan, Sara V.; Zheng, Yan
ENVIRONMENTAL SCIENCE & POLICY, 85:40-46; SI; 10.1016/j.envsci.2018.03.022 JUL 2018
Abstract: At present, one of the greatest barriers to reducing exposure to naturally occurring arsenic in unregulated private well water is a lack of well testing. Since 2002, the New Jersey Private Well Testing Act (PWTA) has required testing during real estate transactions. Because relying on individual well owners to take protective action has inherent limitations, such state-wide testing regulations have been shown to contribute significantly to exposure reduction. This study examines the New Jersey PWTA as a case of testing requirements successfully adopted into law, with failed attempts to pass equivalent requirements in Maine for comparison. Although New Jersey’s long history of drinking water quality problems, rooted in population density, an industrial past, and vulnerable aquifers, gave rise to the PWTA and earlier local testing ordinances, several high-profile events immediately beforehand focused public and legislator attention and mobilized environmental advocacy groups to gain political support statewide. Viewed through Kingdon’s Multiple Streams framework, the PWTA resulted from problem, policy, and politics streams aligning during a significant and unique political window of opportunity. In Maine, where naturally occurring arsenic, not industrial contamination, is the primary concern, private sector opposition and a conservative administration resistant to government involvement in “private” well water both played a role in blocking legislative attempts to require testing. A modest education and outreach bill without testing mandates passed in 2017 after compromise among stakeholders. For policy to be an effective tool for achieving universal well water screening, a philosophical evolution on the role of government in private water may be necessary.
Waak, Michael B.; LaPara, Timothy M.; Halle, Cynthia; Hozalski, Raymond M.
ENVIRONMENTAL SCIENCE & TECHNOLOGY, 52 (14):7630-7639; 10.1021/acs.est.8b01170 JUL 17 2018
Abstract: The maintenance of a chlorine or chloramine residual to suppress waterborne pathogens in drinking water distribution systems is common practice in the United States but less common in Europe. In this study, we investigated the occurrence of Bacteria and Legionella spp. in water-main biofilms and tap water from a chloraminated distribution system in the United States and a system in Norway with no residual using real-time quantitative polymerase chain reaction (qPCR). Despite generally higher temperatures and assimilable organic carbon levels in the chloraminated system, total Bacteria and Legionella spp. were significantly lower in water-main biofilms and tap water of that system (p < 0.05). Legionella spp. were not detected in the biofilms of the chloraminated system (0 of 35 samples) but were frequently detected in biofilms from the no-residual system (10 of 23 samples; maximum concentration = 7.8 × 10⁴ gene copies cm⁻²). This investigation suggests water-main biofilms may serve as a source of Legionella for tap water and premise plumbing systems, and residual chloramine may aid in reducing their abundance.
Richards, Crystal L.; Broadaway, Susan C.; Eggers, Margaret J.; Doyle, John; Pyle, Barry H.; Camper, Anne K.; Ford, Timothy E.
MICROBIAL ECOLOGY, 76 (1):52-63; SI; 10.1007/s00248-015-0595-6 JUL 2018
Abstract: Private residences in rural areas with water systems that are not adequately regulated, monitored, and updated could have drinking water that poses a health risk. To investigate water quality on the Crow Reservation in Montana, water and biofilm samples were collected from 57 public buildings and private residences served by either treated municipal or individual groundwater well systems. Bacteriological quality was assessed, including detection of fecal coliform bacteria and heterotrophic plate count (HPC) as well as three potentially pathogenic bacterial genera: Mycobacterium, Legionella, and Helicobacter. All three target genera were detected in drinking water systems on the Crow Reservation. Species detected included the opportunistic and frank pathogens Mycobacterium avium, Mycobacterium gordonae, Mycobacterium flavescens, Legionella pneumophila, and Helicobacter pylori. Additionally, there was an association between HPC bacteria and the presence of Mycobacterium and Legionella but not the presence of Helicobacter. This research has shown that groundwater and municipal drinking water systems on the Crow Reservation can harbor potential bacterial pathogens.
Zheng, Yan; Flanagan, S. V.
Environmental Health Perspectives, 125 (8):085002; 10.1289/EHP629 2017
BACKGROUND:The 1974 Safe Drinking Water Act (SDWA) regulates >170,000 public water systems to protect health, but not >13 million private wells. State and local government requirements for private well water testing are rare and inconsistent; the responsibility to ensure water safety remains with individual households. Over the last two decades, geogenic arsenic has emerged as a significant public health concern due to high prevalence in many rural American communities.
OBJECTIVES:We build the case for universal screening of private well water quality around arsenic, the most toxic and widespread of common private water contaminants. We argue that achieving universal screening will require policy intervention, and that testing should be made easy, accessible, and in many cases free to all private well households in the United States, considering the invisible, tasteless, odorless, and thus silent nature of arsenic.
DISCUSSION:Our research has identified behavioral, situational, and financial barriers to households managing their own well water safety; as a result, screening remains far from universal despite traditional public health outreach efforts. We observe significant socioeconomic disparities in arsenic testing and treatment when private water is unregulated. Testing requirements can be a partial answer to these challenges.
CONCLUSIONS:Universal screening, achieved through local testing requirements complemented by greater community engagement targeting biologically and socioeconomically vulnerable groups, would reduce population arsenic exposure more than any promotional effort to date. Universal screening of private well water will identify the dangers hidden in America’s drinking water supply and redirect attention to ensuring safe water among affected households.
Ling, Fangqiong; Whitaker, Rachel; LeChevallier, Mark W.; Liu, Wen-Tso
ISME JOURNAL, 12 (6):1520-1531; 10.1038/s41396-018-0101-5 JUN 2018
What happens to tap water when you are away from home? Day-to-day water stagnation in building plumbing can potentially result in water quality deterioration (e.g., lead release or pathogen proliferation), which is a major public health concern. However, little is known about the microbial ecosystem processes in plumbing systems, hindering the development of biological monitoring strategies. Here, we track tap water microbiome assembly in situ, showing that bacterial community composition changes rapidly from the city supply following ~6-day stagnation, along with an increase in cell count from 10³ cells/mL to upwards of 7.8 × 10⁵ cells/mL. Remarkably, bacterial community assembly was highly reproducible in this built environment system (median Spearman correlation between temporal replicates = 0.78). Using an island biogeography model, we show that neutral processes arising from the microbial communities in the city water supply (i.e., migration and demographic stochasticity) explained the island community composition in proximal pipes (Goodness-of-fit = 0.48), yet declined as water approached the faucet (Goodness-of-fit = 0.21). We developed a size-effect model to simulate this process, which indicated that pipe diameter drove these changes by mediating the kinetics of hypochlorite decay and cell detachment, affecting selection, migration, and demographic stochasticity. Our study challenges current water quality monitoring practice worldwide which ignores biological growth in plumbing, and suggests the island biogeography model as a useful framework to evaluate building water system quality.
Chyzheuskaya, A.; Cormican, M.; Srivinas, Raghavendra; O’Donovan, D.; Prendergast, M.; O’Donoghue, C.; Morris, D.
Emerging Infectious Diseases, 23 (10):1650-1656; 10.3201/eid2310.152037 2017
In 2007, a waterborne outbreak of Cryptosporidium hominis infection occurred in western Ireland, resulting in 242 laboratory-confirmed cases and an uncertain number of unconfirmed cases. A boil water notice was in place for 158 days and affected 120,432 persons residing in the area, as well as businesses, visitors, and commuters. This was the largest outbreak of cryptosporidiosis in Ireland. The purpose of this study was to evaluate the cost of the outbreak. We adopted a societal perspective, totaling the direct and indirect costs incurred by public and private agencies, with all costs estimated based on 2007 figures. We estimate that the cost of the outbreak was >€19 million (≈€120,000/day of the outbreak). The US dollar equivalent based on today’s exchange rates would be $22.44 million (≈$142,000/day of the outbreak). This study highlights the economic case for a safe drinking water supply.
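The per-day figures quoted in the abstract follow directly from dividing the total cost by the length of the boil water notice; a minimal sketch using the totals reported above (variable names are illustrative, not the authors'):

```python
# Per-day outbreak cost from the totals reported in the abstract.
TOTAL_COST_EUR = 19_000_000   # estimated total societal cost (>EUR 19 million)
NOTICE_DAYS = 158             # duration of the boil water notice

cost_per_day = TOTAL_COST_EUR / NOTICE_DAYS
print(f"~EUR {cost_per_day:,.0f} per outbreak day")  # ~EUR 120,253 per day
```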
Rhoads, William J.; Garner, Emily; Ji, Pan; Zhu, Ni; Parks, Jeffrey; Schwake, David Otto; Pruden, Amy; Edwards, Marc A.
ENVIRONMENTAL SCIENCE & TECHNOLOGY, 51 (20):11986-11995; 10.1021/acs.est.7b01589 OCT 17 2017
Abstract: We hypothesize that the increase in reported Legionnaires’ disease from June 2014 to November 2015 in Genesee County, MI (where Flint is located) was directly linked to the switch to corrosive Flint River water from noncorrosive Detroit water from April 2014 to October 2015. To address the lack of epidemiological data linking the drinking water supplies to disease incidence, we gathered physicochemical and biological water quality data from 2010 to 2016 to evaluate characteristics of the Flint River water that were potentially conducive to Legionella growth. The treated Flint River water was 8.6 times more corrosive than Detroit water in short-term testing, releasing more iron, which is a key Legionella nutrient, while also directly causing disinfectant to decay more rapidly. The Flint River water source was also 0.8–6.7 °C warmer in summer months than Detroit water and exceeded the minimum Legionella growth temperature of 20 °C more frequently (average number of days per year: 63 for Detroit versus 157 for the Flint River). The corrosive water also led to 1.3–2.2 times more water main breaks in 2014–2015 compared to 2010–2013; such disruptions have been associated with outbreaks in other locales. Importantly, Legionella spp. and Legionella pneumophila decreased after switching back to Detroit water, in terms of both gene markers and culturability, when August and October 2015 were compared to November 2016.
Liu, Gang; Zhang, Ya; van der Mark, E.; Magic-Knezev, A.; Pinto, A.; van den Bogert, B.; Liu, Wen-Tso; van der Meer, W.; Medema, G.
Water Research, 138:86-96; 10.1016/j.watres.2018.03.043 2018
The general consensus is that the abundance of tap water bacteria is greatly influenced by water purification and distribution. Bacteria released from biofilm in the distribution system are especially considered a major potential risk for drinking water bio-safety. For the first time, this full-scale study has captured and identified the proportional contributions of the source water, treated water, and distribution system in shaping the tap water bacterial community, based on their microbial community fingerprints using the Bayesian “SourceTracker” method. The bacterial community profiles and diversity analyses illustrated that the water purification process shaped the community of planktonic and suspended particle-associated bacteria in treated water. The bacterial communities associated with suspended particles, loose deposits, and biofilm were similar to each other, while the community of tap water planktonic bacteria varied across locations in the distribution system. The microbial source tracking results showed no detectable contribution of source water to the bacterial community in the tap water and distribution system. The planktonic bacteria in the treated water were the major contributor to planktonic bacteria in the tap water (17.7–54.1%). The particle-associated bacterial community in the treated water seeded the bacterial communities associated with loose deposits (24.9–32.7%) and biofilm (37.8–43.8%) in the distribution system. In turn, the loose deposits and biofilm showed a significant influence on tap water planktonic and particle-associated bacteria, which was location dependent and influenced by hydraulic changes. This was revealed by the increased contribution of loose deposits to tap water planktonic bacteria (from 2.5% to 38.0%) and an increased contribution of biofilm to tap water particle-associated bacteria (from 5.9% to 19.7%), caused by possible hydraulic disturbance from proximal to distal regions.
Therefore, our findings indicate that tap water bacteria could be managed by properly selecting and operating the purification process and by effectively cleaning the distribution system.
Taravaud, Alexandre; Ali, Myriam; Lafosse, Bernard; Nicolas, Valerie; Feliers, Cedric; Thibert, Sylvie; Levi, Yves; Loiseau, Philippe M.; Pomel, Sebastien
SCIENCE OF THE TOTAL ENVIRONMENT, 633:157-166; 10.1016/j.scitotenv.2018.03.178 AUG 15 2018
Free-living amoebae (FLA) are ubiquitous organisms present in various natural and artificial environments, such as drinking water storage towers (DWST). Some FLA, such as Acanthamoeba sp., Naegleria fowleri, and Balamuthia mandrillaris, can cause severe infections at the ocular or cerebral level in addition to being potential reservoirs of other pathogens. In this work, the abundance and diversity of FLA were evaluated in two sampling campaigns: one performed over five seasons in three DWST at three different levels (surface, middle, and bottom) in water and biofilm using microscopy and PCR, and one based on kinetic analysis, by phase contrast and confocal microscopy, of biofilm samples collected every two weeks during a 3-month period at the surface and at the bottom of a DWST. In the seasonal study, FLA were detected in the water of each DWST at densities of ~20 to 25 amoebae L⁻¹. A seasonal variation of amoeba distribution was observed in water samples, with maximal densities in summer at ~30 amoebae L⁻¹ and minimal densities in winter at ~16 amoebae L⁻¹. FLA belonging to the genus Acanthamoeba were detected in two spring sampling campaigns, suggesting a possible seasonal appearance of this potentially pathogenic amoeba. Interestingly, a 1 log increase in amoeba density was observed in biofilm samples collected at the surface of all DWST compared to the middle and the bottom, where FLA were at 0.1–0.2 amoebae/cm². In the kinetic study, an increase in amoeba density, total cell density, and biofilm thickness was observed as a function of time at the surface of the DWST, but not at the bottom. To our knowledge, this study is the first to describe a markedly higher FLA density in biofilms collected at upper water levels in DWST, constituting a potential source of pathogenic micro-organisms.
Santé Publique France, 14 rue du Val-d’Osne, 94415 Saint-Maurice CEDEX, France
Received: 28 February 2018 / Revised: 20 April 2018 / Accepted: 24 April 2018 / Published: 26 April 2018
Time series studies (TSS) can be viewed as an inexpensive way to tackle the non-epidemic health risk from fecal pathogens in tap water in urban areas. Following the PRISMA recommendations, I reviewed TSS addressing the endemic risk of acute gastroenteritis according to drinking water operation conditions in urban areas of developed countries. Eighteen studies were included, covering 17 urban sites (seven in North America and 10 in Europe) with study populations ranging from 50,000 to 9 million people. Most studies used general practitioner consultations or visits to hospitals for acute gastroenteritis (AGE) as health outcomes. In 11 of the 17 sites, a significant and plausible association was found between turbidity (or particle count) in finished water and the AGE indicator. Where provided and significant, the interquartile excess of relative risk estimates ranged from 3–13%. When examined, water temperature, river flow, and produced flow were strongly associated with the AGE indicator. The potential of TSS for the study of the health risk from fecal pathogens in tap water is limited by the lack of specificity of turbidity and its site-sensitive value as an exposure proxy. Nevertheless, at the drinking water system level, TSS could help water operators identify the operational conditions most at risk, particularly if other water operation indicators, in addition to turbidity, are considered as possible relevant proxies for exposure.
Liu, Z., et al., Environmental Science and Pollution Research, 24(2):2126-2134, January 2017
With the widespread application of plastic pipes in drinking water distribution systems, the effects of various leachable organic chemicals have been investigated and their occurrence in drinking water supplies monitored. Most studies focus on the odor problems these substances may cause. This study investigates the potential endocrine-disrupting effects of the migrating compound 2,4-di-tert-butylphenol (2,4-d-t-BP). The results show that migration of 2,4-d-t-BP from plastic pipes could result in chronic exposure, and that migration levels varied greatly among plastic pipe materials and manufacturing brands. Based on estrogen equivalents (EEQ), the levels of 2,4-d-t-BP migrating from most plastic pipes were relatively low. However, the EEQ levels in drinking water from four of the 15 pipes may pose a risk of significant adverse effects. Given increasingly strict requirements on the regulation of drinking water quality, these results indicate that some drinking water transported through plastic pipes may not be safe for human consumption due to the occurrence of 2,4-d-t-BP. Moreover, 2,4-d-t-BP is not the only estrogenic compound that migrates from plastic pipes; compounds such as 2-tert-butylphenol (2-t-BP) and 4-tert-butylphenol (4-t-BP) may also be leachable.
Ragusa, A.T., and Crampton, A., Human Ecology, 44(5):565-576, October 2016
In the midst of popular and scientific debates about its desirability, safety, and environmental sustainability, bottled water is forecast to become the most consumed packaged beverage globally (Feliciano 2014) and the fastest-growing beverage sector in Australia (Johnson 2007). Manufacturers attribute increasing sales to convenience and health benefits rather than intensive advertising/marketing campaigns. Our sociological investigation of drinking water perceptions generally, and bottled water specifically, using data from 192 face-to-face interviews with Australians and New Zealanders, revealed that 77% thought about the quality of their drinking water, 64% noted specific adverse issues, and 82% reported concerns with their tap water. However, although 64% drink bottled water, just 28% believe it is better than tap water and 63% consider it a waste of money. Only 21% drink it for ‘convenience’, and consumption patterns vary significantly by gender and age, with men and younger generations purchasing the most bottled water. Qualitative analysis refutes stereotypes associating bottled water with a status symbol or lifestyle choice: participants largely mistrust water companies; just 13% describe bottled water as a ‘trusted’ product, even when consumed for its taste or convenience; and 13% label it a ‘bad’ plastic product detrimental to the environment or public health, thus lending support for institutional and policy trends banning bottled water.
Morgan, M.J., et al., Environmental Science & Technology, 50(6):2890-2898, March 2016
Free-living amoebae, such as Naegleria fowleri, Acanthamoeba spp., and Vermamoeba spp., have been identified as organisms of concern due to their role as hosts for pathogenic bacteria and as agents of human disease. In particular, N. fowleri is known to cause the disease primary amoebic meningoencephalitis (PAM) and can be found in drinking water systems in many countries. Understanding its temporal dynamics in relation to environmental and biological factors is vital for developing management tools for mitigating the risks of PAM. Characterizing drinking water systems in Western Australia with a combination of physical, chemical, and biological measurements over the course of a year showed a close association of N. fowleri with free chlorine and distance from treatment. This information can be used to help design optimal management strategies for the control of N. fowleri in drinking-water-distribution systems.
Ngueta, G., Environmental Health Perspectives, March 2016
Drinking water is recognized as a source of lead (Pb) exposure. However, questions remain about the impact of chronic exposure to lead-contaminated water on internal dose. Our goal was to estimate the relation between a cumulative water Pb exposure index (CWLEI) and blood Pb levels (BPb) in children 1–5 years of age. Between 10 September 2009 and 27 March 2010, individual characteristics and water consumption data were obtained from 298 children. Venous blood samples were collected (one per child), and a total of five 1-L samples of water per home were drawn from the kitchen tap. A second round of water collection was performed between 22 June 2011 and 6 September 2011 on a subsample of houses. Pb analyses used inductively coupled plasma mass spectrometry. Multiple linear regressions were used to estimate the association between CWLEI and BPb. Each 1-unit increase in CWLEI multiplies the expected value of BPb by 1.10 (95% CI: 1.06, 1.15) after adjustment for confounders. Mean BPb was significantly higher in children in the upper third and fourth quartiles of CWLEI (0.7–1.9 and ≥ 1.9 μg/kg of body weight) compared with the first (< 0.2 μg/kg) after adjusting for confounders (19%; 95% CI: 0, 42% and 39%; 95% CI: 15, 67%, respectively). The trend analysis yielded a p-value < 0.0001 after adjusting for confounders, suggesting a dose–response relationship between percentiles of CWLEI and BPb. In children 1–5 years of age, BPb was significantly associated with water lead concentration, with an increase starting at a cumulative lead exposure of ≥ 0.7 μg Pb/kg of body weight. In this age group, an increase of 1 μg/L in water lead would result in a 35% increase in BPb after 150 days of exposure.
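The "multiplies the expected value of BPb by 1.10" phrasing corresponds to a regression on log-transformed blood lead, where the exponentiated coefficient is a ratio that compounds across exposure units. A minimal sketch of that interpretation (an illustration of the general technique, not the authors' fitted model; the function name is hypothetical):

```python
# Ratio of expected blood lead (BPb) per 1-unit increase in the cumulative
# water lead exposure index (CWLEI), as reported in the abstract.
RATIO_PER_UNIT = 1.10  # 95% CI: 1.06-1.15

def expected_bpb_ratio(delta_cwlei: float) -> float:
    """Multiplicative change in expected BPb for a given change in CWLEI,
    assuming log(BPb) = b0 + b1*CWLEI with exp(b1) = 1.10."""
    return RATIO_PER_UNIT ** delta_cwlei

print(expected_bpb_ratio(1))  # 1.10 -> +10% per unit
print(expected_bpb_ratio(2))  # ~1.21 -> +21% over two units
```

The compounding is why quartile contrasts in the abstract (19%, 39%) grow faster than the 10%-per-unit figure.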
Kvitsand, H.M.L., Water Resources Research, 51:9725-9745, December 2015
Virus removal during rapid transport in an unconfined, low-temperature (6°C) sand and gravel aquifer was investigated at a riverbank field site 25 km south of Trondheim in central Norway. Data from bacteriophage MS2 inactivation and transport experiments were applied in a two-site kinetic transport model using HYDRUS-1D to evaluate the mechanisms of virus removal and whether these mechanisms were sufficient to protect the groundwater supplies. The results demonstrated that inactivation contributed negligibly to overall removal and that irreversible MS2 attachment to aquifer grains coated with iron precipitates played a dominant role: 4.1 log units of MS2 were removed by attachment over a 38 m travel distance and less than 2 days of residence time. Although the total removal was high, pathways capable of allowing virus migration at rapid velocities were present in the aquifer. The risk of rapid transport of viable viruses should be recognized, particularly for water supplies without permanent disinfection.
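The "4.1 log units" above is the standard log₁₀ reduction metric; a minimal sketch of the conversion between log removal and percent removal (the values come from the abstract, the helper functions are illustrative):

```python
import math

def log_removal(c_in: float, c_out: float) -> float:
    """Log10 reduction between influent and effluent concentrations."""
    return math.log10(c_in / c_out)

def percent_removed(log_units: float) -> float:
    """Fraction removed corresponding to a given log10 reduction."""
    return 1.0 - 10.0 ** (-log_units)

# 4.1 log units of MS2 removal over 38 m of travel, as reported above:
print(f"{percent_removed(4.1):.5%}")  # ~99.992% removed
```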
Rice, J., et al., American Water Works Association Journal, 107.11:571-581, November 2015
A recently developed watershed-scale hydraulic model (De-facto Reuse Incidence in our Nation’s Consumptive Supply [DRINCS]) was applied to estimate the contribution of municipal wastewater treatment plants (WWTPs) to downstream water treatment plant (WTP) influent flow. Applying DRINCS to geocoded data for 14,651 WWTPs and 1,320 WTPs showed that the occurrence of treated municipal wastewater in drinking water supplies is geographically widespread, with a magnitude that depends largely on the flow condition and size of the source river. Under average streamflow conditions, the median contribution of wastewater flow to drinking water supplies was approximately 1%, increasing to as much as 100% under low-flow conditions (modeled by Q95). Wastewater contributions to nutrient and emerging contaminant loading were estimated and geospatially compared with the findings of the US Environmental Protection Agency’s Unregulated Contaminant Monitoring Rule and Long Term 2 Enhanced Surface Water Treatment Rule. In turn, this analysis offers important insights into the challenges facing drinking water treatment facilities across the United States.
McIlwain, B., et al., Journal of Environmental Engineering, November 2015
Exposure to lead in drinking water poses a risk of various adverse health effects, and significant efforts have been made to monitor and eliminate lead exposure in drinking water. This study focused on localizing lead exposure from 71 drinking water fountains in nonresidential buildings in order to determine the source of elevated lead and to understand fountain characteristics associated with lead concentrations in drinking water. Drinking water fountains containing lead-lined cooling tanks and brass fittings were found to release lead at concentrations in excess of 10 μg/L, and fountains with low or infrequent usage and those with cooling tanks produced the highest concentrations (in excess of 20 μg/L). One particular fountain model found at several locations throughout the institution was associated with some of the highest lead concentrations measured throughout the study. This fountain was recalled in the United States, but not in Canada. This article adds to existing research demonstrating that drinking water fountains are a potentially significant and underappreciated source of lead exposure in nonresidential buildings.
Cherci, C., et al., Environmental Science & Technology, 49.22:13724-13732, October 2015
Holistic management of water and energy resources through energy and water quality management systems (EWQMSs) has traditionally aimed at energy cost reduction, with limited or no emphasis on energy efficiency or greenhouse gas minimization. This study expanded the existing EWQMS framework and determined the impact of different management strategies for energy cost and energy consumption (e.g., carbon footprint) reduction on system performance at two drinking water utilities in California (United States). The results showed that optimizing for cost led to cost reductions of 4% (Utility B, summer) to 48% (Utility A, winter). The energy optimization strategy successfully found the lowest-energy operation and achieved energy usage reductions of 3% (Utility B, summer) to 10% (Utility A, winter). The findings revealed a potential trade-off between cost optimization (dollars) and energy use (kilowatt-hours), particularly in the summer, when minimizing energy use incurred cost increases of 64% and 184% compared with the cost optimization scenario. Water age simulations through hydraulic modeling did not reveal any adverse effects on water quality in the distribution system or in tanks from pump schedule optimization targeting either cost or energy minimization.
Pons, W., et al., PLoS ONE, October 2015
Reports of outbreaks in Canada and the United States (U.S.) indicate that approximately 50% of all waterborne diseases occur in small non-community drinking water systems (SDWSs). Summarizing these investigations to identify the factors and conditions contributing to outbreaks is needed in order to help prevent future outbreaks. The objectives of this study were to: 1) identify published reports of waterborne disease outbreaks involving SDWSs in Canada and the U.S. since 1970; 2) summarize reported factors contributing to outbreaks, including water system characteristics and events surrounding the outbreaks; and 3) identify terminology used to describe SDWSs in outbreak reports. Three electronic databases and grey literature sources were searched for outbreak reports involving SDWSs throughout Canada and the U.S. from 1970 to 2014. Two reviewers independently screened and extracted data related to water system characteristics and outbreak events. The data were analyzed descriptively with ‘outbreak’ as the unit of analysis. From a total of 1,995 citations, we identified 50 relevant articles reporting 293 unique outbreaks. Failure of an existing water treatment system (22.7%) and lack of water treatment (20.2%) were the leading causes of waterborne outbreaks in SDWSs. A seasonal trend was observed with 51% of outbreaks occurring in summer months (p<0.001). There was large variation in terminology used to describe SDWSs, and a large number of variables were not reported, including water source and whether water treatment was used (missing in 31% and 66% of reports, respectively). More consistent reporting and descriptions of SDWSs in future outbreak reports are needed to understand the epidemiology of these outbreaks and to inform the development of targeted interventions for SDWSs. Additional monitoring of water systems that are used on a seasonal or infrequent basis would be worthwhile to inform future protection efforts.
Siddhartha, R., et al., Journal of Water and Health, 13.3:645-653, September 2015
The United States Environmental Protection Agency mandates that community water systems (or water utilities) provide annual consumer confidence reports (CCRs), i.e., water quality reports, to their consumers. These reports encapsulate information regarding sources of water, detected contaminants, regulatory compliance, and educational material. They have excellent potential for providing the public with accurate information on the safety of tap water, but there is a lack of research on the degree to which the information can be understood by a large proportion of the population. This study evaluated the readability of a nationally representative sample of 30 CCRs released between 2011 and 2013. Readability (or ‘comprehension difficulty’) was evaluated using Flesch-Kincaid readability tests. The analysis revealed that CCRs were written at the 11th-14th grade level, well above the recommended 6th-7th grade level for public health communications; their readability was equivalent to that of the Harvard Law Review. These findings reveal a wide gap between current water quality reports and reports that US residents could readily understand. Suggestions are offered for reorienting the language and scientific information in CCRs to be easily comprehensible to the public.
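The Flesch-Kincaid grade level used in the study above is a simple function of average sentence length and syllables per word. A minimal sketch of the published formula (the syllable counter here is a rough vowel-group heuristic for illustration, not the tool the authors used):

```python
import re

def fk_grade(total_words: int, total_sentences: int, total_syllables: int) -> float:
    """Flesch-Kincaid grade level from aggregate text counts."""
    return (0.39 * (total_words / total_sentences)
            + 11.8 * (total_syllables / total_words)
            - 15.59)

def count_syllables(word: str) -> int:
    """Crude heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

# Example: 100 words, 5 sentences, 150 syllables -> roughly 10th-grade text.
print(round(fk_grade(100, 5, 150), 2))  # 9.91
```

Longer sentences and more polysyllabic words both push the grade up, which is why legalistic CCR prose lands at the 11th-14th grade level.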
Slabaugh, R.M., et al., American Water Works Association Journal, 107.8:389-400, August 2015
Concerns that the current Lead and Copper Rule (LCR) may not adequately protect public health have prompted the US Environmental Protection Agency (USEPA) to consider restructuring existing monitoring requirements by targeting a redefined pool of high-risk sites or altering the sampling protocol. Analysis of historical lead and copper monitoring data from 18 public water systems (PWSs) verified that a significant percentage of PWSs with lead service lines are likely to be affected by potential Long-Term Lead and Copper Rule (LT-LCR) revisions. Data were used to facilitate a national cost-of-compliance estimate for additional implementation of corrosion control treatment (CCT) necessary to comply with the LT-LCR and potential unintended consequences associated with those treatment changes. Cost estimates presented here can be used by USEPA to shape the upcoming rule and also by PWSs to assess potential costs associated with optimizing CCT for LT-LCR compliance.
Ercumen, A., et al., Environmental Health Perspectives, 122.7:651-660, July 2014
Water distribution systems are vulnerable to performance deficiencies that can cause (re)contamination of treated water and plausibly lead to increased risk of gastrointestinal illness (GII) in consumers. It is well established that large system disruptions in piped water networks can cause GII outbreaks. We hypothesized that routine network problems can also contribute to background levels of waterborne illness and conducted a systematic review and meta-analysis to assess the impact of distribution system deficiencies on endemic GII. We reviewed published studies that compared direct tap water consumption to consumption of tap water re-treated at the point of use (POU) and studies of specific system deficiencies such as breach of physical or hydraulic pipe integrity and lack of disinfectant residual. In settings with network malfunction, consumers of tap water versus POU-treated water had increased GII [incidence density ratio (IDR) = 1.34; 95% CI: 1.00, 1.79]. The subset of nonblinded studies showed a significant association between GII and tap water versus POU-treated water consumption (IDR = 1.52; 95% CI: 1.05, 2.20), but there was no association based on studies that blinded participants to their POU water treatment status (IDR = 0.98; 95% CI: 0.90, 1.08). Among studies focusing on specific network deficiencies, GII was associated with temporary water outages (relative risk = 3.26; 95% CI: 1.48, 7.19) as well as chronic outages in intermittently operated distribution systems (odds ratio = 1.61; 95% CI: 1.26, 2.07). It was concluded that tap water consumption is associated with GII in malfunctioning distribution networks. System deficiencies such as water outages are also associated with increased GII, suggesting a potential health risk for consumers served by piped water networks.
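The pooled ratio measures reported above (IDR, RR, OR) have confidence intervals that are symmetric on the log scale rather than the natural scale. A small illustrative Python check of this, using the review's reported pooled estimate IDR = 1.34 (95% CI: 1.00, 1.79) and assuming the conventional exp(log-point ± 1.96·SE) construction (an assumption about the method, not a detail stated in the abstract):

```python
import math

def log_scale_se(lower, upper, z=1.96):
    # Standard error on the log scale implied by a reported 95% CI
    # for a ratio measure (IDR, RR, OR)
    return (math.log(upper) - math.log(lower)) / (2 * z)

def ci_from_point(point, se, z=1.96):
    # Rebuild the CI from the point estimate and log-scale SE
    return (math.exp(math.log(point) - z * se),
            math.exp(math.log(point) + z * se))

# Reported pooled estimate from the review: IDR = 1.34 (95% CI: 1.00, 1.79)
se = log_scale_se(1.00, 1.79)
lo, hi = ci_from_point(1.34, se)  # recovers roughly (1.00, 1.79)
```

Rebuilding the interval this way reproduces the reported bounds to two decimals, confirming the log-symmetric construction is a reasonable reading of the pooled estimates.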
Bruchet, A., et al., Water Science and Technology: Water Supply, 14.3:383-389, June 2014
Laboratory tests were carried out with three types of new epoxy resins to assess the release of bisphenol A and F (BPA and BPF) and potential halogenated phenolic by-products. Tests were carried out over a duration of 6 months in the presence and absence of disinfectants (chlorine and chlorine dioxide) at realistic doses and contact times. None of the three systems exhibited Fickian-type diffusion for BPA. Leaching was quite low for two epoxies, while the third showed a trend of increasing leaching during the first 5 months of immersion. BPA was only observed in the absence of disinfectant, while no BPF was observed under any condition. 2,4,6-Trichlorophenol (TCP), a BPA chlorination by-product, was sporadically observed in the chlorinated water during the first months of contact. Following discontinuation of the disinfectants, TCP release was significantly enhanced in the water that had previously been exposed to chlorine. Laboratory leaching tests also indicated rapid oxidation of epoxies by chlorine and chlorine dioxide. Analysis of 27 epoxy-coated drinking water storage tanks did not reveal any BPA, BPF or TCP. On the other hand, a large-scale examination of about 200 pipe sections rehabilitated with epoxies during the 1990s led to a high frequency of BPA and BPF detection, sometimes with maximum values around 1 μg/L.
Mathieu, L., et al., Water Research, May 2014
Attempts at removal of drinking water biofilms rely on various preventive and curative strategies such as nutrient reduction in drinking water, disinfection or water flushing, which have demonstrated limited efficiency. The main reason for these failures is the cohesiveness of the biofilm driven by the physico-chemical properties of its exopolymeric matrix (EPS). Effective cleaning procedures should break up the matrix and/or change the elastic properties of bacterial biofilms. The aim of this study was to evaluate the change in the cohesive strength of two-month-old drinking water biofilms under increasing hydrodynamic shear stress τw (from ∼0.2 to ∼10 Pa) and shock chlorination (applied concentration at T0: 10 mg Cl₂/L; 60 min contact time). Biofilm erosion (cell loss per unit surface area) and cohesiveness (changes in the detachment shear stress and cluster volumes measured by atomic force microscopy (AFM)) were studied. When rapidly increasing the hydrodynamic constraint, biofilm removal was found to be dependent on a dual process of erosion and coalescence of the biofilm clusters. Indeed, 56% of the biofilm cells were removed with, concomitantly, a decrease in the number of the 50–300 μm³ clusters and an increase in the number of the smaller (i.e., 600 μm³) ones. Moreover, AFM evidenced the strengthening of the biofilm structure along with the doubling of the number of contact points, NC, per cluster volume unit following the hydrodynamic disturbance. This suggests that the compactness of the biofilm exopolymers increases with hydrodynamic stress. Shock chlorination removed cells (−75%) from the biofilm while reducing the volume of biofilm clusters. Oxidation stress resulted in a decrease in the cohesive strength profile of the remaining drinking water biofilms linked to a reduction in the number of contact points within the biofilm network structure in particular for the largest biofilm cluster volumes (>200 μm³).
Changes in the cohesive strength of drinking water biofilms subsequent to cleaning/disinfection operations call into question the effectiveness of cleaning-in-place procedures. The combined alternating use of oxidation and shear stress sequences needs to be investigated as it could be an important adjunct to improving biofilm removal/reduction procedures.
Ahmed, W., Water Research, April 2014
In this study, quantitative PCR (qPCR) was used for the detection of four opportunistic bacterial pathogens in water samples collected from 72 rainwater tanks in Southeast Queensland, Australia. Tank water samples were also tested for fecal indicator bacteria (Escherichia coli and Enterococcus spp.) using culture-based methods. Among the 72 tank water samples tested, 74% and 94% of samples contained E. coli and Enterococcus spp., respectively, and the numbers of E. coli and Enterococcus spp. in tank water samples ranged from 0.3 to 3.7 log₁₀ colony forming units (CFU) per 100 mL of water. In all, 29%, 15%, 13%, and 6% of tank water samples contained Aeromonas hydrophila, Staphylococcus aureus, Pseudomonas aeruginosa and Legionella pneumophila, respectively. The genomic units (GU) of opportunistic pathogens in tank water samples ranged from 1.5 to 4.6 log₁₀ GU per 100 mL of water. A significant correlation was found between E. coli and Enterococcus spp. numbers in pooled tank water samples data (Spearman’s rs = 0.50; P < 0.001). In contrast, fecal indicator bacteria numbers did not correlate with the presence/absence of opportunistic pathogens tested in this study. Based on the results of this study, it would be prudent to undertake a Quantitative Microbial Risk Assessment (QMRA) analysis of opportunistic pathogens to determine associated health risks for potable and nonpotable uses of tank water.
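Spearman’s rs, the statistic reported for the paired E. coli and Enterococcus counts, is simply the Pearson correlation computed on ranks, which makes it robust to the wide log₁₀ concentration ranges seen in tank water. A self-contained sketch with made-up paired log₁₀ CFU values (illustrative only, not the study’s data):

```python
def rankdata(xs):
    # Assign 1-based ranks; tied values share the average rank of their group
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Pearson correlation applied to the ranks of x and y
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical paired log10 CFU/100 mL counts for two indicator organisms
ecoli = [0.3, 1.2, 2.0, 2.8, 3.7]
entero = [0.5, 2.5, 1.0, 2.6, 3.5]
rs = spearman(ecoli, entero)  # 0.9 for these mostly concordant ranks
```

Because only rank order matters, the same rs would result from the raw CFU counts or their log₁₀ transforms, which is one reason the statistic suits microbial count data.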
Burroughs, A.D., and Rin, D., Food and Drug Law Institute, November/December 2012
This article examines three recent cases brought under the controversial Park doctrine in search of clues to the doctrine’s future. The responsible corporate officer (RCO) doctrine, also known as the Park doctrine, allows for criminal prosecution of individuals, typically high-ranking corporate executives of pharmaceutical companies, for violations of the Food, Drug and Cosmetic Act (FDCA), even absent any proof of the individual defendant’s knowledge of or participation in the violation. It is relevant to drinking water because the FDCA, and with it the Park doctrine, applies to bottled water but not to tap water.
Triantafyllidou, S. and Edwards, M., Critical Reviews in Environmental Science and Technology, June 2012
Lead is widely recognized as one of the most pervasive environmental health threats in the United States, and there is increased concern over adverse health impacts at levels of exposure once considered safe. Lead contamination of tap water was once a major cause of lead exposure in the United States and, as other sources have been addressed, the relative contribution of lead in water to lead in blood is expected to become increasingly important. Moreover, prior research suggests that lead in water may be more important as a source than is presently believed. The authors describe sources of lead in tap water, chemical forms of the lead, and relevant U.S. regulations/guidelines, while considering their implications for human exposure. Research that examined associations between water lead levels and blood lead levels is critically reviewed, and some of the challenges in making such associations, even if lead in water is the dominant source of lead in blood, are highlighted. Better protecting populations at risk from this and from other lead sources is necessary if the United States is to achieve its goal of eliminating elevated blood lead levels in children by 2020.
Reynolds, K.A., Water Conditioning and Purification, February 2007
The quality of water at the end use is impacted by numerous and varied factors, including source water type and quality, age of the distribution system, climatic events and even consumer use patterns. Therefore, providing high-quality drinking water at the tap requires a multi-barrier approach aimed at source water protection, source water treatment and reliable distribution. Each of these steps is monitored and controlled by municipal water treatment standards and guidelines; however, what happens to the water quality beyond the service connection at individual sites is not as well understood. New reports of water quality deterioration in the plumbing of residential or commercial buildings, known as premise plumbing, pose the question: just what is present in our pipes?