Qualitative detection of E. coli in distributed drinking water using real-time reverse transcription PCR targeting 16S rRNA: Validation and practical experiences

Leo Heijnen, Hendrik Jan de Vries, Gabi van Pelt, Eline Stroobach, Adrie Atsma, Jerom Vranken, Katrien De Maeyer, Liesbeth Vissers, Gertjan Medema



  • A validated RT-PCR method targeting 16S ribosomal RNA to qualitatively detect E. coli in drinking water at a sensitivity of 1 CFU/100 ml.

  • The method is as sensitive as the reference culture method.

  • The E. coli RT-PCR showed high interlaboratory reproducibility.

  • An extensive comparison on practical samples showed good agreement between RT-PCR and the culture method.

  • Application of the RT-PCR will improve drinking water safety for consumers.


Estimated Childhood Lead Exposure From Drinking Water in Chicago

Benjamin Q. Huynh, PhD; Elizabeth T. Chin, PhD; Mathew V. Kiang, ScD

Key Points

Question  What is the extent and impact of lead-contaminated drinking water in Chicago, Illinois?

Findings  In this cross-sectional study, an estimated 68% of children younger than 6 years in Chicago are exposed to lead-contaminated drinking water, with 19% of affected children using unfiltered tap water as their primary drinking water source. Predominantly Black and Hispanic blocks were disproportionately less likely to be tested for lead yet disproportionately exposed to contaminated drinking water.

Meaning  Childhood lead exposure from drinking water is widespread in Chicago, with racial inequities in both testing rates and exposure levels.


Importance  There is no level of lead in drinking water considered to be safe, yet lead service lines are still commonly used in water systems across the US.

Objective  To identify the extent of lead-contaminated drinking water in Chicago, Illinois, and model its impact on children younger than 6 years.

Design, Setting, and Participants  For this cross-sectional study, a retrospective assessment was performed of lead exposure based on household tests collected from January 2016 to September 2023. Tests were obtained from households in Chicago that registered for a free self-administered testing service for lead exposure. Machine learning and microsimulation were used to estimate citywide childhood lead exposure.

Exposure  Lead-contaminated drinking water, measured in parts per billion.

Main Outcomes and Measures  Number of children younger than 6 years exposed to lead-contaminated water.

Results  A total of 38 385 household lead tests were collected. An estimated 68% (95% uncertainty interval, 66%-69%) of children younger than 6 years were exposed to lead-contaminated water, corresponding to 129 000 children (95% uncertainty interval, 128 000-131 000 children). Ten-percentage-point increases in block-level Black and Hispanic populations were associated with 3% (95% CI, 2%-3%) and 6% (95% CI, 5%-7%) decreases in odds of being tested for lead and 4% (95% CI, 3%-6%) and 11% (95% CI, 10%-13%) increases in having lead-contaminated drinking water, respectively.

Conclusions and Relevance  These findings indicate that childhood lead exposure is widespread in Chicago, and racial inequities are present in both testing rates and exposure levels. Machine learning may assist in preliminary screening for lead exposure, and efforts to remediate the effects of environmental racism should involve improving outreach for and access to lead testing services.

Four buildings and a flush: Lessons from degraded water quality and recommendations on building water management

Kyungyeon Ra, Caitlin Proctor, Christian Ley, Danielle Angert, Yoorae Noh, Tolulope Odimayomi, Andrew J. Whelton


A reduction in building occupancy can lead to stagnant water in plumbing, and the potential consequences for water quality have gained increasing attention. To investigate this, a study was conducted during the COVID-19 pandemic, focusing on water quality in four institutional buildings. Two of these buildings were old (>58 years) and large (>19,000 m2), while the other two were new (<13 years) and small (<11,000 m2). The study revealed significant decreases in water usage in the small buildings, whereas usage remained unchanged in the large buildings. Initial analysis found that residual chlorine was rarely detectable in cold/drinking water samples. Furthermore, the pH, dissolved oxygen, total organic carbon, and total cell count levels in the first draw of cold water samples were similar across all buildings. However, the ranges of heavy metal concentrations in large buildings were greater than those observed in small buildings. Copper (Cu), lead (Pb), and manganese (Mn) sporadically exceeded drinking water limits at cold water fixtures, with maximum concentrations of 2.7 mg Cu L−1, 45.4 μg Pb L−1, and 1.9 mg Mn L−1. Flushing the plumbing for 5 min resulted in detectable residual chlorine at fixtures in three buildings, but even after 125 min of flushing in the largest and oldest building, no residual chlorine was detected at the fixture closest to the building’s point of entry. During the pandemic, the building owner conducted fixture flushing, operating one to a few fixtures per visit in buildings with hundreds of fixtures and multiple floors. However, further research is needed to understand the fundamental processes that control faucet water quality from the service line to the faucet. In the absence of this knowledge, building owners should create and use as-built drawings to develop flushing plans and conduct periodic water testing.


Association between drinking water quality and mental health and the modifying role of diet: a prospective cohort study

Zhou, S., Su, M., Shen, P. et al. Association between drinking water quality and mental health and the modifying role of diet: a prospective cohort study. BMC Med 22, 53 (2024). https://doi.org/10.1186/s12916-024-03269-3

Environmental factors play an important role in the development of mental disorders. This study aimed to investigate the associations of metal and nonmetal elements in drinking water with the risk of depression and anxiety, and to assess whether diet modulates these associations.


Water arsenic, including in public water, is linked to higher urinary arsenic totals among the U.S. population

Highest concentrations found in the West and South and among Mexican American and other Hispanic participants

Date: April 20, 2023
Source: Columbia University’s Mailman School of Public Health
Summary: A new study shows that water arsenic levels are linked to higher urinary arsenic among the U.S. population for users of both private wells and public water systems.

Journal Reference:

  1. Maya Spaur, Melissa A. Lombard, Joseph D. Ayotte, Benjamin C. Bostick, Steven N. Chillrud, Ana Navas-Acien, Anne E. Nigra. Cross-sectional associations between drinking water arsenic and urinary inorganic arsenic in the United States: NHANES 2003–2014. Environmental Research, 2023; 227: 115741. DOI: 10.1016/j.envres.2023.115741

A national survey of lead and other metal(loids) in residential drinking water in the United States

Karen D Bradham, Clay M Nelson, Tyler D Sowers, Darren A Lytle, Jennifer Tully, Michael R Schock, Kevin Li, Matthew D Blackmon, Kasey Kovalcik, David Cox, Gary Dewalt, Warren Friedman, Eugene A Pinzer, Peter J Ashley
PMID: 35986209 DOI: 10.1038/s41370-022-00461-6


Background: Exposure to lead (Pb), arsenic (As) and copper (Cu) may cause significant health issues including harmful neurological effects, cancer or organ damage. Determination of human exposure-relevant concentrations of these metal(loids) in drinking water, therefore, is critical.

Objective: We sought to characterize exposure-relevant Pb, As, and Cu concentrations in drinking water collected from homes participating in the American Healthy Homes Survey II, a national survey that monitors the prevalence of Pb and related hazards in United States homes.

Methods: Drinking water samples were collected from a national survey of 678 U.S. homes where children may live using an exposure-based composite sampling protocol. Relationships between metal(loid) concentration, water source and house age were evaluated.

Results: 18 of 678 (2.6%) samples analyzed exceeded 5 µg Pb L−1 (mean = 1.0 µg L−1). 1.5% of samples exceeded 10 µg As L−1 (mean = 1.7 µg L−1) and 1,300 µg Cu L−1 (mean = 125 µg L−1). Private well samples were more likely to exceed metal(loid) concentration thresholds than public water samples. Pb concentrations were correlated with Cu and Zn, indicative of brass as a common Pb source in samples analyzed.

Significance: Results represent the largest national-scale effort to date to inform exposure risks to Pb, As, and Cu in drinking water in U.S. homes using an exposure-based composite sampling approach.

Impact statement: To date, there are no national-level estimates of Pb, As and Cu in US drinking water collected from household taps using an exposure-based sampling protocol. Therefore, assessing public health impacts from metal(loids) in drinking water remains challenging. Results presented in this study represent the largest effort to date to test for exposure-relevant concentrations of Pb, As and Cu in US household drinking water, providing a critical step toward improved understanding of metal(loid) exposure risk.

Keywords: Arsenic; Copper; Drinking water; Human exposure; Lead.

Outside the Safe Operating Space of a New Planetary Boundary for Per- and Polyfluoroalkyl Substances (PFAS)

Ian T. Cousins, Jana H. Johansson, Matthew E. Salter, Bo Sha, and Martin Scheringer

Environ. Sci. Technol. 2022, 56, 16, 11172–11179
Publication Date:August 2, 2022
Copyright © 2022 The Authors. Published by American Chemical Society

ABSTRACT: It is hypothesized that environmental contamination by per- and polyfluoroalkyl substances (PFAS) defines a separate planetary boundary and that this boundary has been exceeded. This hypothesis is tested by comparing the levels of four selected perfluoroalkyl acids (PFAAs) (i.e., perfluorooctanesulfonic acid (PFOS), perfluorooctanoic acid (PFOA), perfluorohexanesulfonic acid (PFHxS), and perfluorononanoic acid (PFNA)) in various global environmental media (i.e., rainwater, soils, and surface waters) with recently proposed guideline levels. On the basis of the four PFAAs considered, it is concluded that (1) levels of PFOA and PFOS in rainwater often greatly exceed US Environmental Protection Agency (EPA) Lifetime Drinking Water Health Advisory levels, and the sum of the aforementioned four PFAAs (Σ4 PFAS) in rainwater is often above Danish drinking water limit values also based on Σ4 PFAS; (2) levels of PFOS in rainwater are often above the Environmental Quality Standard for Inland European Union Surface Water; and (3) atmospheric deposition also leads to global soils being ubiquitously contaminated and often above proposed Dutch guideline values. It is, therefore, concluded that the global spread of these four PFAAs in the atmosphere has led to the planetary boundary for chemical pollution being exceeded. Levels of PFAAs in atmospheric deposition are especially poorly reversible because of the high persistence of PFAAs and their ability to continuously cycle in the hydrosphere, including on sea spray aerosols emitted from the oceans. Because of the poor reversibility of environmental exposure to PFAS and their associated effects, it is vitally important that PFAS uses and emissions are rapidly restricted.
KEYWORDS: PFAS, planetary boundary, chemical pollution, environmental exposure

Ultra-Short-Chain PFASs in the Sources of German Drinking Water: Prevalent, Overlooked, Difficult to Remove, and Unregulated


Abstract Image

Per- and polyfluoroalkyl substances (PFASs) have been a focal point of environmental chemistry and chemical regulation in recent years, culminating in a shift from individual PFAS regulation toward a PFAS group regulatory approach in Europe. PFASs are a highly diverse group of substances, and knowledge about this group is still scarce beyond the well-studied, legacy long-chain, and short-chain perfluorocarboxylates (PFCAs) and perfluorosulfonates (PFSAs). Herein, quantitative and semiquantitative data for 43 legacy short-chain and ultra-short-chain PFASs (≤2 perfluorocarbon atoms for PFCAs, ≤3 for PFSAs and other PFASs) in 46 water samples collected from 13 different sources of German drinking water are presented. The PFASs considered include novel compounds like hexafluoroisopropanol, bis(trifluoromethylsulfonyl)imide, and tris(pentafluoroethyl)trifluorophosphate. The ultra-short-chain PFASs trifluoroacetate, perfluoropropanoate, and trifluoromethanesulfonate were ubiquitous and present at the highest concentrations (98% of sum target PFAS concentrations). “PFAS total” parameters like the adsorbable organic fluorine (AOF) and total oxidizable precursor (TOP) assay were found to provide only an incomplete picture of PFAS contamination in these water samples by not capturing these highly prevalent ultra-short-chain PFASs. These ultra-short-chain PFASs represent a major challenge for drinking water production and show that regulation in the form of preventive measures is required to manage them.


Comparative effectiveness of membrane technologies and disinfection methods for virus elimination in water: A review

Chao Chen, Lihui Guo, Yu Yang, Kumiko Oguma, Li-An Hou



The pandemic of the 2019 novel coronavirus disease (COVID-19) has brought viruses into the public eye. Since viruses can pose a threat to human health even at low concentrations, seeking efficient virus removal methods has been a research hotspot in the past few years. Herein, a total of 1060 research papers were collected from the Web of Science database to identify technological trends as well as the research status. Based on the analysis results, this review elaborates on the state of the art of membrane filtration and disinfection technologies for the treatment of virus-containing wastewater and drinking water. The results show that membrane and disinfection methods achieve a broad range of virus removal efficiencies (0.5-7 log reduction values (LRVs) and 0.09-8 LRVs, respectively), attributable to the various interactions between membranes or disinfectants and viruses whose capsid proteins and nucleic acids differ in susceptibility. Moreover, this review discusses the related challenges and potential of membrane and disinfection technologies for customized virus removal in order to prevent the dissemination of waterborne diseases.
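The log reduction values (LRVs) used throughout the review are simply the base-10 logarithm of the influent-to-effluent concentration ratio. A minimal sketch of that calculation, with illustrative numbers that are not taken from the review:

```python
import math

def log_reduction_value(c_influent: float, c_effluent: float) -> float:
    """Log reduction value: log10 of influent over effluent virus concentration."""
    if c_influent <= 0 or c_effluent <= 0:
        raise ValueError("concentrations must be positive")
    return math.log10(c_influent / c_effluent)

# Illustrative: 1e6 virus particles/L in the feed, 1e2 in the permeate
lrv = log_reduction_value(1e6, 1e2)
print(f"LRV = {lrv:.1f}")  # LRV = 4.0
```

An LRV of 4 thus corresponds to 99.99% removal, which is why the 0.5-8 LRV range quoted above spans several orders of magnitude in treatment performance.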

Keywords: Disinfection; Drinking water treatment; Membrane; Virus removal; Wastewater treatment.

Relationships between regulated DBPs and emerging DBPs of health concern in U.S. drinking water

Stuart W. Krasner, Ai Jia, Chih-Fen T. Lee, Raha Shirkhani, Joshua M. Allen, Susan D. Richardson, Michael J. Plewa



A survey was conducted at eight U.S. drinking water plants that spanned a wide range of water qualities and treatment/disinfection practices. Plants that treated heavily wastewater-impacted source waters had lower trihalomethane to dihaloacetonitrile ratios due to the presence of more organic nitrogen and HAN precursors. As the bromide to total organic carbon ratio increased, there was more bromine incorporation into DBPs. This has been shown in other studies for THMs and selected emerging DBPs (HANs), whereas this study examined bromine incorporation for a wider group of emerging DBPs (haloacetaldehydes, halonitromethanes). Moreover, bromine incorporation into the emerging DBPs was, in general, similar to that of the THMs. Associations between adverse health effects and brominated THMs observed in epidemiology studies may therefore be due to the formation of brominated emerging DBPs of health concern. Plants with higher free chlorine contact times before ammonia addition to form chloramines had less iodinated DBP formation in chloraminated distribution systems, because more of the iodide was oxidized by the chlorine to iodate (a sink for the iodide). This has been shown in many bench-scale studies (primarily for iodinated THMs), but seldom in full-scale studies (where this study also showed the impact on total organic iodine). Collectively, the THMs, haloacetic acids, and emerging DBPs accounted for a significant portion of the TOCl, TOBr, and TOI; however, ∼50% of the TOCl and TOBr is still unknown. The correlation of the sum of detected DBPs with the TOCl and TOBr suggests that they can be used as reliable surrogates.
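Bromine incorporation of the kind discussed above is commonly summarized with a bromine incorporation factor: the molar-weighted average number of bromine atoms per DBP molecule in a class. A hedged sketch for the four THMs under that standard definition, using made-up concentrations rather than survey data:

```python
def bromine_incorporation_factor(thm_molar):
    """Bromine incorporation factor for the four THMs (molar concentrations).

    thm_molar: dict keyed by number of Br atoms per molecule --
    0 = chloroform, 1 = bromodichloromethane,
    2 = dibromochloromethane, 3 = bromoform.
    Returns a value between 0 (all-chlorine) and 3 (all-bromine).
    """
    total = sum(thm_molar.values())
    if total == 0:
        raise ValueError("no THMs measured")
    return sum(n_br * conc for n_br, conc in thm_molar.items()) / total

# Illustrative molar concentrations (nmol/L), not from the survey
sample = {0: 50.0, 1: 30.0, 2: 15.0, 3: 5.0}
print(round(bromine_incorporation_factor(sample), 2))  # 0.75
```

A rising bromide to total organic carbon ratio in the source water shifts this factor upward, which is the pattern the survey reports for both the THMs and the emerging DBP classes.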

Using Water Intake Dietary Recall Data to Provide a Window into US Water Insecurity

Asher Y Rosinger

The Journal of Nutrition, Volume 152, Issue 5, May 2022, Pages 1263–1273, https://doi.org/10.1093/jn/nxac017



In the United States, problems with the provision of safe, affordable water have resulted in an increasing number of adults who avoid their tap water, which could indicate underlying water insecurity. Dietary recalls provide critical nutritional surveillance data, yet have been underexplored as a water insecurity monitoring tool.


This article aims to demonstrate how water intake variables from dietary recall data relate to and predict a key water insecurity proxy, that is, tap water avoidance.


Using 2005–2018 NHANES data from 32,329 adults, I examine distributions and trends of mean intakes of total, plain (sum of tap and bottled water), tap, and bottled water, and percentage consuming no tap and exclusive bottled water. Second, I use multiple linear and logistic regressions to test how tap water avoidance relates to plain water intake and sugar-sweetened beverage (SSB) consumption. Next, I use receiver operating characteristics (ROC) curves to test the predictive accuracy of no plain water, no tap, and exclusive bottled water intake, and varying percentages of plain water consumed from tap water compared with tap water avoidance.


Trends indicate increasing plain water intake between 2005 and 2018, driven by increasing bottled water intake. In 2017–2018, 51.4% of adults did not drink tap water on a given day, whereas 35.8% exclusively consumed bottled water. Adults who avoided their tap water consumed less tap and plain water, and significantly more bottled water and SSBs, on a given day. No tap water intake and categories of tap water intake yielded areas under the ROC curve of 77% and 78%, respectively, in predicting tap water avoidance.
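The area under the ROC curve reported here has a rank-based (Mann-Whitney) interpretation: the probability that a randomly chosen tap-water avoider receives a higher predictor score than a randomly chosen non-avoider. A minimal sketch of that computation with invented binary scores, not NHANES data:

```python
def auc_from_scores(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic:
    fraction of (positive, negative) pairs where the positive scores
    higher, counting ties as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical binary predictor: 1 if the recall shows no tap water intake
avoiders = [1, 1, 1, 0]       # scores for adults who avoid tap water
non_avoiders = [0, 0, 1, 0]   # scores for adults who do not
print(auc_from_scores(avoiders, non_avoiders))  # 0.75
```

With a single binary predictor the ROC curve has one interior point, so this pairwise definition is the most transparent way to see why AUCs in the high 70s indicate useful but imperfect discrimination.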


This study demonstrates that water intake variables from dietary recalls can be used to accurately predict tap water avoidance and provide a window into water insecurity. Growing reliance on bottled water could indicate increasing concerns about tap water.

Occurrence of nitrosamines and their precursors in North American drinking waters.

Krasner, S. W.; Roback, S.; (…); Bukhari, Z.

2020 | AWWA Water Science

Eight N-nitrosamines were measured at 37 water plants in the United States and Canada. Five tobacco-specific nitrosamines (TSNAs) were measured in selected waters. N-Nitrosodimethylamine (NDMA) was preferentially formed in chloraminated systems (maximum detention time: median 4.4 ng/L). A small amount was detected in some chlorinated systems (90th percentile <2.0 ng/L). After ozone (before chloramines), NDMA was sometimes detected (90th percentile 2.9 ng/L), suggesting that the ozone reacted with precursors to form NDMA. The chloramine plants that temporarily switched to chlorine typically produced less NDMA (Plant 29 reduced NDMA formation, on average, from 34 to 4 ng/L). More NDMA was produced during spring runoff, when there were elevated levels of ammonia and NDMA precursors in the source water. More NDMA was also formed when higher levels of poly(diallyldimethylammonium chloride) (polyDADMAC) were used. N-Nitrosomorpholine was found to be a contaminant and not a disinfection byproduct; it did not increase during chloramination. TSNAs were produced during spring runoff; source water ammonia impacted the chlor(am)ine chemistry. © 2020 American Water Works Association

Developing a framework for classifying water lead levels at private drinking water systems: A Bayesian Belief Network approach


The presence of lead in drinking water creates a public health crisis, as lead causes neurological damage at low levels of exposure. The objective of this research is to explore modeling approaches to predict the risk of lead at private drinking water systems. This research uses Bayesian Network approaches to explore interactions among household characteristics, geological parameters, observations of tap water, and laboratory tests of water quality parameters. A knowledge discovery framework is developed by integrating methods for data discretization, feature selection, and Bayes classifiers. Forward selection and backward selection are explored for feature selection. Discretization approaches, including domain-knowledge, statistical, and information-based approaches, are tested to discretize continuous features. Bayes classifiers tested include General Bayesian Network, Naive Bayes, and Tree-Augmented Naive Bayes, which are applied to identify Directed Acyclic Graphs (DAGs). Bayesian inference is used to fit conditional probability tables for each DAG. The Bayesian framework is applied to fit models for a dataset collected by the Virginia Household Water Quality Program (VAHWQP), which collected water samples and conducted household surveys at 2,146 households that use private water systems, including wells and springs, in Virginia during 2012 and 2013. Relationships among laboratory-tested water quality parameters, observations of tap water, and household characteristics, including plumbing type, source water, household location, and on-site water treatment, are explored to develop features for predicting water lead levels. Results demonstrate that Naive Bayes classifiers perform best, based on recall and precision, when compared with other classifiers. Copper is the most significant predictor of lead, and other important predictors include county, pH, and on-site water treatment. Feature selection methods have a marginal effect on performance, and discretization methods can greatly affect model performance when paired with classifiers. Owners of private wells remain disadvantaged and may be at an elevated level of risk, because utilities and governing agencies are not responsible for ensuring that lead levels meet the Lead and Copper Rule for private wells. Insight gained from models can be used to identify water quality parameters, plumbing characteristics, and household variables that increase the likelihood of high water lead levels to inform decisions about lead testing and treatment.
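The Naive Bayes classifier that performed best in this study can be illustrated with a toy categorical implementation. The features (a copper band and a plumbing type) and the four training rows below are hypothetical, chosen only to mirror the predictors named in the abstract; they are not VAHWQP data:

```python
import math
from collections import Counter, defaultdict

def train_naive_bayes(rows, labels):
    """Categorical Naive Bayes with add-one (Laplace) smoothing.
    Returns class counts, per-(class, feature) value counts, and
    per-feature vocabularies."""
    class_counts = Counter(labels)
    value_counts = defaultdict(Counter)   # (label, feature index) -> value counts
    vocab = defaultdict(set)              # feature index -> observed values
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            value_counts[(y, i)][v] += 1
            vocab[i].add(v)
    return class_counts, value_counts, vocab, len(labels)

def predict(model, row):
    """Pick the class maximizing log prior + sum of log conditionals."""
    class_counts, value_counts, vocab, n = model
    best, best_score = None, float("-inf")
    for y, cy in class_counts.items():
        score = math.log(cy / n)
        for i, v in enumerate(row):
            counts = value_counts[(y, i)]
            score += math.log((counts[v] + 1) / (cy + len(vocab[i])))
        if score > best_score:
            best, best_score = y, score
    return best

# Hypothetical (copper band, plumbing type) features and lead labels
rows = [("high", "brass"), ("high", "brass"), ("low", "plastic"), ("low", "plastic")]
labels = ["high_pb", "high_pb", "low_pb", "low_pb"]
model = train_naive_bayes(rows, labels)
print(predict(model, ("high", "brass")))  # high_pb
```

The conditional independence assumption that gives Naive Bayes its name is what keeps the conditional probability tables small, which is plausibly why it outperformed the richer DAG structures on this modest survey dataset.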

Keywords: Bayesian Belief Network; Contamination Classification; Lead in Drinking Water; Water Quality.

Application of Capsid Integrity (RT-)qPCR to Assessing Occurrence of Intact Viruses in Surface Water and Tap Water in Japan


Capsid integrity (RT-)qPCR has recently been developed to discriminate intact forms of viruses from inactivated forms, but its applicability to assessing the integrity of viruses in drinking water has remained limited. In this study, we investigated the application of capsid integrity (RT-)qPCR using cis-dichlorodiammineplatinum (CDDP) with sodium deoxycholate (SD) pretreatment (SD-CDDP-(RT-)qPCR) to detect intact viruses in surface water and tap water. A total of 63 water samples (surface water, n = 20; tap water, n = 43) were collected in the Kanto region in Japan and quantified by conventional (RT-)qPCR and SD-CDDP-(RT-)qPCR for pepper mild mottle virus (PMMoV) and seven other viruses pathogenic to humans (Aichivirus (AiV), noroviruses of genotypes I and II, enterovirus, adenovirus types 40 and 41, and JC and BK polyomaviruses). In surface water, PMMoV (100%) was more frequently detected than other human pathogenic viruses (30%-60%), as determined by conventional (RT-)qPCR. SD-CDDP-(RT-)qPCR also revealed that intact PMMoV (95%) was more common than intact human pathogenic viruses (20%-45%). In the tap water samples, most of the target viruses were not detected by conventional (RT-)qPCR, except for PMMoV (9%) and AiV (5%). PMMoV remained positive (5%), whereas no AiV was detected when tested by SD-CDDP-(RT-)qPCR, indicating that some PMMoV had an intact capsid, whereas AiV had damaged capsids. The presence of AiV in the absence of PMMoV in tap water produced from groundwater may demonstrate the limitation of PMMoV as a viral indicator in groundwater. In addition to being abundant in surface water, PMMoV was detected in tap water, including PMMoV with intact capsids. Thus, the absence of intact PMMoV may be used to guarantee the viral safety of tap water produced from surface water.

Keywords: Capsid integrity (RT-)qPCR; drinking water; intact virus; viral indicator; virus occurrence.

A Methodology for Assessing Groundwater Pollution Hazard by Nitrates from Agricultural Sources: Application to the Gallocanta Groundwater Basin (Spain)

Authors: Orellana-Macias, Jose Maria; Perles Rosello, Maria Jesus; Causape, Jesus

SUSTAINABILITY, Volume 13, Issue 11, Article Number 6321, Published: JUN 2021 – https://doi.org/10.3390/su13116321

Groundwater pollution by nitrates from agricultural sources is a common environmental issue. In order to support risk analysis, hazard maps are used to classify land uses according to their potential of pollution. The aim of this study is to propose a new hazard index based on nitrogen input and its connection with nitrate concentration in groundwater. The effectiveness of the Nitrogen Input Hazard Index was tested in the Gallocanta Groundwater Basin (Spain), a highly polluted area, declared as a Nitrate Vulnerable Zone. Agricultural data at a plot scale were used to estimate the nitrogen fertilizer requirement of each crop, and the correlation between nitrogen input and nitrate concentration in groundwater was explored. The resulting hazard map allows us to delimit the most hazardous areas, which can be used to implement more accurate nitrate pollution control programs. The index was proven to successfully estimate nitrogen input influence over groundwater nitrate concentration, and to be able to create hazard maps. The criterion used to create categories was empirically based on nitrate concentration thresholds established by the EU Nitrate Directive. The Nitrogen Input Hazard Index may be a useful tool to support risk analyses of agricultural activities in vulnerable areas, where nitrate pollution could endanger human water supply.

Analysis of microplastics in drinking water and other clean water samples with micro-Raman and micro-infrared spectroscopy: minimum requirements and best practice guidelines

Authors:  Schymanski, D., Oßmann, B. E., Benismail, N., Boukerma, K., Dallmann, G., Esch, E. von der, Fischer, D., Fischer, F., Gilliland, D., Glas, K., Hofmann, T., Käppler, A., Lacorte, S., Marco, J., Rakwe, M. EL, Weisser, J., Witzig, C., Zumbülte, N., & Ivleva, N. P. (2021). Analysis of microplastics in drinking water and other clean water samples with micro-Raman and micro-infrared spectroscopy: minimum requirements and best practice guidelines. Analytical and Bioanalytical Chemistry 2021, 1–26. https://doi.org/10.1007/S00216-021-03498-Y


Microplastics are a widespread contaminant found not only in various natural habitats but also in drinking waters. With spectroscopic methods, the polymer type, number, size, and size distribution as well as the shape of microplastic particles in waters can be determined, which is of great relevance to toxicological studies. Methods used in studies so far show a huge diversity regarding experimental setups and often a lack of certain quality assurance aspects. To overcome these problems, this critical review and consensus paper of 12 European analytical laboratories and institutions, dealing with microplastic particle identification and quantification with spectroscopic methods, gives guidance toward harmonized microplastic particle analysis in clean waters. The aims of this paper are to (i) improve the reliability of microplastic analysis, (ii) facilitate and improve the planning of sample preparation and microplastic detection, and (iii) provide a better understanding regarding the evaluation of already existing studies. With these aims, we hope to make an important step toward harmonization of microplastic particle analysis in clean water samples and, thus, allow the comparability of results obtained in different studies by using similar or harmonized methods. Clean water samples, for the purpose of this paper, are considered to comprise all water samples with low matrix content, in particular drinking, tap, and bottled water, but also other water types such as clean freshwater.


Discussion and conclusions

The lack of harmonized methods and analytical standard substances, and the difficulty of validating methods, account for the variable and sometimes even contradictory data [15]. The quality criteria proposed by Koelmans and colleagues include the sampling method, sample size, sample processing and storage, laboratory preparation, clean air conditions, positive and negative controls, sample treatment, and polymer identification. The present consensus paper discusses and sums up details regarding the most important spectroscopic methods that can be used for MP analysis in clean water. All of the above-mentioned quality criteria were integrated in this guideline (see Table 1). It allows the reader to compare and evaluate existing studies. Furthermore, the guidelines can be used to better understand existing studies and thus make more advantageous choices when setting up MP research studies. The best practice approaches given here will contribute to better harmonization of analytical methods for MP analysis in clean water samples down to 1 μm. A schematic overview of important precautions for MP analysis and sampling advice is given in Figure 1.

All these elements are intended to support the standardization processes throughout the different normalization committees. While this consensus paper from twelve European analytical laboratories and institutions has concentrated on (FT)IR/RM methods, both spectroscopic and thermo-analytical methods are required for the purpose of monitoring as well as gaining a more comprehensive knowledge of MP contamination in food, water, air, and environmental samples. Therefore, it will also be important that similar consideration be given to harmonizing thermo-analytical methods for MP detection. Above all, an ongoing exchange among scientists and laboratories, interlaboratory comparison (ILC) studies with certified polymer standards, and coordinating structures are required. These will pave the way to progress in the harmonization and standardization of MP detection and allow for representative and reliable MP analysis in different environmental and food samples.

Associations between private well water and community water supply arsenic concentrations in the conterminous United States

Authors: Spaur, Maya; Lombard, Melissa A.; Ayotte, Joseph D.; et al.


Geogenic arsenic contamination typically occurs in groundwater as opposed to surface water supplies. Groundwater is a major source for many community water systems (CWSs) in the United States (US). Although the US Environmental Protection Agency sets the maximum contaminant level (MCL enforceable since 2006: 10 μg/L) for arsenic in CWSs, private wells are not federally regulated. We evaluated county-level associations between modeled values of the probability of private well arsenic exceeding 10 μg/L and CWS arsenic concentrations for 2231 counties in the conterminous US, using time invariant private well arsenic estimates and CWS arsenic estimates for two time periods. Nationwide, county-level CWS arsenic concentrations increased by 8.4 μg/L per 100% increase in the probability of private well arsenic exceeding 10 μg/L for 2006-2008 (the initial compliance monitoring period after MCL implementation), and by 7.3 μg/L for 2009-2011 (the second monitoring period following MCL implementation) (1.1 μg/L mean decline over time). Regional differences in this temporal decline suggest that interventions to implement the MCL were more pronounced in regions served primarily by groundwater. The strong association between private well and CWS arsenic in Rural, American Indian, and Semi Urban, Hispanic counties suggests that future research and regulatory support are needed to reduce water arsenic exposures in these vulnerable subpopulations. This comparison of arsenic exposure values from major private and public drinking water sources nationwide is critical to future assessments of drinking water arsenic exposure and health outcomes.
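The reported 8.4 μg/L increase per 100% increase in exceedance probability corresponds to the slope of a county-level linear model. A minimal ordinary least-squares sketch of that kind of fit, using fabricated illustrative numbers rather than the study's county data:

```python
def ols_slope_intercept(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Illustrative county-level data: probability that private well arsenic
# exceeds 10 ug/L (x) versus mean CWS arsenic in ug/L (y). Made up.
prob = [0.0, 0.25, 0.5, 0.75, 1.0]
cws_as = [1.0, 3.0, 5.0, 7.0, 9.0]
a, b = ols_slope_intercept(prob, cws_as)
print(round(b, 1))  # slope: ug/L of CWS arsenic per unit exceedance probability
```

In this framing, moving the private well exceedance probability from 0 to 1 (a "100% increase") raises predicted CWS arsenic by the slope `b`, which is how the study's 8.4 and 7.3 μg/L figures for the two monitoring periods should be read.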

Effect of concentration on virus removal for ultrafiltration membrane in drinking water production

Authors: Jacquet, N.; Wurtzer, S.; Darracq, G.; et al.

Removal of pathogenic microorganisms such as viruses during drinking water production was evaluated by ultrafiltration. Two enteric viruses (ADV 41 and CV-B5) were compared to the MS2 bacteriophage, widely used in the literature and by membrane producers as an enteric virus surrogate. The effect of the feed concentration of viruses on ultrafiltration efficiency was assessed. For all three viruses, low retentions of about 1 log were observed at the lowest concentrations. At higher concentrations, removal increased up to 3.0 log for CV-B5 and the MS2 phage and 3.5 log for ADV 41. These results highlight the potential overestimation of UF efficiency in laboratory experiments performed at high concentrations, compared with the low concentrations found in the environmental resources used for drinking water production. Virus removals with Evian water and real groundwater were compared, and groundwater achieved similar or slightly higher removals for the three viruses. Finally, the impact of membrane ageing after chlorine exposure was examined. Membrane degradation, visible as an increase in water permeability with exposure dose, did not affect the removal of viruses at low feed concentrations.


A Methodology for Assessing Groundwater Pollution Hazard by Nitrates from Agricultural Sources: Application to the Gallocanta Groundwater Basin (Spain)

Authors: Orellana-Macias, Jose Maria; Perles Rosello, Maria Jesus; Causape, Jesus

Abstract: Groundwater pollution by nitrates from agricultural sources is a common environmental issue. To support risk analysis, hazard maps are used to classify land uses according to their pollution potential. The aim of this study is to propose a new hazard index based on nitrogen input and its connection with nitrate concentration in groundwater. The effectiveness of the Nitrogen Input Hazard Index was tested in the Gallocanta Groundwater Basin (Spain), a highly polluted area declared a Nitrate Vulnerable Zone. Agricultural data at the plot scale were used to estimate the nitrogen fertilizer requirement of each crop, and the correlation between nitrogen input and nitrate concentration in groundwater was explored. The resulting hazard map delimits the most hazardous areas, which can be used to implement more accurate nitrate pollution control programs. The index successfully estimated the influence of nitrogen input on groundwater nitrate concentration and proved able to produce hazard maps. The criterion used to create categories was empirically based on the nitrate concentration thresholds established by the EU Nitrates Directive. The Nitrogen Input Hazard Index may be a useful tool to support risk analyses of agricultural activities in vulnerable areas, where nitrate pollution could endanger the human water supply.


Epidemiology of Water-Associated Infectious Diseases

Authors: Kumar, Swatantra; Haikerwal, Amrita; Saxena, Shailendra K.

Abstract: Infection pervasiveness is significantly related to exposure and the rate of transmission, which are influenced by ecological factors such as precipitation, air/water temperature, and seasonal variability. Vibrio cholerae alone is responsible for approximately 1.7 million cases annually, with 525,000 deaths in children below 5 years. Similarly, enteric fever (typhoid) is a severe systemic infection and the foremost water-borne public health infectious disease, with an estimated 26 million cases annually. Giardia intestinalis is the foremost cause of parasitic infection in the USA, with an estimated 1.2 million cases and 3581 reported hospitalizations annually. So far, three species of schistosome have been documented, including Schistosoma haematobium, which causes urogenital disease in sub-Saharan Africa. According to the WHO World Malaria Report 2016, 212 million cases and 429,000 deaths were reported in 2015. Shigellosis is caused by a group of bacteria known as Shigella, with an estimated 500,000 cases annually in the USA. Around 6000 cases of Legionellosis were reported in the USA in 2015.


Exposure, health effects, sensing, and remediation of the emerging PFAS contaminants – Scientific challenges and potential research directions

Authors: Erin M. Bell; Sylvain De Guise; Jeffrey R. McCutcheon; Yu Lei; Milton Levin; Baikun Li; James F. Rusling; David A. Lawrence; Jennifer M. Cavallari; Caitlin O'Connell; Bethany Javidi; Xinyu Wang; Heejeong Ryu


Abstract: Per- and polyfluoroalkyl substances (PFAS) make up a large group of persistent anthropogenic chemicals which are difficult to degrade and/or destroy. PFAS are an emerging class of contaminants, but little is known about the long-term health effects related to exposure. In addition, technologies to identify levels of contamination in the environment and to remediate contaminated sites are currently inadequate. In this opinion-type discussion paper, a team of researchers from the University of Connecticut and the University at Albany discuss the scientific challenges in their specific but intertwined PFAS research areas, including rapid and low-cost detection, energy-saving remediation, the role of T helper cells in immunotoxicity, and the biochemical and molecular effects of PFAS among community residents with measurable PFAS concentrations. Potential research directions that may be employed to address those challenges and improve the understanding of sensing, remediation, exposure to, and health effects of PFAS are then presented. We hope our account of emerging problems related to PFAS contamination will encourage a broad range of scientific experts to bring these research initiatives addressing PFAS into play on a national scale.


Coronavirus in water media: Analysis, fate, disinfection and epidemiological applications

Authors: Antonio Buonerba; Mary Vermi Aizza Corpuz; Florencio Ballesteros; Kwang-Ho Choo; Shadi W. Hasan; Gregory V. Korshin; Vincenzo Belgiorno; Damià Barceló; Vincenzo Naddeo

Abstract: Considerable attention has recently been given to the possible transmission of SARS-CoV-2 via water media. This review addresses this issue and examines the fate of coronaviruses (CoVs) in water systems, with particular attention to the recently available information on the novel SARS-CoV-2. The methods for the determination of viable virus particles and the quantification of CoVs, and in particular of SARS-CoV-2, in water and wastewater are discussed, with particular regard to the methods of concentration and to the emerging methods of detection. The environmental stability of CoVs, particularly SARS-CoV-2, and the efficacy of disinfection methods are also extensively reviewed. This information provides a broad view of the state of the art for researchers involved in the investigation of CoVs in aquatic systems, and lays the basis for further analyses and discussions on the risk associated with the presence of SARS-CoV-2 in water media. The examined data indicate that detection of the virus in wastewater and natural water bodies provides a potentially powerful tool for quantitative microbiological risk assessment (QMRA) and for wastewater-based epidemiology (WBE) to evaluate the level of circulation of the virus in a population. Assays of viable virions in water media provide information on integrity, capability of replication (in suitable host species) and potential infectivity. Challenges and critical issues relevant to the detection of coronaviruses in different water matrices, with both direct and surrogate methods, as well as in the implementation of epidemiological tools, are presented and critically discussed.


Proposal of new health risk assessment method for deficient essential elements in drinking water: case study of the Slovak Republic.

By: Rapant, S.; Cveckova, V.; Hiller, E.; et al.

International Journal of Environmental Research and Public Health, Volume 17, Issue 16, Pages 5915, Published 2020


The US EPA health risk assessment method is currently widely used to assess human health risks from many environmental constituents. It is used for risk assessment from exposure to various contaminants exceeding tolerable or safe reference doses, determined e.g. for drinking water, soil, air and food. It is widely accepted that excess contents of non-essential elements (e.g., As, Pb or Sb) in environmental compartments represent a general risk to human health. However, contrary to toxic trace elements, deficient contents of essential (biogenic) elements, e.g., F, I, Se, Zn, Fe, Ca or Mg, may represent an even higher health risk. Therefore, we propose to extend human health risk assessment by calculating the health risk for deficient content and intake of essential elements, and to introduce the terms Average Daily Missing Dose (ADMD), Average Daily Required Dose (ADRD) and Average Daily Accepted Dose (ADAD). We propose the following equation to calculate the Hazard Quotient (HQ) of health risk from deficient elements: HQd = ADRD/ADAD. At present, the world databases (Integrated Risk Information System, IRIS; The Risk Assessment Information System, RAIS) contain no reference concentrations or doses of essential elements for the individual environmental compartments. ADRD and ADMD can be derived from existing regulatory standards or guidelines, or calculated from actual regional data on the state of population health and the content of essential elements in the environment, e.g., in groundwater or soil. This methodology was elaborated and tested on inhabitants of the Slovak Republic supplied with soft drinking water with an average Mg content of 5.66 mg/L. The calculated ADMD of Mg for these inhabitants is 0.314 mg/kg/day and HQd is equal to 2.94, indicating a medium risk of chronic diseases. This method, extending traditional health risk assessment, is the first attempt to quantify the risk from deficient content of essential elements in drinking water. It still has some limitations but also has the potential to be further developed and refined through testing in other countries. © 2020 by the authors. Licensee MDPI, Basel, Switzerland.
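The deficiency Hazard Quotient lends itself to a short worked example. The sketch below is illustrative only: the decomposition ADRD = ADAD + ADMD and the default exposure assumptions (2 L/day water intake, 70 kg body weight) are not stated in the abstract, but under them the reported figures for the Slovak case study are mutually consistent.

```python
# Illustrative sketch of the proposed deficiency Hazard Quotient (HQd):
#   HQd = ADRD / ADAD
# where ADAD is the Average Daily Accepted Dose (from water actually
# consumed), ADMD the Average Daily Missing Dose (shortfall), and the
# decomposition ADRD = ADAD + ADMD plus the intake rate (2 L/day) and
# body weight (70 kg) are assumptions, not stated in the abstract.

def adad(conc_mg_per_l: float, intake_l_per_day: float = 2.0,
         body_weight_kg: float = 70.0) -> float:
    """Average Daily Accepted Dose in mg/kg/day from a water concentration."""
    return conc_mg_per_l * intake_l_per_day / body_weight_kg

def hq_deficiency(adrd: float, adad_value: float) -> float:
    """Hazard Quotient for a deficient essential element (HQd = ADRD/ADAD)."""
    return adrd / adad_value

# Slovak case study: soft water with 5.66 mg/L Mg, reported ADMD = 0.314
accepted = adad(5.66)          # about 0.162 mg/kg/day
required = accepted + 0.314    # ADRD = ADAD + ADMD (assumed decomposition)
print(round(hq_deficiency(required, accepted), 2))  # 2.94, matching the reported HQd
```

Under these defaults the computed HQd reproduces the paper's value of 2.94, which suggests (but does not confirm) that the authors used similar standard exposure parameters.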



Challenges and Solutions for Sustainable Groundwater Usage: Pollution Control and Integrated Management

By: Syafiuddin, Achmad; Boopathy, Raj; Hadibarata, Tony



Purpose of Review: This paper aims to critically review the current status of groundwater usage from the point of view of pollution control and integrated management. Recent Findings: This paper has shown that sustainable efforts must be encouraged to minimize arsenic inputs from all possible sources before they enter the groundwater system. Excessive nitrate and pesticide use must be significantly reduced for a sustainable environment. Although various in situ remediation technologies can remove some contaminants from groundwater, the future concern is how remediation can be carried out in accordance with environmental sustainability goals, for example through in situ bioremediation and bioelectroremediation, which provide cheaper and greener solutions than physical and chemical approaches. To develop successful integrated management for sustainable groundwater usage in the future, conjunctive water management is recommended, as it involves the joint management of ground and surface water resources to enhance the security of water supply and environmental sustainability. The information presented in this paper is highly useful for managing groundwater, not only from a quality standpoint but also for sustaining its quantity for future development.



Drinking-water nitrate and cancer risk: A systematic review and meta-analysis

By: Essien, Eno E.; Said Abasse, Kassim; Cote, Andre; et al.


Early Access: NOV 2020


Nitrate is an inorganic compound that occurs naturally in all surface water and groundwater, although higher concentrations tend to occur only where fertilizers are applied to the land. The regulatory limit for nitrate in public drinking water supplies was set to protect against infant methemoglobinemia; other health effects were not considered. The risk of specific cancers and congenital disabilities may be increased when ingested nitrate is reduced to nitrite, which can react with amines and amides via nitrosation to form N-nitroso compounds, known animal carcinogens. This study aims to evaluate the association between nitrate ingested through drinking water and the risk of developing cancer in humans.


We performed a systematic review following the PRISMA and MOOSE guidelines. A literature search for potentially eligible publications was performed in PubMed, EMBASE, the Cochrane Library, Web of Science and Google Scholar, from their inception to January 2020. STATA version 12.0 was used to conduct meta-regression and a two-stage meta-analysis.


A total of 48 articles covering 13 different cancer sites were used for the analysis. The meta-regression showed that stomach cancer was associated with the median dosage of nitrate from drinking water (t = 3.98, p = 0.0001, adjusted R-squared = 50.61%); other cancer types showed no association. The first stage of the meta-analysis, comparing the highest odds ratios (ORs) versus the lowest, showed associations with nitrate consumption only for the risk of brain cancer and glioma (OR = 1.15, 95% CI: 1.06, 1.24) and colon cancer (OR = 1.11, 95% CI: 1.04, 1.17). The second stage, comparing all combined higher ORs versus the lowest, showed an association only for the risk of colon cancer (OR = 1.14, 95% CI: 1.04, 1.23).


This study showed that there is an association between the intake of nitrate from drinking water and certain cancers in humans. The most effective way of controlling nitrate concentrations in drinking water is the prevention of contamination (water pollution). Further research on this topic is needed.
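As a side note on method, the confidence intervals reported in meta-analyses like this one can be converted back to log-scale standard errors, the quantity actually pooled across studies. The snippet below is a generic illustration of that standard step, not code from the study; it uses the brain cancer and glioma estimate quoted above (OR 1.15, 95% CI 1.06-1.24) as input.

```python
import math

def se_from_ci(ci_low: float, ci_high: float, z: float = 1.96) -> float:
    """Standard error of the log odds ratio, recovered from a 95% CI.

    Meta-analyses weight each study by 1/SE^2 on the log scale, so this
    conversion is the usual first step when only an OR and its
    confidence interval are reported.
    """
    return (math.log(ci_high) - math.log(ci_low)) / (2 * z)

# Brain cancer & glioma estimate reported above: OR 1.15 (95% CI 1.06-1.24)
se = se_from_ci(1.06, 1.24)
log_or = math.log(1.15)

# Round-tripping the interval from the point estimate and SE should
# approximately reproduce the reported bounds.
low = math.exp(log_or - 1.96 * se)
high = math.exp(log_or + 1.96 * se)
print(round(se, 3), round(low, 2), round(high, 2))  # 0.04 1.06 1.24
```

The round trip only works when the reported CI is symmetric about the point estimate on the log scale, which is the case here to two decimal places.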



Recent US State and Federal Drinking Water Guidelines for Per‐ and Polyfluoroalkyl Substances

By: Post, Gloria B.


Early Access: NOV 2020


Per- and polyfluoroalkyl substances (PFAS), a class of synthetic chemicals produced for over 70 years, are of increasing concern because of their widespread environmental presence, extreme persistence, bioaccumulative nature, and evidence for health effects from environmentally relevant exposures. In 2016, the United States Environmental Protection Agency (USEPA) established nonregulatory drinking water Health Advisories of 70 ng/L for individual and total concentrations of perfluorooctanoic acid (PFOA) and perfluorooctane sulfonate (PFOS), the 8-carbon perfluoroalkyl acids (PFAAs) that are the most thoroughly studied PFAS. As of May 2020, 9 US states had concluded that the USEPA Health Advisories are insufficiently protective and developed more stringent PFOA and PFOS guidelines. In addition, 10 states had developed guidelines for other PFAS, primarily PFAAs. This Critical Review discusses the scientific basis for state and USEPA drinking water guidelines for PFOA and PFOS; the same principles apply to guidelines for other PFAS. Similarities and differences among guidelines arise from both toxicity and exposure considerations. The approximately 4-fold range among state guidelines (8-35 ng/L for PFOA, 10-40 ng/L for PFOS) is not large or unexpected for guidelines developed by different scientists at different time points, especially when compared with older USEPA and state guidelines that were generally several orders of magnitude higher. Additional state guidelines for PFOA, PFOS, and other PFAS are expected to become available. Environ Toxicol Chem 2020;00:1-14. (c) 2020 SETAC


Trust in Drinking Water Quality: Understanding the Role of Risk Perception and Transparency

By: Brouwer, Stijn; Hofman-Caris, Roberta; van Aalderen, Nicolien

WATER, Volume 12, Issue 9, Article Number 2608, Published SEP 2020


In the context of an increasing societal demand for transparency in parallel with rapidly increasing numbers and concentrations of substances found in drinking water, this paper investigates how different drinking water customers perceive their tap water quality, and possible risks involved. Empirically, the paper draws on results from a representative survey, a series of interviews and focus groups conducted in the Netherlands, applying both a traditional and modern segmentation approach based on four types of perspectives (“aware and committed”, “down to earth and confident”, “egalitarian and solidary”, and “quality and health concerned”). Although in general it was found that people’s trust in tap water is high, certain groups are more concerned about water quality and health effects than others. It was shown that transparency and the availability of more information about water treatment and quality would contribute to increasing customer trust. It was also observed that, at least in the Netherlands, people have a larger trust in drinking water companies than in other institutions. Therefore, instead of referring to standards made by other institutions, it is recommended that water companies themselves provide information on water quality and emphasize their treatment procedures.


Unveiling complex responses at the molecular level: Transcriptional alterations by mixtures of bisphenol A, octocrylene, and 2′-ethylhexyl 4-(dimethylamino)benzoate on Chironomus riparius


Living organisms in the wild are exposed to mixtures of pollutants. Inland aquatic ecosystems contain many compounds from different sources that pollute the water column and the sediment. However, the majority of toxicological research focuses on the effects of single exposures to toxicants. Furthermore, studies have been principally oriented toward ecologically relevant effects of intoxication and lack an analysis of the cellular and molecular mechanisms involved in the response to toxicants. The effects of single, binary, and ternary mixtures of three compounds, bisphenol A, octocrylene, and 2′-ethylhexyl 4-(dimethylamino)benzoate, were assessed using a real-time PCR array. Forty genes, plus six reference genes, were included in the array. The genes were selected based on their association with hormone responses, detoxification mechanisms, the stress response, DNA repair, and the immune system. The study was performed on Chironomus riparius, a benthic dipteran with an essential role in the food web. Transcriptional responses were assessed both 24 and 96 h post-exposure, to determine short- and medium-term cellular responses. Individual fourth-instar larvae were exposed to 0.1 and 1 mg/L of each of the toxic compounds and compound mixtures. A weak response was detected at 24 h, which was stronger in larvae exposed to mixtures than to individual toxicants. The response at 96 h was complex and principally involved genes related to the endocrine system, detoxification mechanisms, and the stress response. Furthermore, exposure to mixtures altered the expression patterns of more genes than did exposure to individual compounds, suggesting that complex interactions between compounds affected the regulation of transcriptional activity. These results highlight the importance of analyzing the mechanisms involved in the response to mixtures of compounds over extended periods and offer new insights into the basis of the physiological responses to pollution.

Keywords: Aquatic insect; Chironomids; Invertebrates; Mixture toxicity; Multi-stress; Transcriptional alterations.

Unravelling the composition of tap and mineral water microbiota: Divergences between next-generation sequencing techniques and culture-based methods


The complex and highly diverse microbial environment of drinking water, consisting mainly of bacteria at different metabolic states, is still underexplored. The aim of this work was to characterize the bacterial communities in tap water and bottled mineral water, the two predominant sources of drinking water in modern societies. A total of 11 tap water samples from a range of locations and distribution networks and 10 brands of bottled natural mineral water were analysed using two approaches: a) heterotrophic plate counts by matrix-assisted laser desorption/ionization time of flight mass-spectrometry (MALDI-TOF MS) for the culturable heterotrophic communities, and b) Illumina amplicon sequencing for total bacteria including non-culturable bacteria. Culturable heterotrophic bacteria were isolated in WPCA (ISO) agar at 22 ± 2 °C for 72 h and 2046 isolates were identified using MALDI-TOF MS. The Bruker Daltonics Library and a previously customized library (Drinking Water Library) were used as reference databases. For the total bacteria fraction, DNA was extracted from 6 L of water and submitted to Illumina 16S rRNA sequencing of the v4 region. Significant differences were observed between mineral and tap water, with a general dominance of Alphaproteobacteria (mainly the genus Blastomonas) in tap water and Gammaproteobacteria in mineral water with Acidovorax being the dominant genus in 3 out of 7 mineral water brands. The bacterial communities in the different brands of mineral water were highly diverse and characteristic of each one. Moreover, the season in which the water was bottled also affected the species distribution, with some of them identified in only one season. Among the culturable bacteria, the most abundant phylum was Proteobacteria (around 85% of the isolates), followed by Actinobacteria, Firmicutes and Bacteroidetes. Proteobacteria was also the most abundant phylum detected with Illumina sequencing (>99% of the reads). 
The two methods gave distinct results at the different taxonomic levels and could therefore have a complementary application in the study of microbiota in mineral water environments. MALDI-TOF MS is a promising method for the rapid identification of heterotrophic bacteria in routine water analysis in the bottling industry. SIGNIFICANCE AND IMPACT OF THE STUDY: The complementarity of MALDI-TOF MS and NGS in the assessment of bacterial community diversity has been demonstrated in water intended for human consumption. Both methods are suitable for routine use in the water industry for water quality management.

Keywords: 16S rRNA sequencing; Drinking water; MALDI-TOF mass spectrometry; Microbiota; Mineral water; Tap water.


Emerging contaminants affect the microbiome of water systems—strategies for their mitigation

By: Gomes, Ines B.; Maillard, Jean-Yves; Simoes, Lucia C.; et al.

NPJ CLEAN WATER, Volume 3, Issue 1, Article Number 39, Published SEP 18 2020

The presence of emerging contaminants (ECs) in the environment has been consistently recognized as a worldwide concern. ECs may be defined as chemicals or materials found in the environment at trace concentrations with potential, perceived, or real risk to the "One Health" trilogy (environment, human, and animal health). The main concern regarding pharmaceuticals, and antibiotics in particular, is the widespread dissemination of antimicrobial resistance. Nevertheless, non-antimicrobials also interact with microorganisms, both in the bulk phase and in biofilms. In fact, drugs not developed for antimicrobial chemotherapy can exert an antimicrobial action and, therefore, a selective pressure on microorganisms. This review aims to answer questions typically ignored in epidemiological and environmental monitoring studies, with a focus on water systems, particularly drinking water (DW): Does EC exposure change the behavior of environmental microorganisms? Can non-antibiotic ECs affect tolerance to antimicrobials? Do ECs interfere with biofilm function? Are EC-induced changes in microbial behavior of public health concern? At present, the answers to these questions are still very limited. However, this study demonstrates that some ECs have significant effects on microbial behavior. The most studied ECs are pharmaceuticals, particularly antibiotics, carbamazepine and diclofenac. The pressure exerted by antibiotics and other antimicrobial agents on the acquisition and spread of antibiotic resistance seems unquestionable. However, regarding the effects of ECs on the development and behavior of biofilms, the conclusions of different studies remain controversial. These dissimilar findings suggest that standardized tests are needed for an accurate assessment of the effects of ECs on the microbiome of water systems. The variability of experimental conditions, combined with the presence of mixtures of ECs and the lack of information about the effects of non-pharmaceutical ECs, constitutes the main challenge to be overcome in order to improve EC prioritization.


Children drinking private well water have higher blood lead than those with city water


Although the Flint, Michigan, water crisis renewed concerns about lead (Pb) in city drinking water, little attention has been paid to Pb in private wells, which provide drinking water for 13% of the US population. This study evaluates the risk of Pb exposure in children in households relying on private wells. It is based on a curated dataset of blood Pb records from 59,483 North Carolina children matched with household water source information. We analyze the dataset for statistical associations between children’s blood Pb and household drinking water source. The analysis shows that children in homes relying on private wells have 25% increased odds (95% CI 6.2 to 48%, P < 0.01) of elevated blood Pb, compared with children in houses served by a community water system that is regulated under the Safe Drinking Water Act. This increased Pb exposure is likely a result of corrosion of household plumbing and well components, because homes relying on private wells rarely treat their water to prevent corrosion. In contrast, corrosion control is required in regulated community water systems. These findings highlight the need for targeted outreach to prevent Pb exposure for the 42.5 million Americans depending on private wells for their drinking water.

Keywords: blood lead; children’s health; drinking water; lead exposure; private well.


Drinking Water in the United States: Implications of Water Safety, Access, and Consumption


Recent water quality crises in the United States, and recognition of the health importance of drinking water in lieu of sugar-sweetened beverages, have raised interest in water safety, access, and consumption. This review uses a socioecological lens to examine these topics across the life course. We review water intakes in the United States relative to requirements, including variation by age and race/ethnicity. We describe US regulations that seek to ensure that drinking water is safe to consume for most Americans and discuss strategies to reduce drinking water exposure to lead, a high-profile regulated drinking water contaminant. We discuss programs, policies, and environmental interventions that foster effective drinking water access, a concept that encompasses key elements needed to improve water intake. We conclude with recommendations for research, policies, regulations, and practices needed to ensure optimal water intake by all in the United States and elsewhere.

Keywords: access; consumption; drinking water; life course; policy; water quality.


Genotoxicity of source, treated and distributed water from four drinking water treatment plants supplied by surface water in Sardinia, Italy



High levels of disinfection by-products (DBPs) are constantly found in the drinking water distributed in Sardinia, an Italian island with a tourist vocation and critical issues related to the drinking water supply. To reduce the concentration of trihalomethanes, the disinfectant was changed from hypochlorite to chlorine dioxide. However, this caused the appearance of other DBPs (e.g., chlorites) in the water distributed to the population. Thus, the use of monochloramine as a secondary disinfectant (with chlorine dioxide as the primary disinfectant) was evaluated in four drinking water treatment plants supplied by artificial basins located in the central-northern part of Sardinia. Raw, disinfected and distributed waters were tested for genotoxicity using a battery of in vitro assays on different cells (bacterial, plant and mammalian cells) to detect different genetic endpoints (i.e., point and chromosome mutations and DNA damage). Moreover, a chemical and microbiological characterisation of the water samples was performed. All samples of water distributed to the population showed mutagenic or genotoxic effects in different cells/organisms. In particular, chromosome aberrations in plant cells and DNA damage in human cells were observed. In this study, the use of chloramines together with other disinfectants did not eliminate the mutagenicity present in the raw water, and when the raw water was not mutagenic, it introduced mutagenic/genotoxic substances. Careful management of drinking water is needed to reduce the health hazards associated with the mutagenicity of drinking water.


Detecting community response to water quality violations using bottled water sales

Maura Allaire, Taylor Mackay, Shuyan Zheng, Upmanu Lall


Drinking-water contaminants pose a risk to public health. When confronted with elevated levels of contaminants, individuals can take actions to reduce exposure. Yet, few studies address averting behavior due to impaired water, particularly in high-income countries. This is a problem of national interest, given that 9 million to 45 million people have been affected by water quality violations in each of the past 34 years. No national analysis has focused on the extent to which communities reduce exposure to contaminated drinking water. Here, we present an assessment that sheds light on how communities across the United States respond to violations of the Safe Drinking Water Act, using consumer purchases of bottled water. This study provides insight into how averting behavior differs across violation types and community demographics. We estimate the change in sales due to water quality violations, using a panel dataset of weekly sales and violation records in 2,151 counties from 2006 to 2015. Critical findings show that violations which pose an immediate health risk are associated with a 14% increase in bottled water sales. Generally, greater averting action is taken against contaminants that might pose a greater perceived health risk and that require more immediate public notification. Rural, low-income communities do not take significant averting action for elevated levels of nitrate, yet experience a higher prevalence of nitrate violations. Findings can inform improvements in public notification and targeting of technical assistance from state regulators and public health agencies in order to reduce community exposure to contaminants.


(Re)theorizing the Politics of Bottled Water: Water Insecurity in the Context of Weak Regulatory Regimes

Raul Pacheco-Vega – Public Administration Division, Centro de Investigación y Docencia Económicas (CIDE), Sede Región Centro, Aguascalientes 20313 Ciudad de México 01210, Mexico

DOI: 10.3390/w11040658


Water insecurity in developing country contexts has frequently led individuals and entire communities to shift their consumptive patterns towards bottled water. Bottled water is sometimes touted as a mechanism to enact the human right to water through distribution across drought-stricken or infrastructure-compromised communities. However, the global bottled water industry is a multi-billion dollar major business. How did we reach a point where the commodification of a human right became not only commonly accepted but even promoted? In this paper, I argue that a discussion of the politics of bottled water necessitates a re-theorization of what constitutes “the political” and how politics affects policy decisions regarding the governance of bottled water. In this article I examine bottled water as a political phenomenon that occurs not in a vacuum but in a poorly regulated context. I explore the role of weakened regulatory regimes and regulatory capture in the emergence, consolidation and, ultimately, supremacy of bottled water over network-distributed, delivered-by-a-public utility tap water. My argument uses a combined framework that interweaves notions of “the political”, ideas on regulatory capture, the concept of “the public”, branding, and regulation theory to retheorize how we conceptualize the politics of bottled water. © 2019 by the authors. Licensee MDPI, Basel, Switzerland.

Historical and Future Needs for Geospatial Iodide Occurrence in Surface and Groundwaters of the United States of America

Authors: Sharma, N; Karanfil, T; Westerhoff, P
ENVIRONMENTAL SCIENCE & TECHNOLOGY LETTERS

DOI: 10.1021/acs.estlett.9b00278


While iodide (I-) is critical for biological systems, it can serve as a precursor to organic iodinated disinfection byproducts (I-DBPs) of human health concern during water treatment. Thus, understanding potential I- occurrence in fresh waters is critical. Although I- occurrence data are sparse for surface water (SW) or groundwater (GW) used for drinking water supplies, data exist for other locations. We analyzed historical I- occurrence for ~9200 SW and GW sampling locations in the United States to understand potential I- sources as well as spatial and temporal variability. I- ranged from below detection limits (<1 μg/L) to 95th percentile concentrations of 320 and 1300 μg/L (median = 12 and 13 μg/L), respectively, in SW and GW. I- appears to be influenced by halite basins, organic-rich shale/oil formations, saltwater intrusion, and rainfall, with median Br-/I- mass ratios of 10 and 17 μg/μg in SW and GW, respectively. Our results demonstrated considerable variability in iodine sources and speciation, which can impact I-DBP formation at water treatment plants (WTPs). We advocate for occurrence studies that measure I-, IO3-, and total iodine in raw and finished drinking waters to fill critical data gaps necessary to understand the potential formation of I-DBPs that impact public health.

Occurrence, Concentrations, and Risks of Pharmaceutical Compounds in Private Wells in Central Pennsylvania

Authors: Kibuye, FA; Gall, HE; Elkin, KR; Swistock, B; Veith, TL; Watson, JE; Elliott, HA
JOURNAL OF ENVIRONMENTAL QUALITY

DOI: 10.2134/jeq2018.08.0301


Over-the-counter and prescription medications are routinely present at detectable levels in surface and groundwater bodies. The presence of these emerging contaminants has raised both environmental and public health concerns, particularly when the water is used for drinking either directly or with additional treatment. However, the frequency of occurrence, range of concentrations, and potential human health risks are not well understood, especially for groundwater supplies. Private wells are often not tested for contaminants regulated by drinking water standards and are even less frequently tested for emerging contaminants. By partnering with the Pennsylvania Master Well Owner Network, water samples were collected from 26 households with private wells in the West Branch of the Susquehanna River basin in central Pennsylvania in winter 2017. All samples were analyzed for six pharmaceuticals (acetaminophen, ampicillin, naproxen, ofloxacin, sulfamethoxazole, and trimethoprim) and one over-the-counter stimulant (caffeine). At least one compound was detected at each site. Ofloxacin and naproxen were the most and least frequently detected compounds, respectively. Concentrations from the groundwater wells were higher than those of nearby surface water samples. However, risk calculations revealed that none of the concentrations measured in groundwater samples posed significant human health risk. A simple, physicochemical-based modeling approach was used to predict pharmaceutical transport from septic absorption field to groundwater and further elucidate variations in detection frequencies. Findings indicate that although septic tanks may act as contaminant sources for groundwater wells, the human health impacts from trace-level pharmaceuticals that may be present are likely minimal.

Exploring the Efficacy of Nile Red in Microplastic Quantification: A Costaining Approach.

Stanton, T., Johnson, M., Nathanail, P., Gomes, R.L., Needham, T., Burson, A., 2019.

ABSTRACT: The presence of microplastic particles (<5 mm) in the environment has generated considerable concern across public, political, and scientific platforms. However, the diversity of microplastics that persist in the environment poses complex analytical challenges for our understanding of their prevalence. The use of the dye Nile red to quantify microplastics is increasingly common. However, its use in microplastic analysis rarely accounts for its affinity with the breadth of particles that occur in environmental samples. Here, we examine Nile red’s ability to stain a variety of microplastic particles and common natural and anthropogenic particles found in environmental samples. To better constrain microplastic estimates using Nile red, we test the coapplication of a second stain that binds to biological material, 4′,6-diamidino-2-phenylindole (DAPI). We test the potential inflation of microplastic estimates using Nile red alone by applying this costaining approach to samples of drinking water and freshwater. The use of Nile red dye alone resulted in a maximum 100% overestimation of microplastic particles. These findings are of particular significance for the public dissemination of findings from an emotive field of study.


Assessing the threats of organophosphate esters (flame retardants and plasticizers) to drinking water safety based on USEPA oral reference dose (RfD) and oral cancer slope factor (SFO)



As one group of emerging pollutants, the threat of organophosphate esters (flame retardants and plasticizers, OPEs) to drinking water safety is not well recognized. Now that the oral reference dose (RfD) and oral cancer slope factor (SFO) of OPEs have been updated by the USEPA, the threat of OPEs to drinking water safety can be assessed. In this study, the occurrence, health risks and key impact factors of OPEs in drinking water of China were analyzed across 79 cities, which together accounted for 28.8% of China's population and 44.1% of its gross domestic product (GDP). The total concentration of 14 common OPEs in drinking water was 13.42–265.48 ng/L. The exposure level of OPEs via ingestion of drinking water was much lower than that via food ingestion but was comparable with dust ingestion, inhalation and dermal absorption. A health assessment for OPEs via ingestion of drinking water suggested a potential cancer risk (>1.00E-6) but no obvious non-carcinogenic effects (hazard quotient <1). Tris-(2,3-dibromopropyl) phosphate (TDBPP) contributed about 72.4% of the carcinogenic risk and should be treated as a "priority monitoring OPE" in further studies. The occurrence and distribution of OPEs in drinking water of China correspond well with the Aihui-Tengchong Line, and drinking water treatment technology (DWTT) was found to be a key factor. Total OPEs, halogeno-OPEs and alkyl-OPEs in drinking water from cities with advanced DWTT were much lower than in cities with conventional DWTT. Compared with conventional DWTT, advanced DWTT could reduce the carcinogenic and non-carcinogenic risks of OPEs by about 65.6% and 36.5%, respectively. Considering the annual growth of OPEs consumption in China and worldwide, further studies on the environmental threat of OPEs are required.
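The screening arithmetic behind this kind of assessment follows the standard USEPA exposure equations. A minimal sketch, with entirely hypothetical parameter values (the concentration, SFO and RfD below are placeholders, not values from the study):

```python
# Sketch of USEPA-style screening risk arithmetic for a contaminant ingested
# via drinking water. All input values are hypothetical, not from the study.

def chronic_daily_intake(conc_ng_per_l, intake_l_per_day=2.0,
                         body_weight_kg=70.0, exposure_years=70.0,
                         averaging_years=70.0):
    """Chronic daily intake (CDI) in mg/(kg*day)."""
    conc_mg_per_l = conc_ng_per_l * 1e-6              # ng/L -> mg/L
    return (conc_mg_per_l * intake_l_per_day * exposure_years) / (
        body_weight_kg * averaging_years)

def cancer_risk(cdi, sfo):
    """Incremental lifetime cancer risk = CDI x oral slope factor."""
    return cdi * sfo

def hazard_quotient(cdi, rfd):
    """Non-carcinogenic hazard quotient = CDI / oral reference dose."""
    return cdi / rfd

cdi = chronic_daily_intake(conc_ng_per_l=100.0)       # hypothetical 100 ng/L
print(cancer_risk(cdi, sfo=2.3) > 1e-6)               # vs the 1E-6 benchmark
print(hazard_quotient(cdi, rfd=0.01) < 1)             # vs HQ threshold of 1
```

With these illustrative inputs the cancer risk exceeds the 1E-6 benchmark while the hazard quotient stays below 1, mirroring the pattern the abstract reports.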


Presence of antibiotics in the aquatic environment in Europe and their analytical monitoring: Recent trends and perspectives

Urszula Szymańska, Marek Wiergowski, Ireneusz Sołtyszewski, Jarosław Kuzemko, Gabriela Wiergowska, Mateusz Kacper Woźniak


The presence of antibiotics and their metabolites in the aquatic environment exerts a negative impact on all organisms. Moreover, the easy migration of these substances to drinking water may have serious consequences for public health, such as drug resistance. Although antibiotics and their metabolites are detected in surface waters and wastewater, there are still no systemic solutions preventing environmental pollution with these substances. It is therefore necessary to determine critical concentrations of antibiotics, which will make it possible to maintain safe levels that do not exert a negative impact on the natural environment and human health. The procedure for quantification of antibiotics usually involves solid-phase extraction (SPE) followed by instrumental analysis, typically liquid chromatography coupled with tandem mass spectrometry (LC–MS/MS), which provides sensitivity, selectivity and reliability of results. This work presents the current state of knowledge, based on data from 2009 to 2018 (a review of ten years of scientific papers), on the presence of antibiotics and their metabolites in the aquatic environment in Poland and the rest of Europe, and on the methods used for the determination of antibiotics in different types of water (surface water and wastewater). The main strategies used for the removal of antibiotics during wastewater treatment, in the context of antibiotic concentrations, are also presented.


Prenatal exposure to PFOS and PFOA in a pregnant women cohort of Catalonia, Spain

By: Rovira, Joaquim; Martinez, Maria Angeles; Sharma, Raju Prasad; Espuis, Teresa; Nadal, Marti; Kumar, Vikas; Costopoulou, Danae; Vassiliadou, Irene; Leondiadis, Leondios; Domingo, Jose L.; et al.


Volume: 175

Pages: 384-392

DOI: 10.1016/j.envres.2019.05.040

Published: AUG 2019

Document Type:Article


This study was aimed at assessing the prenatal exposure to perfluorooctane sulfonic acid (PFOS) and perfluorooctanoic acid (PFOA) in a cohort of pregnant women living in Reus (Tarragona County, Catalonia, Spain). These chemicals were biomonitored in maternal plasma during the first trimester of pregnancy, at delivery, and in cord blood. The dietary exposure to PFOS and PFOA was estimated by using questionnaires of food frequency and water intake, as well as data on food levels previously reported in the same area. In addition, the exposure through air inhalation and indoor dust ingestion was also calculated. Finally, a physiologically-based pharmacokinetic (PBPK) model was applied in order to establish the prenatal exposure of the fetus/child and to adjust exposure assessment vs. biomonitoring results. Probabilistic calculations of fetal exposure were performed by forward internal dosimetry and Monte-Carlo simulation. Mean plasma levels of PFOA were 0.45, 0.13 and 0.12 ng/mL at the first trimester, at delivery and in cord plasma, while those of PFOS were 2.93, 2.21, and 1.17 ng/mL, respectively. Traces of PFOS were found in all samples at the first trimester and at delivery, and in almost all cord blood samples. Transplacental transfers of PFOS and PFOA were estimated to be around 70% and 60%, respectively. A decreasing temporal trend in plasma levels of PFOS and PFOA was noticed when comparing current values with data obtained 10 years ago in the same area. In agreement with many other studies, dietary intake was the main route of exposure to PFOS and PFOA in our cohort of pregnant women. Establishing exposure to perfluoroalkylated substances, and to other endocrine disrupting chemicals, during critical windows such as fetal development is an important issue.

A Novel Method to Characterise Levels of Pharmaceutical Pollution in Large-Scale Aquatic Monitoring Campaigns

By: Wilkinson, John L.; Boxall, Alistair B. A.; Kolpin, Dana W.



Volume: 9

Issue: 7

Article Number: 1368

DOI: 10.3390/app9071368

Published: APR 1 2019

Document Type:Article


Much of the current understanding of pharmaceutical pollution in the aquatic environment is based on research conducted in Europe, North America and other select high-income nations. One reason for this geographic disparity of data is the high cost and analytical intensity of the research, which limits access to the necessary equipment. To reduce the impact of such disparities, we present a novel method to support large-scale monitoring campaigns of pharmaceuticals at different geographical scales. The approach employs a miniaturised sampling and shipping protocol with a high-throughput, fully validated direct-injection High-Performance Liquid Chromatography-Tandem Mass Spectrometry method for the quantification of 61 active pharmaceutical ingredients (APIs) and their metabolites in tap water, surface water, wastewater treatment plant (WWTP) influent and WWTP effluent collected globally. A 7-day simulated shipping and sample stability assessment demonstrated no significant degradation over the 1-3 days typical for global express shipping. Linearity (r²) was consistently >0.93 (median = 0.99 +/- 0.02); the relative standard deviation of intra- and inter-day repeatability and precision was <20% for 75% and 68% of the determinations made at three concentrations, respectively; and recovery from Liquid Chromatography Mass Spectrometry grade water, tap water, surface water and WWTP effluent was within an acceptable range of 60-130% for 87%, 76%, 77% and 63% of determinations made at three concentrations, respectively. Limits of detection and quantification were determined in all validated matrices and were consistently at the ng/L level needed for environmentally relevant API research. Independent validation of method results was obtained via an interlaboratory comparison of three surface-water samples and one WWTP effluent sample collected in North Liberty, Iowa (USA). Samples used for the interlaboratory validation were analysed at the University of York Centre of Excellence in Mass Spectrometry (York, UK) and the U.S. Geological Survey National Water Quality Laboratory in Denver (Colorado, USA). These results document the robustness of using this method on a global scale. Such application of this method would essentially eliminate the interlaboratory analytical variability typical of large-scale datasets where multiple methods are used.
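The validation criteria quoted above (recovery within 60-130%, RSD below 20%, ng/L-level limits) rest on standard method-validation arithmetic. A minimal sketch with illustrative numbers rather than the paper's data:

```python
import statistics

# Sketch of the reported validation metrics: spike recovery, relative standard
# deviation (RSD) of replicate measurements, and LOD/LOQ from calibration
# statistics (3.3*sigma/slope and 10*sigma/slope). All numbers are illustrative.

def recovery_percent(measured, spiked):
    """Recovery of a spiked concentration, in percent."""
    return 100.0 * measured / spiked

def rsd_percent(values):
    """Relative standard deviation of replicate measurements, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def lod_loq(residual_sd, slope):
    """Detection and quantification limits from a calibration curve."""
    return 3.3 * residual_sd / slope, 10.0 * residual_sd / slope

print(60 <= recovery_percent(measured=86.0, spiked=100.0) <= 130)
print(rsd_percent([101.0, 95.0, 103.0, 98.0]) < 20)   # precision criterion
print(lod_loq(residual_sd=0.8, slope=0.5))            # ng/L-scale limits
```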



Dietary intake, drinking water ingestion and plasma perfluoroalkyl substances concentration in reproductive aged Chinese women

By: Zhou, Wei; Zhao, Shasha; Tong, Chuanliang; Chen, Lin; Yu, Xiaodan; Yuan, Tao; Aimuzi, Ruxianguli; Luo, Fei; Tian, Ying; Zhang, Jun

Group Author(s): Shanghai Birth Cohort Study



Volume: 127

Pages: 487-494

DOI: 10.1016/j.envint.2019.03.075

Published: JUN 2019

Document Type:Article


Background: Perfluoroalkyl and polyfluoroalkyl substances (PFAS) are a group of synthetic chemicals that are widely used in industrial and consumer products. A growing body of literature suggests that exposure to these chemicals is associated with adverse reproductive outcomes in women. However, the sources of PFAS exposure are often poorly characterized in women of child-bearing age.


Exposure to Contaminants Among Private Well Users in North Carolina: Enhancing the Role of Public Health

Crystal Lee Pow Jackson, PhD, North Carolina Department of Health and Human Services

Max Zarate-Bermudez, MSc, MPH, PhD, Centers for Disease Control and Prevention


North Carolina has the second highest number of residents in the U.S. who rely on private wells for their drinking water supply. Studies report that about 3.3 million North Carolina residents (35% of the population) use private wells; in the county with the highest reliance, 85.4% of residents use them. Unlike public water systems, which benefit from the regulatory safeguards of the Safe Drinking Water Act, private wells are subject to no federal regulations in the U.S. Testing, treating, maintaining, and managing private wells are left to well owners, often with little to no technical or financial support.

In 2015, the Private Well and Health Program (PWHP) of the North Carolina Department of Health and Human Services received funding from the Centers for Disease Control and Prevention’s Safe Water for Community Health (Safe WATCH) Program to enhance services to private well users. PWHP was understaffed, had limited access to water quality data, and lacked established partnerships, which prevented it from enhancing services for private well users to better protect their health.

This month’s column highlights how PWHP used the funding to address these vulnerabilities and describes its initiatives to close the gaps in ensuring safe drinking water for private well users.



Legionella growth potential of drinking water produced by a reverse osmosis pilot plant

K.L.G. Learbuch, M.C. Lut, G. Liu, H. Smidt, P.W.J.J. van der Wielen


Treatment processes, such as membrane filtration with reverse osmosis (RO), are used to produce drinking water with a high degree of biostability. To our knowledge, the influence of RO water on biofilm formation and growth of L. pneumophila has not yet been investigated. Therefore, this study aimed (i) to determine the Legionella growth potential of (remineralised) RO water produced by a pilot plant and compare it to that of conventionally treated groundwater, and (ii) to determine whether different pipe materials, in contact with remineralised RO water, can support growth of L. pneumophila. The Legionella growth potential of the water was determined with the boiler biofilm monitor (BBM), which mimics the flow of water in a premise plumbing system. The Legionella growth potential of materials in contact with remineralised RO water was determined using the biomass production potential (BPP) test. ATP concentrations in the biofilm on the glass rings of the BBM fed with (remineralised) RO water fluctuated around 100 pg ATP cm−2. In contrast, BBMs fed with conventionally treated water yielded ten-fold higher ATP concentrations in the biofilm. Moreover, conventionally treated water had a Legionella growth potential 1000-fold higher than that of (remineralised) RO water. Furthermore, glass, copper and PVC-C had the lowest biofilm concentrations and Legionella growth potential in the BPP test, followed by PE-Xb, PE-Xc and PE-100; the highest biofilm concentration and Legionella growth potential were observed with PVC-P. Hence, our study demonstrated that remineralised RO water did not enhance growth of L. pneumophila in the BBM mimicking the premise plumbing system. However, when PE or PVC-P materials are used, growth of L. pneumophila can still occur in the premise plumbing system despite the high quality of the supplied remineralised RO water.


Non-tuberculous mycobacteria in drinking water systems: A review of prevalence data and control means

By: Loret, Jean-Francois; Dumoutier, Nadine


Volume: 222

Issue: 4

Pages: 628-634

Special Issue: SI

DOI: 10.1016/j.ijheh.2019.01.002

Published: MAY 2019

Document Type:Review


Non-tuberculous species of Mycobacterium are commonly found in a large diversity of water environments, and epidemiological studies suggest that natural or drinking waters are the principal sources of human contamination. Controlling non-tuberculous mycobacteria in water systems is therefore important to prevent infection with these micro-organisms. This review article summarizes the information and data published to date on the factors favoring the presence of these bacteria in natural and artificial water systems and on the effectiveness of water treatment, and, based on this information, identifies possible means of controlling the presence of non-tuberculous mycobacteria in drinking water.


Inactivation of Adenovirus in Water by Natural and Synthetic Compounds


  • Lucas Ariel Totaro Garcia
  • Laurita Boff
  • Célia Regina Monte Barardi
  • Markus Nagl
Original Paper


Millions of people use contaminated water sources for direct consumption. Chlorine is the most widely used disinfectant but can produce toxic by-products. In this context, natural and synthetic compounds can be an alternative for water disinfection. Therefore, the aim of this study was to assess the inactivation of human adenovirus in water by N-chlorotaurine (NCT), bromamine-T (BAT) and grape seed extract (GSE). Distilled water artificially contaminated with recombinant human adenovirus type 5 (rAdV-GFP) was treated with different concentrations of each compound for up to 120 min, and viral infectivity was assessed by fluorescence microscopy. The decrease in activity of the compounds in the presence of organic matter was evaluated in water supplemented with peptone. NCT and GSE inactivated approximately 2.5 log10 of adenovirus after 120 min. With BAT, a decrease of more than 4.0 log10 was observed within 10 min. The oxidative activity of 1% BAT decreased by 50% in 0.5% peptone within a few minutes, while the reduction was only 30% for 1% NCT in 5% peptone after 60 min. Organic matter had no effect on the activity of GSE. Moreover, the minimal concentration of BAT and GSE needed to inactivate viruses was lower than that known to kill human cells. It was concluded that the three compounds have potential for use in water disinfection for drinking or reuse purposes.


QMRA of adenovirus in drinking water at a drinking water treatment plant using UV and chlorine dioxide disinfection


By: Schijven, Jack; Teunis, Peter; Suylen, Trudy; Ketelaars, Henk; Hornstra, Luc; Rutjes, Saskia


Volume: 158

Pages: 34-45

DOI: 10.1016/j.watres.2019.03.090

Published: JUL 1 2019

Document Type:Article


According to the Dutch Drinking Water Act of 2011, Dutch drinking water suppliers must conduct a Quantitative Microbial Risk Assessment (QMRA) for infection by the following index pathogens: enterovirus, Campylobacter, Cryptosporidium and Giardia, at least once every four years, in order to assess the microbial safety of drinking water. The health-based target for safe drinking water is set at less than one infection per 10 000 persons per year. At Evides Water Company, concern has arisen whether their drinking water treatment, based mainly on UV inactivation and chlorine dioxide, reduces levels of adenovirus (AdV) sufficiently. The main objective was, therefore, to conduct a QMRA for AdV. Estimates of the AdV concentrations in source water were based on enumeration of total AdV by integrated cell culture PCR (iccPCR), most probable number PCR (mpnPCR) and quantitative PCR (qPCR), and on enumeration of AdV40/41 by mpnPCR and qPCR. AdV40/41 represents a large fraction of total AdV, and only a small fraction of AdV is infectious (1/1700). Comparison of literature data and plant-scale data indicated that somatic coliphages are a good, conservative indicator for AdV disinfection by UV irradiation. Similarly, bacteriophage MS2 appeared to be a good, conservative indicator for disinfection by chlorine dioxide. Literature data on the efficiency of chlorine dioxide disinfection were fitted with the extended HOM model. Chlorine dioxide disinfection at low initial concentrations (0.05-0.1 mg/l) was found to be the major treatment step, providing sufficient treatment on its own for compliance with the health-based target. UV disinfection of AdV at 40 mJ/cm² or 73 mJ/cm² was insufficient without chlorine dioxide disinfection.
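At its core, a QMRA of this kind combines treatment log reductions with a dose-response model and compares the annualised risk to the one-in-10,000 target. A minimal sketch; only the 1/1700 infectious fraction and the 10^-4 annual target are taken from the abstract, while the source concentration, log reductions, consumption rate and dose-response parameter are hypothetical:

```python
import math

# Sketch of a QMRA screening calculation with illustrative numbers.
# Only the 1/1700 infectious fraction and the 1-in-10,000 annual target
# come from the abstract; every other value here is hypothetical.

def treated_concentration(source_conc_per_l, log_reductions):
    """Pathogen concentration after treatment steps with given log10 reductions."""
    return source_conc_per_l / 10 ** sum(log_reductions)

def annual_infection_risk(conc_per_l, consumption_l_per_day=0.2,
                          r=0.4, infectious_fraction=1 / 1700):
    """Exponential dose-response model, annualised over 365 days."""
    dose = conc_per_l * consumption_l_per_day * infectious_fraction
    p_day = 1 - math.exp(-r * dose)
    return 1 - (1 - p_day) ** 365

risk = annual_infection_risk(
    treated_concentration(source_conc_per_l=10.0, log_reductions=[2.0, 3.5]))
print(risk < 1e-4)   # compare against the Dutch health-based target
```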


Aquatic risks from human pharmaceuticals-modelling temporal trends of carbamazepine and ciprofloxacin at the global scale

By: Oldenkamp, Rik; Beusen, Arthur H. W.; Huijbregts, Mark A. J.



Volume: 14

Issue: 3

Article Number: 034003

DOI: 10.1088/1748-9326/ab0071

Published: MAR 2019

Document Type:Article


Despite the worldwide presence of pharmaceuticals in the aquatic environment, a comprehensive picture of their aquatic risk (AR) at the global scale has not yet been produced. Here, we present a procedure to estimate ARs of human pharmaceuticals at the freshwater ecoregion level. First, we predicted country- and year-specific per capita consumption with a regression model. Second, we calculated spatially explicit freshwater concentrations via a combination of mass balance models addressing the pharmaceutical's fate in humans, wastewater treatment plants and the environment, respectively. Finally, we divided the freshwater concentrations at the level of individual freshwater ecoregions by the regulatory limit value derived from toxicity tests to arrive at an ecoregion-specific AR. We applied our procedure to model temporal trends (1995-2015) in the ARs of carbamazepine and ciprofloxacin, two widely detected human-use pharmaceuticals of regulatory relevance. Our analysis showed that ARs due to exposure to these pharmaceuticals typically increased 10-20 fold over the last 20 years. Risks due to carbamazepine exposure were still typically low for the period assessed (AR < 0.1), although some more densely populated and/or arid ecoregions showed higher ARs (up to 1.1). Risks for ciprofloxacin were much higher, with ARs larger than 1 for 223 of 449 freshwater ecoregions in 2015. Comparison with measured concentrations in ten river basins showed that carbamazepine concentrations were predicted well. Concentrations of ciprofloxacin, measured in four river basins, were, however, generally underestimated by our model by one to two orders of magnitude. We conclude that our procedure provides a good starting point for evaluating ARs of a wide range of human pharmaceuticals at the global scale.
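The three-step procedure described above reduces to a simple ratio once a predicted freshwater concentration is in hand. A sketch with entirely hypothetical inputs; the mass balance here is collapsed to a single well-mixed dilution, far simpler than the spatially explicit models used in the study:

```python
# Sketch of the ecoregion risk ratio: a predicted freshwater concentration
# divided by a toxicity-based regulatory limit. The mass balance is collapsed
# to a single well-mixed dilution; all parameter values are hypothetical.

def freshwater_concentration(consumption_mg_per_cap_day, population,
                             excreted_fraction, wwtp_removal,
                             river_flow_l_per_day):
    """Predicted concentration (mg/L) from consumption, excretion and removal."""
    load_mg_per_day = (consumption_mg_per_cap_day * population *
                       excreted_fraction * (1 - wwtp_removal))
    return load_mg_per_day / river_flow_l_per_day

def aquatic_risk(pec_mg_per_l, limit_mg_per_l):
    """AR > 1 flags a potential ecological risk."""
    return pec_mg_per_l / limit_mg_per_l

pec = freshwater_concentration(consumption_mg_per_cap_day=1.0,
                               population=1_000_000, excreted_fraction=0.3,
                               wwtp_removal=0.9, river_flow_l_per_day=5e9)
print(aquatic_risk(pec, limit_mg_per_l=1e-5))
```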


Migration and potential risk of trace phthalates in bottled water: A global situation



Increasing attention has been dedicated to trace phthalates in bottled water due to serious public health concerns, yet there is still a lack of systematic analysis and assessment of the current global situation. By analyzing five representative phthalates in bottled water from over 20 countries, this work revealed the phthalate-associated potential risks in terms of both human daily intake and estrogenic effect. In the risk assessment, kinetic models were also developed to describe and predict phthalate migration. In more than three hundred brands of bottled water from twenty-one countries, the detection frequency of the five targeted phthalates was in the order of dibutyl phthalate (DBP, 67.6%), di-2-(ethyl hexyl) phthalate (DEHP, 61.7%), diethyl phthalate (DEP, 47.1%), benzyl butyl phthalate (BBP, 36.9%), and dimethyl phthalate (DMP, 30.1%). Among the countries studied, the five highest average DEHP concentrations in bottled water were found, in descending order, in Thailand, Croatia, the Czech Republic, Saudi Arabia and China, at 61.1, 8.8, 6.3, 6.2 and 6.1 μg/L, respectively. The average levels of BBP, DBP, DMP and DEP in bottled water from Pakistan were high: DEP and DMP ranked 1st among all countries with average levels of 22.4 and 50.2 μg/L, while BBP and DBP ranked 2nd and 3rd with average levels of 7.5 and 17.8 μg/L, respectively. The human daily intake-based risk assessment revealed that phthalates in the bottled waters studied would not pose a serious public health concern. However, the adverse estrogenic effects of phthalates in bottled water from some countries appeared to be significant. This study sheds light on the global situation of phthalates in bottled water; more effort is needed to systematically examine the phthalate-related safety of bottled water.
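Migration kinetics of the kind mentioned above are often approximated as a first-order approach to an equilibrium concentration. The abstract does not specify the model form or its parameters, so the sketch below is a generic illustration with hypothetical values:

```python
import math

# Generic first-order migration model: the concentration in the water
# approaches an equilibrium value c_eq with rate constant k. The abstract
# does not give the model form or parameters; both values here are
# hypothetical, chosen only to illustrate the shape of the curve.

def migrated_concentration(t_days, c_eq_ug_per_l, k_per_day):
    """Concentration (ug/L) migrated into the water after t days of contact."""
    return c_eq_ug_per_l * (1 - math.exp(-k_per_day * t_days))

# Predicted migration over a 90-day storage period:
for t in (1, 30, 90):
    print(t, round(migrated_concentration(t, c_eq_ug_per_l=8.0, k_per_day=0.05), 2))
```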


Profiling of intracellular and extracellular antibiotic resistance genes in tap water



Antibiotic resistance genes (ARGs) have gained global attention due to the public health threat they pose. Extracellular ARGs (eARGs) can disseminate antibiotic resistance as free-living ARGs in natural environments, where they can promote the transmission of antibiotic-resistant bacteria (ARB) in drinking water distribution systems. However, eARG pollution in tap water has not been well researched. In this study, concentrations of eARGs and intracellular ARGs (iARGs) in tap water sampled at Tianjin, China, were investigated for one year. Fourteen eARG types were found, at a highest concentration of 1.3 × 10⁵ gene copies (GC)/L. TetC was detected in 66.7% of samples, followed by sul1, sul2, and qnrA, each with a detection frequency of 41.7%. Fifteen iARGs (including tetA, tetB, tetM, tetQ, tetX, sul1, sul2, sul3, ermB, blaTEM, and qnrA) were continuously detected in all collected tap water samples, with sul1 and sul2 the most abundant. Additionally, both eARG and iARG concentrations in tap water showed a seasonal pattern, with the highest abundance in summer. The observed concentrations of intracellular sulfonamide resistance genes showed a significantly positive correlation with total nitrogen concentrations. This study suggests that eARG and iARG pollution of drinking water systems poses a potential risk to public health.


Perfluoroalkyl acids in drinking water of China in 2017: Distribution characteristics, influencing factors and potential risks

By: Li, Yuna; Li, Jiafu; Zhang, Lifen; Huang, Zhiping; Liu, Yunqing; Wu, Nan; He, Jiahui; Zhang, Zhaozhao; Zhang, Ying; Niu, Zhiguang



Volume: 123

Pages: 87-95

DOI: 10.1016/j.envint.2018.11.036

Published: FEB 2019

Document Type:Article


Perfluoroalkyl acids (PFAAs) are a group of emerging persistent organic pollutants (POPs) that have been ubiquitously detected in environmental media. However, national-scale investigations of their occurrence and distribution in drinking water are still insufficient. In this study, we detected the 17 priority PFAAs in drinking water from 79 cities in 31 provincial-level administrative regions throughout China, and investigated their occurrence and distribution. Additionally, we analyzed the factors influencing their profiles, such as the existence of industrial sources and socioeconomic factors (population density and GDP), and assessed levels of risk associated with contaminated drinking water. On the national scale, the sum concentration of the 17 PFAAs (Σ17PFAAs) in drinking water ranged from 4.49 to 174.93 ng/L with a mean value of 35.13 ng/L. Among the 17 individual PFAAs, perfluorobutanoic acid (PFBA) was the most abundant, with a median concentration of 17.87 ng/L, followed by perfluorooctanoic acid (PFOA, 0.74 ng/L), perfluorononanoic acid (PFNA, 0.40 ng/L) and perfluorooctane sulfonic acid (PFOS, 0.25 ng/L). The geographic distribution of Σ17PFAAs in drinking water followed a descending order of Southwestern China (57.67 ng/L) > Eastern coastal China (32.85 ng/L) > Middle China (29.89 ng/L) > Northwestern China (28.49 ng/L) > Northeastern China (22.03 ng/L), and, in general, the existence of industrial sources positively affected the contamination levels of PFAAs in drinking water. The pollution level of PFAAs in drinking water also varied among the three city levels (medium-sized city > big city > town). In towns, positive correlations were observed between population density and Σ17PFAAs (R² = 0.45, p < 0.01), and the individual concentrations of PFHxA, PFBS, and PFOA (p < 0.01). Moreover, except in Yunnan, Jiangsu, and Jiangxi, concentrations of the relevant PFAAs in drinking water from the other 28 provinces were below the suggested drinking water advisories. The relatively higher concentrations of PFAAs in Yunnan, Jiangsu, and Jiangxi suggest that further studies focusing on their sources and potential health risks to humans are needed.


Transformation of endocrine disrupting chemicals, pharmaceutical and personal care products during drinking water disinfection

By: Leusch, Frederic D. L.; Neale, Peta A.; Busetti, Francesco; Card, Marcella; Humpage, Andrew; Orbell, John D.; Ridgway, Harry F.; Stewart, Matthew B.; van de Merwe, Jason P.; Escher, Beate I.



Volume: 657

Pages: 1480-1490

DOI: 10.1016/j.scitotenv.2018.12.106

Published: MAR 20 2019

Document Type: Article


Pharmaceuticals and personal care products (PPCPs) and endocrine disrupting compounds (EDCs) are frequently detected in drinking water sources. This raises concerns about the formation of potentially more toxic transformation products (TPs) after drinking water disinfection. This study applied a combination of computational and experimental methods to investigate the biological activity of eight EDCs and PPCPs commonly detected in source waters (acetaminophen, bisphenol A, carbamazepine, estrone, 17α-ethinylestradiol, gemfibrozil, naproxen and triclosan) before and after disinfection. Using a Stepped Forced Molecular Dynamics (SFMD) method, we detected 911 unique TPs, 36% of which have been previously reported in the scientific literature. We calculated the likelihood that TPs would cause damage to biomolecules or DNA relative to the parent compound based on lipophilicity and the occurrence of structural alerts, and applied two Quantitative Structure-Activity Relationship (QSAR) tools to predict toxicity via receptor-mediated effects. In parallel, batch experiments were performed with three disinfectants: chlorine, chlorine dioxide and chloramine. After solid-phase extraction, the resulting TP mixtures were analyzed by chemical analysis and a battery of eleven in vitro bioassays covering a variety of endpoints. The laboratory results were in good agreement with the predictions. Overall, the combination of computational and experimental chemistry and toxicity methods used in this study suggests that disinfection of the studied EDCs and PPCPs will produce a large number of TPs, which are unlikely to increase specific toxicity (e.g., endocrine activity), but may result in increased reactive and non-specific toxicity. (C) 2018 Elsevier B.V. All rights reserved.


Trends in neonicotinoid pesticide residues in food and water in the United States, 1999-2015

By: Craddock, Hillary A.; Huang, Dina; Turner, Paul C.; Quiros-Alcala, Lesliam; Payne-Sturges, Devon C.



Volume: 18

Article Number: 7

DOI: 10.1186/s12940-018-0441-7

Published: JAN 11 2019

Document Type: Article


Background: Neonicotinoids are a class of systemic insecticides widely used on food crops globally. These pesticides may be found in off-target food items and persist in the environment. Despite the potential for extensive human exposure, there are limited studies regarding the prevalence of neonicotinoid residues in foods sold and consumed in the United States.
Methods: Residue data for seven neonicotinoid pesticides collected between 1999 and 2015 by the US Department of Agriculture's Pesticide Data Program (PDP) were collated and summarized by year across various food commodities, including fruit, vegetable, meat, dairy, grain, honey, and baby food, as well as water, to qualitatively describe and examine trends in contamination frequency and residue concentrations.
Results: The highest detection frequencies (DFs) for neonicotinoids by year on all commodities were generally below 20%. Average DFs over the entire study period, 1999-2015, for domestic and imported commodities were similar at 4.5%. For all samples (both domestic and imported), imidacloprid was the neonicotinoid with the highest overall detection frequency at 12.0%. However, higher DFs were observed for specific food commodity-neonicotinoid combinations, such as cherries (45.9%), apples (29.5%), pears (24.1%) and strawberries (21.3%) for acetamiprid; and cauliflower (57.5%), celery (20.9%), cherries (26.3%), cilantro (30.6%), grapes (28.9%), collard greens (24.9%), kale (31.4%), lettuce (45.6%), potatoes (31.2%) and spinach (38.7%) for imidacloprid. Neonicotinoids were also detected in organic commodities (DF < 6%). Individual commodities with at least 5% of samples testing positive for two or more neonicotinoids included apples, celery, and cherries. Generally, neonicotinoid residues on food commodities did not exceed US Environmental Protection Agency tolerance levels. Increases in detection trends for imidacloprid in both finished and untreated water samples were observed from 2004 to 2011.
Conclusions: Analysis of PDP data indicates that low levels of neonicotinoids are present in commonly consumed fruits and vegetables sold in the US. Trends in detection frequencies suggest an increase in the use of acetamiprid, clothianidin and thiamethoxam as replacements for imidacloprid. Given these findings, more extensive surveillance of the food and water supply is warranted, as well as biomonitoring studies and assessment of cumulative daily intake in high-risk groups, including pregnant women and infants.


Moving from the traditional paradigm of pathogen inactivation to controlling antibiotic resistance in water – Role of ultraviolet irradiation

By: Umar, Muhammad; Roddick, Felicity; Fan, Linhua



Volume: 662

Pages: 923-939

DOI: 10.1016/j.scitotenv.2019.01.289

Published: APR 20 2019

Document Type: Review


Ultraviolet (UV) irradiation has proven an effective tool for inactivating microorganisms in water. There is, however, a need to look at disinfection from a different perspective, because microbial inactivation alone may not be sufficient to ensure the microbiological safety of the treated water: pathogenic genes may still be present, even after disinfection. Antibiotic resistance genes (ARGs) are of particular concern since they enable microorganisms to become resistant to antibiotics. UV irradiation has been widely used for disinfection and, more recently, for destroying ARGs. While UV lamps remain the principal technology to achieve this objective, UV light emitting diodes (UV-LEDs) are novel sources of UV irradiation and have increasingly been reported in lab-scale investigations as a potential alternative. This review discusses the current state of the applications of UV technology for controlling antibiotic resistance during water and wastewater treatment. Since UV-LEDs possess several attractive advantages over conventional UV lamps, the impact of UV-LED characteristics (single vs combined wavelengths, and operational parameters such as periodic or pulsed and continuous irradiation, pulse repetition frequencies, duty cycle), type of organism, and fluence response are critically reviewed with a view to highlighting the research needs for addressing future disinfection challenges. The energy efficiency of the reported UV processes is also evaluated with a focus on relating the findings to disinfection efficacy. The greater experience with UV lamps could be useful for investigating UV-LEDs for similar applications (i.e., antibiotic resistance control), and hence the identification of future research directions. (C) 2019 Elsevier B.V. All rights reserved.
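For the fluence-response behaviour discussed above, UV disinfection work is typically expressed in terms of delivered dose (fluence) and a first-order fluence-response constant. A minimal sketch of that arithmetic; the rate constant and operating point below are illustrative assumptions, not figures from the review:

```python
# Dose arithmetic underlying UV fluence-response studies:
# fluence = average irradiance x exposure time, and, for a first-order
# response, log10 reduction = k x fluence. All numeric values are assumed.

def fluence_mj_cm2(irradiance_mw_cm2: float, seconds: float) -> float:
    """UV dose (mJ/cm²) delivered at a given average irradiance."""
    return irradiance_mw_cm2 * seconds

def log_inactivation(fluence: float, k: float) -> float:
    """First-order fluence-response: log10 reduction = k * fluence."""
    return k * fluence

dose = fluence_mj_cm2(0.2, 200)    # 40 mJ/cm², a common design dose
print(log_inactivation(dose, 0.1)) # log10 reduction for an assumed k
```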


Analysis of endocrine activity in drinking water, surface water and treated wastewater from six countries.

Water Research – Volume: 139; Pages: 10-18; Published: 2018; Document Type: Journal Article; DOI: 10.1016/j.watres.2018.03.056

The aquatic environment can contain numerous micropollutants and there are concerns about endocrine activity in environmental waters and the potential impacts on human and ecosystem health. In this study a complementary chemical analysis and in vitro bioassay approach was applied to evaluate endocrine activity in treated wastewater, surface water and drinking water samples from six countries (Germany, Australia, France, South Africa, the Netherlands and Spain). The bioassay test battery included assays indicative of seven endocrine pathways, while 58 different chemicals, including pesticides, pharmaceuticals and industrial compounds, were analysed by targeted chemical analysis. Endocrine activity was below the limit of quantification for most water samples, with only two of six treated wastewater samples and two of six surface water samples exhibiting estrogenic, glucocorticoid, progestagenic and/or anti-mineralocorticoid activity above the limit of quantification. Based on available effect-based trigger values (EBT) for estrogenic and glucocorticoid activity, some of the wastewater and surface water samples were found to exceed the EBT, suggesting these environmental waters may pose a potential risk to ecosystem health. In contrast, the lack of bioassay activity and low detected chemical concentrations in the drinking water samples do not suggest a risk to human endocrine health, with all samples below the relevant EBTs. All rights reserved, Elsevier


Health protective behavior following required arsenic testing under the New Jersey Private Well Testing Act.

Author(s): Flanagan, S. V.; Gleason, J. A.; Spayd, S. E.; Procopio, N. A.; Rockafellow-Baldoni, M.; Braman, S.; Chillrud, S. N.; Zheng, Y.

Source: International Journal of Hygiene and Environmental Health, 221 (6):929-940; 10.1016/j.ijheh.2018.05.008 2018

Abstract: Exposure to naturally occurring arsenic in groundwater is a public health concern, particularly for households served by unregulated private wells. At present, one of the greatest barriers to exposure reduction is a lack of private well testing due to difficulties in motivating individual private well owners to take protective actions. Policy and regulations requiring testing could make a significant contribution towards universal screening of private well water and arsenic exposure reduction. New Jersey’s Private Well Testing Act (PWTA) requires tests for arsenic during real estate transactions; however, the regulations do not require remedial action when maximum contaminant levels (MCLs) are exceeded. A follow-up survey sent to residents of homes where arsenic was measured above the state MCL in PWTA-required tests reveals a range of mitigation behavior among respondents (n = 486), from taking no action to reduce exposure (28%), to reporting both treatment use and appropriate maintenance and monitoring behavior (15%). Although 86% of respondents recall their well was tested during their real estate transaction, only 60% report their test showed an arsenic problem. Treatment systems are used by 63% of households, although half were installed by a previous owner. Among those treating their water (n = 308), 57% report that maintenance is being performed as recommended, although only 31% have tested the treated water within the past year. Perceived susceptibility and perceived barriers are strong predictors of mitigation action. Among those treating for arsenic, perceived severity is associated with recent monitoring, and level of commitment is associated with proper maintenance. Mention of a treatment service agreement is a strong predictor of appropriate monitoring and maintenance behavior, while treatment installed by a previous owner is less likely to be maintained. 
Though the PWTA requires that wells be tested, this study finds that not all current well owners are aware the test occurred or understood the implications of their arsenic results. Among those that have treatment installed to remove arsenic, poor monitoring and maintenance behaviors threaten to undermine intentions to reduce exposure. Findings suggest that additional effort, resources, and support to ensure home buyers pay attention to, understand, and act on test results at the time they are performed may help improve management of arsenic water problems over the long term and thus the PWTA’s public health impact.


Exploratory Assessment of Risks from Drinking and Recreational Water Exposure to Children in the State of New Jersey

Brandon M. Owen and Neha Sunger * 

*Author to whom correspondence should be addressed.


In this study, we conducted a worst-case risk assessment for children's health from ingestion exposure to water sources in two densely populated counties of the Piedmont province of New Jersey: Hunterdon and Mercer. Carcinogenic and non-carcinogenic health risk estimates were generated for 19 contaminants representing 3 different chemical classes (organic, inorganic and contaminants of emerging concern (CEC)) for which environmental monitoring data are available. The three exposure scenarios examined were: (1) ingestion exposure to untreated groundwater from contaminated private wells; (2) recreational exposure through incidental ingestion of water from the Delaware River; and (3) ingestion exposure through consumption of fish sourced from the Delaware River. The total health hazard posed by each contaminant across all three exposure scenarios was compared to prioritize contaminants based on health risk potential. This analysis identified arsenic and trichloroethylene in private well water as the key drivers of health risk; they are therefore proposed as the contaminants of primary concern for the target population. A significantly elevated total excess cancer risk of 2.13 × 10⁻³ from arsenic exposure was estimated, highlighting the need for testing and treating water sources as well as setting a framework for more detailed work in the future.
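The excess-cancer-risk figure quoted above is the kind of output produced by the standard US EPA ingestion equations: a chronic daily intake (CDI) multiplied by an oral cancer slope factor. A hedged sketch using common child exposure defaults and the IRIS oral slope factor for arsenic as assumed inputs, not the authors' actual parameters:

```python
# Standard ingestion risk equations (not the authors' code):
# CDI (mg/kg-day) = (C * IR * EF * ED) / (BW * AT), ELCR = CDI * CSF.
# All parameter values below are illustrative assumptions.

def chronic_daily_intake(c_mg_l, ir_l_day, ef_days_yr, ed_years,
                         bw_kg, at_days):
    """CDI (mg/kg-day) = (C * IR * EF * ED) / (BW * AT)."""
    return (c_mg_l * ir_l_day * ef_days_yr * ed_years) / (bw_kg * at_days)

def excess_cancer_risk(cdi, slope_factor):
    """ELCR = CDI * oral cancer slope factor (per mg/kg-day)."""
    return cdi * slope_factor

cdi = chronic_daily_intake(c_mg_l=0.010,   # assumed 10 µg/L arsenic in well water
                           ir_l_day=0.78,  # assumed child drinking-water intake
                           ef_days_yr=350, ed_years=6,
                           bw_kg=15, at_days=70 * 365)
print(f"{excess_cancer_risk(cdi, 1.5):.1e}")  # 1.5 = IRIS oral slope factor for arsenic
```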


Drinking water in West Virginia (USA): tap water or bottled water – what is the right choice for college students?

J Water Health (2018) 16 (5): 827-838.


West Virginia has had a history of water quality issues. In parallel, the world is facing a plastic pollution crisis. In order to better understand behavioral responses to perceived water quality, a survey was conducted at a major research university to ask participants about water quality perceptions and drinking water behaviors. A total of 4,188 students completed the survey during the Spring 2017 semester. Logistic regression analyses were used to predict behaviors. Results indicated that a third of the student population primarily used bottled water for drinking purposes at home, while 39% used a filter at home and 26% drank water directly from the tap. On campus, bottled water use was reported by 36% of the students, water fountain use represented 31%, and 29% of the students brought their own water with reusable cups/bottles. Health risk perceptions, organoleptic perceptions (i.e., taste, odor, color), and environmental concern were predictors of the different behaviors. Students originally from West Virginia had a higher propensity of using bottled water. We argue that bottled water consumption should be reduced in areas where water quality is not an issue. In this sense, there is a need for education among the student population in West Virginia.

Environmentally Relevant Chemical Mixtures of Concern in Waters of United States Tributaries to the Great Lakes

Elliott, SM; Brigham, ME; Kiesling, RL; Schoenfuss, HL; Jorgenson, ZG


Abstract: The North American Great Lakes are a vital natural resource that provide fish and wildlife habitat, as well as drinking water and waste assimilation services for millions of people. Tributaries to the Great Lakes receive chemical inputs from various point and nonpoint sources, and thus are expected to have complex mixtures of chemicals. However, our understanding of the co-occurrence of specific chemicals in complex mixtures is limited. To better understand the occurrence of specific chemical mixtures in the US Great Lakes Basin, surface water from 24 US tributaries to the Laurentian Great Lakes was collected and analyzed for diverse suites of organic chemicals, primarily focused on chemicals of concern (e.g., pharmaceuticals, personal care products, fragrances). A total of 181 samples and 21 chemical classes were assessed for mixture compositions. Basin wide, 1664 mixtures occurred in at least 25% of sites. The most complex mixtures identified comprised 9 chemical classes and occurred in 58% of sampled tributaries. Pharmaceuticals typically occurred in complex mixtures, reflecting pharmaceutical-use patterns and wastewater facility outfall influences. Fewer mixtures were identified at lake or lake-influenced sites than at riverine sites. As mixture complexity increased, the probability of a specific mixture occurring more often than by chance greatly increased, highlighting the importance of understanding source contributions to the environment. This empirically based analysis of mixture composition and occurrence may be used to focus future sampling efforts or mixture toxicity assessments. (C) 2018 SETAC


Toxicological risk assessment and prioritization of drinking water relevant contaminants of emerging concern

Baken, Kirsten A.; Sjerps, Rosa M. A.; Schriks, Merijn; van Wezel, Annemarie P.

ENVIRONMENT INTERNATIONAL, 118 293-303; 10.1016/j.envint.2018.05.006 SEP 2018

Abstract: Toxicological risk assessment of contaminants of emerging concern (CEC) in (sources of) drinking water is required to identify potential health risks and prioritize chemicals for abatement or monitoring. In such assessments, concentrations of chemicals in drinking water or sources are compared to either (i) health-based (statutory) drinking water guideline values, (ii) provisional guideline values based on recent toxicity data in absence of drinking water guidelines, or (iii) generic drinking water target values in absence of toxicity data. Here, we performed a toxicological risk assessment for 163 CEC that were selected as relevant for drinking water. This relevance was based on their presence in drinking water and/or groundwater and surface water sources in downstream parts of the Rhine and Meuse, in combination with concentration levels and physicochemical properties. Statutory and provisional drinking water guideline values could be derived from publicly available toxicological information for 142 of the CEC. Based on measured concentrations it was concluded that the majority of substances do not occur in concentrations which individually pose an appreciable human health risk. A health concern could, however, not be excluded for vinyl chloride, trichloroethene, bromodichloromethane, aniline, phenol, 2-chlorobenzenamine, mevinphos, 1,4-dioxane, and nitrilotriacetic acid. For part of the selected substances, toxicological risk assessment for drinking water could not be performed since either toxicity data (hazard) or drinking water concentrations (exposure) were lacking. In absence of toxicity data, the Threshold of Toxicological Concern (TTC) approach can be applied for screening-level risk assessment. The toxicological information on the selected substances was used to evaluate whether drinking water target values based on existing TTC levels are sufficiently protective for drinking water relevant CEC.
Generic drinking water target levels of 37 µg/L for Cramer class I substances and 4 µg/L for Cramer class III substances in drinking water were derived based on these CEC. These levels are in line with previously reported generic drinking water target levels based on original TTC values and are shown to be protective against health effects of the majority of contaminants of emerging concern evaluated in the present study. Since the human health impact of many chemicals appearing in the water cycle has been studied insufficiently, generic drinking water target levels are useful for early warning and prioritization of CEC with unknown toxicity in drinking water and its sources for future monitoring.
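The TTC screening step described above amounts to a one-line comparison against a Cramer-class generic target level (37 µg/L for class I, 4 µg/L for class III, as derived in the study). A minimal sketch:

```python
# TTC-based screening when no toxicity data exist: compare a measured
# drinking-water concentration to the generic target for its Cramer class.
# The target levels come from the abstract; the sample values are invented.

TARGET_UG_L = {"I": 37.0, "III": 4.0}

def ttc_screen(conc_ug_l: float, cramer_class: str) -> str:
    """Flag a CEC for prioritization when it exceeds its generic target."""
    target = TARGET_UG_L[cramer_class]
    return "prioritize" if conc_ug_l > target else "low concern"

print(ttc_screen(0.5, "III"))   # low concern
print(ttc_screen(12.0, "III"))  # prioritize
```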




Recently detected drinking water contaminants: genX and other Per- and polyfluoroalkyl ether acids.

Hopkins, Z. R.; Sun, M.; DeWitt, J. C.; Knappe, D. R. U.

Journal – American Water Works Association, 110 (7):13-28; 10.1002/awwa.1073 2018

Abstract: For several decades, a common processing aid in the production of fluoropolymers was the ammonium salt of perfluorooctanoic acid (PFOA). Because PFOA is persistent, bioaccumulative, and toxic, its production and use are being phased out in the United States. In 2009, the US Environmental Protection Agency stipulated conditions for the manufacture and commercial use of GenX, a PFOA replacement. While GenX is produced for commercial purposes, the acid form of GenX is also generated as a byproduct during the production of fluoromonomers. The discovery of high concentrations of GenX and related perfluoroalkyl ether acids (PFEAs) in the Cape Fear River and in the finished drinking water of more than 200,000 North Carolina residents required quick action by researchers, regulators, public health officials, commercial laboratories, drinking water providers, and consulting engineers. Information about the sources and toxicity of GenX, as well as an analytical method for the detection of GenX and eight related PFEAs, is presented. GenX/PFEA occurrence in water and GenX/PFEA removal by different drinking water treatment processes are also discussed. © 2018 American Water Works Association.


Differential development of Legionella sub-populations during short- and long-term starvation

Schrammel, Barbara; Cervero-Arago, Silvia; Dietersdorfer, Elisabeth; Walochnik, Julia; Lueck, Christian; Sommer, Regina; Kirschner, Alexander

WATER RESEARCH, 141 417-427; 10.1016/j.watres.2018.04.027 SEP 15 2018

Abstract: Legionellae are among the most important waterborne pathogens in industrialized countries. Monitoring and surveillance of Legionella in engineered water systems is usually performed with culture-based methods. Since the advent of culture-independent techniques, it has become clear that Legionella concentrations are often several orders of magnitude higher than those measured by culture-based techniques and that a variable proportion of these non-culturable cells are viable. In engineered water systems, the formation of these viable but non-culturable (VBNC) cells can be caused by different kinds of stress, such as, and most importantly, nutrient starvation, oxidative stress and heat. In this study, the formation of VBNC cells of six Legionella strains under conditions of starvation was monitored in mono species microcosms for up to one year using a combination of different viability indicators. Depending on the strain, complete loss of culturability was observed from 11 days to 8 weeks. During the starvation process, three distinct phases and different sub-populations of VBNC cells were identified. Until complete loss of culturability, the number of membrane-intact cells decreased rapidly to 5.5-69% of the initial cell concentration. The concentration of the sub-population with low esterase activity dropped to 0.03-55%, and the concentration of the highly esterase-active sub-population dropped to 0.01-1.2% of the initial concentration; these sub-populations remained stable for several weeks to months. Only after approximately 200 days of starvation, the number of VBNC cells started to decrease below detection limits. The most abundant VBNC sub-populations were characterized by partially damaged membranes and low esterase-activity. With this study, we showed that upon starvation, a stable VBNC Legionella community may be present over several months in a strain-dependent manner even under harsh conditions. 
Even after one year of starvation, a small proportion of L. pneumophila cells with high esterase activity was detected. We speculate that this highly active VBNC subpopulation is able to infect amoebae and human macrophages. (C) 2018 The Authors. Published by Elsevier Ltd.



Risk factors for sporadic Giardia infection in the USA: a case-control study in Colorado and Minnesota

Reses, H. E.; Gargano, J. W.; Liang, J. L.; Cronquist, A.; Smith, K.; Collier, S. A.; Roy, S. L.; Vanden Eng, J.; Bogard, A.; Lee, B.; Hlavsa, M. C.; Rosenberg, E. S.; Fullerton, K. E.; Beach, M. J.; Yoder, J. S.

EPIDEMIOLOGY AND INFECTION, 146 (9):1071-1078; 10.1017/S0950268818001073 JUL 2018

Abstract: Giardia duodenalis is the most common intestinal parasite of humans in the USA, but the risk factors for sporadic (non-outbreak) giardiasis are not well described. The Centers for Disease Control and Prevention and the Colorado and Minnesota public health departments conducted a case-control study to assess risk factors for sporadic giardiasis in the USA. Cases (N = 199) were patients with non-outbreak-associated laboratory-confirmed Giardia infection in Colorado and Minnesota, and controls (N = 381) were matched by age and site. Identified risk factors included international travel (aOR = 13.9; 95% CI 4.9-39.8), drinking water from a river, lake, stream, or spring (aOR = 6.5; 95% CI 2.0-20.6), swimming in a natural body of water (aOR = 3.3; 95% CI 1.5-7.0), male-male sexual behaviour (aOR = 45.7; 95% CI 5.8-362.0), having contact with children in diapers (aOR = 1.6; 95% CI 1.01-2.6), taking antibiotics (aOR = 2.5; 95% CI 1.2-5.0) and having a chronic gastrointestinal condition (aOR = 1.8; 95% CI 1.1-3.0). Eating raw produce was inversely associated with infection (aOR = 0.2; 95% CI 0.1-0.7). Our results highlight the diversity of risk factors for sporadic giardiasis and the importance of non-international-travel-associated risk factors, particularly those involving person-to-person transmission. Prevention measures should focus on reducing risks associated with diaper handling, sexual contact, swimming in untreated water, and drinking untreated water.
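The adjusted odds ratios (aORs) above come from matched, adjusted models, but the underlying association measure is the odds ratio from a 2×2 exposure table. A stdlib sketch of a crude OR with a Wald 95% confidence interval; the counts are invented for illustration:

```python
# Crude odds ratio with Wald 95% CI from a 2x2 case-control table.
# OR = (a*d)/(b*c); SE(ln OR) = sqrt(1/a + 1/b + 1/c + 1/d).
# The cell counts are hypothetical, not from the study.
import math

def odds_ratio_ci(a, b, c, d):
    """a, b = exposed cases/controls; c, d = unexposed cases/controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(30, 10, 169, 371)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")  # OR = 6.6 (95% CI 3.1-13.8)
```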


Assessment of the Water Treatment Process’s Empirical Model Predictions for the Management of Aesthetic and Health Risks Associated with Cyanobacteria

Zamyadi, Arash; Henderson, Rita K.; Newton, Kelly; Capelo-Neto, Jose; Newcombe, Gayle

WATER, 10 (5):10.3390/w10050590 MAY 2018

Abstract: Potentially toxic cyanobacteria have been increasingly detected worldwide in water supply systems in recent years. The management of cyanobacteria in source water and through drinking water treatment processes has been a focus of global research for over thirty years. However, despite the volume of research outcomes and the publication of guidance documents, gaps still exist in the knowledge base that inhibit the confident application of individual treatment strategies for the mitigation of aesthetic and health risks associated with cyanobacteria and their metabolites at full scale. The main objective of this project is to deliver a suite of tools and other resources to the water industry to support the implementation of a regulatory framework for the management of water quality for the assessment and management of aesthetic and toxicity risks associated with cyanobacteria. This study includes (1) the development of a guide (based on real-world examples) for treatment plant operators to perform plant audits and investigative sampling to assess the risk associated with cyanobacteria in their plants, and validate the performance of existing unit processes, and (2) the validation of a treatment model that can be applied at any plant and used as a guide to the removal of cyanobacteria and metabolites and the expected quality of treated water under a range of challenges from cyanobacteria. Full-scale sampling was undertaken at three Australian regions in 14 water treatment plants to validate the model. The results presented in this paper represent a comprehensive database of full-scale removal efficiencies of 2-methylisoborneol (MIB) and geosmin for a range of water quality and treatment processes.
The major findings and conclusions from this project include: (1) the investigative sampling procedures developed are effective and have been successfully applied by utilities; and (2) while routine monitoring data is important, investigative sampling within the water treatment plant provides more detailed and insightful information about the effectiveness of unit processes within the plant. This paper also identifies the knowledge gaps and needs for further studies.




Real-Time Online Monitoring for Assessing Removal of Bacteria by Reverse Osmosis

Fujioka, Takahiro; Hoang, Anh T.; Aizawa, Hidenobu; Ashiba, Hiroki; Fujimaki, Makoto; Leddy, Menu

ENVIRONMENTAL SCIENCE & TECHNOLOGY LETTERS, 5 (6):389-393; 10.1021/acs.estlett.8b00200 JUN 2018

Abstract: Rigorous monitoring of microbial water quality is essential to ensure the safety of recycled water after advanced treatment for indirect and direct potable reuse. This study evaluated real-time bacterial monitoring for assessing reverse osmosis (RO) treatment for the removal of bacteria. A strategy was employed to monitor bacterial counts online and in real time in the RO feed and permeate water using a real-time continuous bacteriological counter. Over the course of 68 h of pilot-scale testing, bacterial counts were monitored in real time over approximate ranges from 1 × 10³ to 4 × 10⁴ and from 4 to 342 counts/mL in the RO feed (ultrafiltration-treated wastewater) and permeate, respectively. The results indicate that the bacteriological counter can track the variations in bacterial counts in the RO feed and permeate. Bacterial concentrations were confirmed by epi-fluorescence microscopy for total bacterial counts. A high correlation (R² = 0.83) was identified between the online bacterial counts and epi-fluorescence counts in the RO feed; a negligible correlation was observed for RO permeate. In this study, we evaluated a real-time bacteriological counter (i.e., counts per milliliter every second) to ensure continuous removal of bacterial contaminants by RO treatment.


Comparison of Database Search Methods for the Detection of Legionella pneumophila in Water Samples Using Metagenomic Analysis

Borthong, Jednipit; Omori, Ryosuke; Sugimoto, Chihiro; Suthienkul, Orasa; Nakao, Ryo; Ito, Kimihito

FRONTIERS IN MICROBIOLOGY, 9 10.3389/fmicb.2018.01272 JUN 19 2018

Abstract: Metagenomic analysis has become a powerful tool to analyze bacterial communities in environmental samples. However, the detection of a specific bacterial species using metagenomic analysis remains difficult due to false positive detections of sequences shared between different bacterial species. In this study, 16S rRNA amplicon and shotgun metagenomic analyses were conducted on samples collected along a stream and ponds on the campus of Hokkaido University. We compared different database search methods for bacterial detection by focusing on Legionella pneumophila. In this study, we used L. pneumophila-specific nested PCR as a gold standard to evaluate the results of the metagenomic analysis. Comparison with the results from L. pneumophila-specific nested PCR indicated that a blastn search of shotgun reads against the NCBI-NT database led to false positive results and had problems with specificity. We also found that a blastn search of shotgun reads against a database of the catalase-peroxidase (katB) gene detected L. pneumophila with the highest area under the receiver operating characteristic curve among the tested search methods, indicating that a blastn search against the katB gene database had better diagnostic ability than searches against other databases. Our results suggest that sequence searches targeting long genes specifically associated with the bacterial species of interest are a prerequisite to detecting the bacterial species in environmental samples using metagenomic analyses.
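The database comparison above is scored by the area under the ROC curve against nested-PCR truth. AUC can be computed directly as the Mann-Whitney probability that a PCR-positive sample receives a higher search score than a PCR-negative one; the scores and labels below are invented for illustration:

```python
# ROC AUC as the Mann-Whitney statistic: the probability that a randomly
# chosen positive outranks a randomly chosen negative (ties count half).
# Scores (e.g. reads matching a marker gene) and labels are hypothetical.

def roc_auc(scores, labels):
    """AUC = P(score_pos > score_neg) + 0.5 * P(tie)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [120, 35, 8, 2, 0, 15, 60, 1]  # per-sample match scores
labels = [1,   1,  1, 0, 0, 0,  1,  0]  # nested-PCR result (1 = positive)
print(roc_auc(scores, labels))          # 0.9375
```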


National trends in drinking water quality violations

Allaire, M.; Haowei Wu; Lall, U.

Proceedings of the National Academy of Sciences of the United States of America, 115 (9):2078-2083; 10.1073/pnas.1719805115 2018


Ensuring safe water supply for communities across the United States is a growing challenge in the face of aging infrastructure, impaired source water, and strained community finances. In the aftermath of the Flint lead crisis, there is an urgent need to assess the current state of US drinking water. However, no nationwide assessment has yet been conducted on trends in drinking water quality violations across several decades. Efforts to reduce violations are of national concern given that, in 2015, nearly 21 million people relied on community water systems that violated health-based quality standards. In this paper, we evaluate spatial and temporal patterns in health-related violations of the Safe Drinking Water Act using a panel dataset of 17,900 community water systems over the period 1982–2015. We also identify vulnerability factors of communities and water systems through probit regression. Increasing time trends and violation hot spots are detected in several states, particularly in the Southwest region. Repeat violations are prevalent in locations of violation hot spots, indicating that water systems in these regions struggle with recurring issues. In terms of vulnerability factors, we find that violation incidence in rural areas is substantially higher than in urbanized areas. Meanwhile, private ownership and purchased water source are associated with compliance. These findings indicate the types of underperforming systems that might benefit from assistance in achieving consistent compliance. We discuss why certain violations might be clustered in some regions and strategies for improving national drinking water quality.


A Progress Report on Efforts to Address Lead by Public School Districts

Sanborn, L. H.; Carpenter, A. T.

Journal – American Water Works Association, 110 (3):E18-E33; 10.1002/awwa.1022 2018


Media reports in 2016 brought lead contamination of drinking water to public attention, particularly at schools, where young students can be exposed to lead by drinking at contaminated outlets. In an effort to assess nationwide progress on addressing this potential health risk, this study sought to determine the status of lead testing, remediation, and long-term management strategies in public school districts serving the nation's 15 most populous urbanized areas. Data were collected from publicly available information and through direct interaction with school districts. All districts under consideration have implemented some form of US Environmental Protection Agency-recommended lead testing program, and districts with elevated lead levels have performed corrective actions including flushing, outlet repairs/replacement, and filtration. This study outlines districts' testing programs, approaches to lead management, plans for continued monitoring, communication strategies, and self-assessed successes and challenges.


Variability in the chemistry of private drinking water supplies and the impact of domestic treatment systems on water quality

Ander, E. L.; Watts, M. J.; Smedley, P. L.; Hamilton, E. M.; Close, R.; Crabbe, H.; Fletcher, T.; Rimell, A.; Studden, M.; Leonardi, G.

ENVIRONMENTAL GEOCHEMISTRY AND HEALTH, 38 (6):1313-1332; 10.1007/s10653-016-9798-0 DEC 2016


Tap water from 497 properties using private water supplies in an area of metalliferous and arsenic mineralisation (Cornwall, UK) was measured to assess the extent of compliance with chemical drinking water quality standards and how compliance is influenced by householder water treatment decisions. The proportion of analyses exceeding water quality standards was high: 65% of tap water samples exceeded one or more chemical standards. The highest exceedances for health-based standards were nitrate (11%) and arsenic (5%), with a maximum observed arsenic concentration of 440 µg/L. Exceedances were also high for pH (47%), manganese (12%) and aluminium (7%), for which standards are set primarily on aesthetic grounds; however, the highest observed concentrations of manganese and aluminium also exceeded the relevant health-based guidelines. Significant reductions in concentrations of aluminium, cadmium, copper, lead and/or nickel were found in tap waters where households were successfully treating low-pH groundwaters. Similar adventitious reductions were found for arsenic and nickel where treatment was installed for iron and/or manganese removal, and successful treatment specifically intended to decrease tap water arsenic concentrations was observed at two properties where it was installed. However, 31% of samples where pH treatment was reported had pH < 6.5 (the minimum value in the drinking water regulations), suggesting widespread problems with system maintenance. Other examples of ineffectual treatment were seen in failed post-treatment samples, including for nitrate. Even where tap waters are considered to be treated, they may still fail one or more drinking water quality standards. The degree of exceedance of drinking water standards warrants further work to understand environmental controls and the locations of high concentrations.
We also found that residents were more willing to accept drinking water with high metal (iron and manganese) concentrations than international guidelines assume. These findings point to the need for regulators to reinforce guidance on drinking water quality standards to private water supply users, and the long-term health benefits of complying with these standards, even in areas where treated mains water is widely available.


Determining the presence of chemicals with suspected endocrine activity in drinking water from the Madrid region (Spain) and assessment of their estrogenic, androgenic and thyroidal activities

Valcarcel, Y.; Valdehita, A.; Becerra, E.; Alda, M. L. de; Gil, A.; Gorga, M.; Petrovic, M.; Barcelo, D.; Navas, J. M.

Chemosphere, 201 388-398; 10.1016/j.chemosphere.2018.02.099 2018


Endocrine disruptors (EDs) are natural or man-made chemicals that can affect the health of organisms by interfering with their normal hormonal functions. Many of these substances cause effects at very low doses and, given the key role played by the endocrine system in development, organisms in early phases of growth (foetal, childhood, puberty) are especially sensitive to the action of EDs. In addition, when combined, EDs can show additive, antagonistic and synergistic activities. Taking all this into account, it is essential to determine the presence of this kind of compound in drinking water. The main aim of the present study was therefore to monitor the presence of substances with suspected or known endocrine activity in drinking water of the Madrid Region (MR) (Central Spain) and to determine possible estrogenic, androgenic, or thyroidal activities. Water samples were collected at different times from a number of supply points that received water from reservoirs or rivers. The sampling point with the highest concentration of the analysed substances (up to 30 compounds) was DW1 (1203 ng L−1). This sampling point receives water from a drinking water treatment plant (DWTP) that serves the population of the south of the MR with treated water from the Tajuña River. DW2 was the point with the second-highest concentration of the analysed substances (1021 ng L−1); DW2 receives water from one of the reservoirs in the north of the MR. The highest daily concentrations detected corresponded to the flame retardant tris(2-chloroethyl) phosphate (TCEP) (266.55 ng L−1) and to nonylphenol diethoxylate (188.57 ng L−1) at points DW1 and DW4, respectively, both of which are supplied with treated river water. None of the water samples exhibited androgenic, estrogenic, or thyroidal activities in in vitro assays based on cells stably transfected with the receptors of interest and luciferase as a reporter gene.
These results demonstrate that water quality in the MR is high and does not present a health risk for the population, although the concentrations of some substances justify the need for local authorities to continually monitor the presence of these contaminants in order to implement any corrective measures if necessary.


Risk governance of potential emerging risks to drinking water quality: Analysing current practices

Hartmann, Julia; van der Aa, Monique; Wuijts, Susanne; Husman, Ana Maria de Roda; van der Hoek, Jan Peter

ENVIRONMENTAL SCIENCE & POLICY, 84 97-104; 10.1016/j.envsci.2018.02.015 JUN 2018


The presence of emerging contaminants in the aquatic environment may affect human health via exposure to drinking water. Even if some of these emerging contaminants are not a threat to human health, their presence might still influence public perception of drinking water quality. Over the last decades, much research has been done on emerging contaminants in the aquatic environment, most of it focused on identifying emerging contaminants and characterising their toxic potential. However, only limited information is available on whether, and how, this scientific information is implemented in current policy approaches. The opportunities for science to contribute to policy on emerging contaminants in drinking water have therefore not yet been identified.

A comparative analysis was performed of current approaches to the risk governance of emerging chemical contaminants in drinking water (resources) to identify any areas for improvement. The policy approaches used in the Netherlands, Germany, Switzerland and the state of Minnesota were analysed using the International Risk Governance Council framework as a normative concept. Quality indicators for the analysis were selected based on recent literature. Information sources used were scientific literature, policy documents, and newspaper articles.

Subsequently, suggestions for future research for proactive risk governance are given. Suggestions include the development of systematic analytical approaches to various information sources so that potential emerging contaminants to drinking water quality can be identified quickly. In addition, an investigation into the possibility and benefit of including the public concern about emerging contaminants into the risk governance process was encouraged.


Bottled aqua incognita: microbiota assembly and dissolved organic matter diversity in natural mineral waters

Lesaulnier, Celine C.; Herbold, Craig W.; Pelikan, Claus; Berry, David; Gerard, Cedric; Le Coz, Xavier; Gagnot, Sophie; Niggemann, Jutta; Dittmar, Thorsten; Singer, Gabriel A.; Loy, Alexander

MICROBIOME, 5; 10.1186/s40168-017-0344-9 SEP 22 2017



Non-carbonated natural mineral waters contain microorganisms that regularly grow after bottling despite low concentrations of dissolved organic matter (DOM). Yet, the compositions of bottled water microbiota and organic substrates that fuel microbial activity, and how both change after bottling, are still largely unknown.


We performed a multifaceted analysis of microbiota and DOM diversity in 12 natural mineral waters from six European countries. 16S rRNA gene-based analyses showed that fewer than 10 species-level operational taxonomic units (OTUs) dominated the bacterial communities in the water phase and associated with the bottle wall after a short phase of post-bottling growth. Members of the betaproteobacterial genera Curvibacter, Aquabacterium, and Polaromonas (Comamonadaceae) grew in most waters and represent ubiquitous, mesophilic, heterotrophic aerobes in bottled waters. Ultrahigh-resolution mass spectrometry of DOM in bottled waters and their corresponding source waters identified thousands of molecular formulae characteristic of mostly refractory, soil-derived DOM.


The bottle environment, including source water physicochemistry, selected for growth of a similar low-diversity microbiota across the various bottled waters. Relative abundance changes of hundreds of multi-carbon molecules were related to the growth of fewer than ten abundant OTUs. We thus speculate that individual bacteria cope with oligotrophic conditions by simultaneously consuming diverse DOM molecules.


Improved Detection of Norovirus and Hepatitis A Virus in Surface Water by Applying Pre-PCR Processing

Borgmastars, E.; Jazi, M. M.; Persson, S.; Jansson, L.; Radstrom, P.; Simonsson, M.; Hedman, J.; Eriksson, R.

Food and Environmental Virology, 9 (4):395-405; 10.1007/s12560-017-9295-3 2017

Quantitative reverse transcription polymerase chain reaction (RT-qPCR) detection of waterborne RNA viruses generally requires concentration of large water volumes due to low virus levels. A common approach is to use dead-end ultrafiltration followed by precipitation with polyethylene glycol. However, this procedure often co-concentrates PCR inhibitors, which impairs the limit of detection and causes false-negative results. Here, we applied the concept of pre-PCR processing to optimize RT-qPCR detection of norovirus genogroup I (GI), genogroup II (GII), and hepatitis A virus (HAV) in challenging water matrices. The RT-qPCR assay was improved by screening for an inhibitor-tolerant master mix and modifying the primers with twisted intercalating nucleic acid molecules. Additionally, a modified protocol based on a chaotropic lysis buffer and magnetic silica bead nucleic acid extraction was developed for complex water matrices, and was validated on surface and drinking waters. At least a 26-fold improvement was seen in the most complex surface water studied. The modified protocol resulted in average recoveries of 33, 13, 8, and 4% for mengovirus, norovirus GI, GII, and HAV, respectively, and also improved the limit of detection for norovirus GI and HAV. RT-qPCR inhibition, with Cq shifts of 1.6, 2.8, and 3.5 for norovirus GI, GII, and HAV, respectively, obtained with the standard nucleic acid extraction, was completely eliminated by the modified protocol. The standard nucleic acid extraction method worked well on drinking water, with no RT-qPCR inhibition observed and average recoveries of 80, 124, 89, and 32% for mengovirus, norovirus GI, GII, and HAV, respectively.
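The practical meaning of the reported Cq shifts can be quantified with a back-of-envelope relation: assuming roughly exponential amplification, a shift of ΔCq corresponds to a (1 + E)^ΔCq-fold loss of detectable signal, about 2^ΔCq at 100% efficiency. A sketch using the shifts quoted in the abstract (the 100%-efficiency assumption is ours, not the study's):

```python
def fold_underestimation(delta_cq, efficiency=1.0):
    """Fold signal loss implied by a Cq shift, assuming an amplification
    factor of (1 + efficiency) per cycle (2.0 at 100% efficiency)."""
    return (1.0 + efficiency) ** delta_cq

# Cq shifts reported for the standard extraction in the abstract
for target, shift in [("norovirus GI", 1.6), ("norovirus GII", 2.8), ("HAV", 3.5)]:
    print(f"{target}: Cq shift {shift} -> ~{fold_underestimation(shift):.1f}-fold signal loss")
```

A 3.5-cycle shift thus implies roughly an order-of-magnitude underestimate of target concentration, which is why eliminating inhibition matters for quantification.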


Development of a Cryptosporidium-arsenic multi-risk assessment model for infant formula prepared with tap water in France

Boue, G.; Wasiewska, L. A.; Cummins, E.; Antignac, J. P.; Bizec, B. le; Guillou, S.; Membre, J. M.

Food Research International, 108 558-570; 10.1016/j.foodres.2018.03.054 2018


Tap water is used in France to reconstitute powder infant formula, although it is not sterile and possibly contaminated by microbiological and chemical hazards. The present study aims to quantify risks of using tap water in France for the preparation of infant formula, during the first six months of life.

Cryptosporidium and arsenic were selected as hazards of greatest concern in microbiology and chemistry, respectively. A probabilistic model was developed using French (when available) and European (alternatively) data. Second order Monte Carlo simulation was used to separate uncertainty and variability of inputs. Outputs were expressed at the individual level as probability of illness and at the population level, using a common metric, the DALY (Disability Adjusted Life Year). Two scenarios of milk preparation were considered: with un-boiled or boiled tap water.

Consuming infant formula rehydrated with un-boiled tap water during the first six months of life led to a total of 2250 DALYs per 100,000 infants (90% uncertainty interval [960; 7650]) for Cryptosporidium due to diarrhea, and 1 DALY [0.4; 2] for arsenic due to the expected lifetime risk of lung and bladder cancer resulting from early-life exposure. For the entire population, boiling the water would eliminate the risk from Cryptosporidium. In contrast, the incremental cancer risk was low at the population level but elevated for the 5% of the population exposed to high levels of arsenic. Stringent monitoring of tap water supply points should be continued. This multi-risk assessment model could help public health authorities and managers evaluate both microbiological and chemical safety issues associated with infant formula prepared with tap water.
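Second-order Monte Carlo, as used above, separates what is uncertain (e.g. dose-response parameters) from what is variable (e.g. individual intake) by nesting two sampling loops. A self-contained sketch in Python; the exponential dose-response form and every parameter value below are illustrative stand-ins, not those of the study:

```python
import math
import random

def second_order_mc(n_unc=200, n_var=500, seed=1):
    """Outer loop: draws of *uncertain* parameters; inner loop: draws of
    *variable* individual exposures. Returns a 90% uncertainty interval
    on the population-mean risk of infection."""
    rng = random.Random(seed)
    mean_risks = []
    for _ in range(n_unc):
        # uncertainty: dose-response parameter of an exponential model
        r = rng.lognormvariate(math.log(0.05), 0.4)
        # variability: per-infant daily dose of oocysts (illustrative lognormal)
        risks = [1.0 - math.exp(-r * rng.lognormvariate(math.log(0.2), 1.0))
                 for _ in range(n_var)]
        mean_risks.append(sum(risks) / n_var)
    mean_risks.sort()
    return mean_risks[int(0.05 * n_unc)], mean_risks[int(0.95 * n_unc) - 1]

lo, hi = second_order_mc()
print(f"population-mean risk, 90% uncertainty interval: [{lo:.4f}, {hi:.4f}]")
```

The interval reported by the inner/outer split reflects parameter uncertainty only; the inner loop's spread (variability across infants) is averaged out, which is exactly the separation the study's method provides.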


Understanding human infectious Cryptosporidium risk in drinking water supply catchments

Swaffer, B.; Abbott, H.; King, B.; Linden, L. van der; Monis, P.

Water Research, 138 282-292; 10.1016/j.watres.2018.03.063 2018


Appropriate drinking water treatment depends, in part, on the robustness of source water quality risk assessments; however, quantifying the proportion of Cryptosporidium oocysts that are infectious and human-pathogenic remains a significant challenge. We analysed 962 source water samples across nine locations to profile the occurrence, rate and timing of infectious, human-pathogenic Cryptosporidium in surface waters entering drinking water reservoirs during rainfall-runoff conditions. At the catchment level, average infectivity over the four-year study period reached 18%; however, most locations averaged <5%. The maximum recorded infectious fraction within a single rainfall-runoff event was 65.4% and was dominated by C. parvum. Twenty-two Cryptosporidium species and genotypes were identified using PCR-based molecular techniques, the most common being C. parvum, detected in 23% of water samples. Associations of land-use and livestock stocking characteristics with Cryptosporidium were determined using a linear mixed-effects model. Pathogen concentrations in water were significantly influenced by flow and by the dominance of commercial grazing properties (as opposed to lifestyle properties) in the catchment (p < 0.01). Including measured infectivity and human-pathogenicity data in a quantitative microbial risk assessment (QMRA) could reduce source water treatment requirements by up to 2.67 log removal values, depending on the catchment, demonstrating the potential benefit of collating such data for QMRAs.
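The reduction in required treatment follows from a simple relation: if only a fraction f of enumerated oocysts is infectious and human-pathogenic, the log removal value (LRV) requirement drops by -log10(f) relative to assuming every oocyst poses a risk. A sketch; the fractions below are illustrative, though under this simplified relation a 2.67-LRV reduction like the one quoted would correspond to f of roughly 0.2%:

```python
import math

def lrv_reduction(infectious_fraction):
    """Drop in required log removal value when only a fraction of enumerated
    oocysts is infectious and human-pathogenic: delta_LRV = -log10(fraction)."""
    if not 0.0 < infectious_fraction <= 1.0:
        raise ValueError("fraction must be in (0, 1]")
    return -math.log10(infectious_fraction)

for f in (1.0, 0.18, 0.05, 0.0021):  # illustrative fractions
    print(f"infectious fraction {f:.2%}: LRV requirement reduced by {lrv_reduction(f):.2f}")
```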


Applicability of the direct injection liquid chromatographic tandem mass spectrometric analytical approach to the sub-ng L-1 determination of perfluoro-alkyl acids in waste, surface, ground and drinking water samples

Ciofi, Lorenzo; Renai, Lapo; Rossini, Daniele; Ancillotti, Claudia; Falai, Alida; Fibbi, Donatella; Bruzzoniti, Maria Concetta; Santana-Rodriguez, Jose Juan; Orlandini, Serena; Del Bubba, Massimo

TALANTA, 176 412-421; 10.1016/j.talanta.2017.08.052 JAN 1 2018

The applicability of a direct-injection UHPLC-MS/MS method for the analysis of several perfluoroalkyl acids (PFAAs) in a wide range of water matrices was investigated. The method is based on the direct injection of 100 µL of centrifuged water sample, without any other sample treatment. Very good method detection limits (0.014-0.44 µg L-1) and excellent intra- and inter-day precision (RSD% values in the ranges 1.8-4.4% and 2.7-5.7%, respectively) were achieved, with a total analysis time of 20 min per sample. A large number of samples, i.e. 8 drinking waters (DW), 12 ground waters (GW), 13 surface waters (SW), and 8 influents and 11 effluents of wastewater treatment plants (WWTPIN and WWTPOUT), were processed and the extent of the matrix effect (ME) was calculated, highlighting the strong prevalence of |ME| < 20%. The occurrence of |ME| > 50% was observed only occasionally, for perfluorooctanesulphonic and perfluorodecanoic acids. Linear discriminant analysis highlighted the large contribution of sample origin (i.e. DW, GW, SW, WWTPIN and WWTPOUT) to the ME. Partial least squares regression (PLS) and leave-one-out cross-validation were performed in order to interpret and predict the signal suppression or enhancement phenomena as a function of the physicochemical parameters of the water samples (i.e. conductivity, hardness and chemical oxygen demand) and the background chromatographic area. The PLS approach provided only an approximate screening, due to the low prediction power of the PLS models. However, for most analytes in most samples, the fitted and cross-validated values were such as to correctly distinguish between |ME| above and below the 20% limit. PFAAs in the aforementioned water samples were quantified by means of the standard addition method, highlighting their occurrence mainly in WWTP influents and effluents, at concentrations as high as one hundred µg L-1.
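Matrix effect is conventionally expressed as the percent signal change in matrix relative to pure solvent, so the 20% and 50% thresholds in the abstract map directly onto this quantity. A minimal sketch using the common area-ratio definition (the authors' exact formula may differ, and the peak areas below are invented):

```python
def matrix_effect_percent(area_matrix, area_solvent):
    """Signal suppression/enhancement: ME% = (A_matrix / A_solvent - 1) * 100.
    Negative values indicate suppression; |ME| < 20% is commonly treated
    as negligible."""
    return (area_matrix / area_solvent - 1.0) * 100.0

print(matrix_effect_percent(90_000, 100_000))   # mild suppression, within |ME| < 20%
print(matrix_effect_percent(45_000, 100_000))   # strong suppression, |ME| > 50%
```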

Climate change-induced increases in precipitation are reducing the potential for solar ultraviolet radiation to inactivate pathogens in surface waters

Williamson, Craig E.; Madronich, Sasha; Lal, Aparna; Zepp, Richard G.; Lucas, Robyn M.; Overholt, Erin P.; Rose, Kevin C.; Schladow, S. Geoffrey; Lee-Taylor, Julia

SCIENTIFIC REPORTS, 7; 10.1038/s41598-017-13392-2 OCT 12 2017

Climate change is accelerating the release of dissolved organic matter (DOM) to inland and coastal waters through increases in precipitation, thawing of permafrost, and changes in vegetation. Our modeling approach suggests that the selective absorption of ultraviolet (UV) radiation by DOM decreases the valuable ecosystem service whereby sunlight inactivates waterborne pathogens. Here we highlight the sensitivity of waterborne pathogens of humans and wildlife to solar UV, and use the DNA action spectrum to model how differences in water transparency and incident sunlight alter the ability of UV to inactivate waterborne pathogens. A case study demonstrates how heavy precipitation events can reduce the solar inactivation potential in Lake Michigan, which provides drinking water to over 10 million people. These results suggest that widespread increases in DOM and the consequent browning of surface waters reduce the potential for solar UV inactivation of pathogens and increase exposure to infectious diseases in humans and wildlife.
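The shielding mechanism described here, DOM absorbing UV before it reaches pathogens, can be illustrated with depth-averaged Beer-Lambert attenuation. A sketch; the attenuation coefficients are invented placeholders and the DNA action spectrum weighting used in the study is omitted:

```python
import math

def mean_uv_in_mixed_layer(surface_irradiance, kd, depth):
    """Depth-averaged UV irradiance over a mixed layer of given depth (m),
    from Beer-Lambert attenuation with diffuse attenuation coefficient kd (1/m):
    mean = I0 * (1 - exp(-kd * z)) / (kd * z). Browning (more DOM) raises kd."""
    return surface_irradiance * (1.0 - math.exp(-kd * depth)) / (kd * depth)

clear = mean_uv_in_mixed_layer(1.0, kd=0.5, depth=5.0)   # low-DOM water
brown = mean_uv_in_mixed_layer(1.0, kd=3.0, depth=5.0)   # after a hypothetical DOM pulse
print(clear, brown, clear / brown)  # browner water receives several-fold less UV
```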

Occurrence of illicit drugs in water and wastewater and their removal during wastewater treatment

Yadav, Meena K.; Short, M. D.; Aryal, Rupak; Gerber, C.; van der Akker, B.; Saint, C. P.

Water Research, 124 713-727; 10.1016/j.watres.2017.07.068 2017

This review critically evaluates the types and concentrations of key illicit drugs (cocaine, amphetamines, cannabinoids, opioids and their metabolites) found in wastewater, surface water and drinking water sources worldwide, and what is known about the effectiveness of wastewater treatment in removing such compounds. It is also important to amass information on trends in specific drug use as well as the sources of such compounds entering the environment, and we review current international knowledge on these topics. There are regional differences in the types and quantities of illicit drugs consumed, and this is reflected in the quantities detected in water. Generally, the levels of illicit drugs in wastewater effluents are lower than in raw influent, indicating that the majority of compounds can be at least partially removed by conventional treatment processes such as activated sludge or trickling filters. However, the literature also indicates that it is too simplistic to assume non-detection equates to drug removal and/or mitigation of associated risks, as there is evidence that some compounds may avoid detection through inadequate sampling and/or analysis protocols, or through conversion to transformation products. Partitioning of drugs from the water to the solids fraction (sludge/biosolids) may also simply shift the potential risk burden to a different environmental compartment, and the review found no information on drug stability and persistence in biosolids. Generally speaking, activated sludge-type processes appear to offer better removal efficacy across a range of substances, but the lack of detail in many studies makes it difficult to comment on the most effective process configurations and operations. There is also a paucity of information on the removal effectiveness of alternative treatment processes.
Research is also required on natural removal processes in both water and sediments that may over time facilitate further removal of these compounds in receiving environments.

Models for estimation of the presence of non-regulated disinfection by-products in small drinking water systems

Guilherme, Stephanie; Rodriguez, Manuel J.

ENVIRONMENTAL MONITORING AND ASSESSMENT, 189 (11); 10.1007/s10661-017-6296-5 NOV 2017

Among the organic disinfection by-products (DBPs), only trihalomethanes (THMs) and haloacetic acids (HAAs) are regulated in drinking water; most other DBPs are not. Very little information exists on the occurrence of non-regulated DBPs, particularly in small water systems (SWS). Paradoxically, SWS are more vulnerable to DBPs because of their limited capacity to implement adequate treatment technologies to remove DBP precursors. Since DBP analyses are expensive, SWS usually have difficulty implementing a rigorous characterization of these contaminants. The purpose of this study was to estimate non-regulated DBP levels in SWS from easy measurements of relevant, regularly monitored parameters. Since no information on non-regulated DBPs in SWS was available, a sampling program was carried out in 25 SWS in two provinces of Canada. Five DBP families were investigated: THMs, HAAs, haloacetonitriles (HANs), halonitromethanes (HNMs), and haloketones (HKs). Multivariate linear mixed regression models were developed to estimate HAN, HK, and HNM levels from water quality characteristics in the water treatment plant, concentrations of regulated DBPs, and residual disinfectant levels. The models obtained have good explanatory capacity, with R² varying from 0.77 to 0.91 according to compound and conditions of application (season and type of treatment). Model validation with an independent database suggested their ability to generalize to similar SWS in North America.

Children’s Lead Exposure: A Multimedia Modeling Analysis to Guide Public Health Decision-Making

Zartarian, Valerie; Xue, Jianping; Tornero-Velez, Rogelio; Brown, James

BACKGROUND: Drinking water and other sources of lead are the subject of public health concerns around the Flint, Michigan, drinking water and East Chicago, Indiana, lead-in-soil crises. In 2015, the U.S. Environmental Protection Agency (EPA)'s National Drinking Water Advisory Council (NDWAC) recommended establishment of a "health-based, household action level" for lead in drinking water based on children's exposure.

OBJECTIVES: The primary objective was to develop a coupled exposure-dose modeling approach that can be used to determine what drinking water lead concentrations keep children’s blood lead levels (BLLs) below specified values, considering exposures from water, soil, dust, food, and air. Related objectives were to evaluate the coupled model estimates using real-world blood lead data, to quantify relative contributions by the various media, and to identify key model inputs.

METHODS: A modeling approach using the EPA’s Stochastic Human Exposure and Dose Simulation (SHEDS)-Multimedia and Integrated Exposure Uptake and Biokinetic (IEUBK) models was developed using available data. This analysis for the U.S. population of young children probabilistically simulated multimedia exposures and estimated relative contributions of media to BLLs across all population percentiles for several age groups.

RESULTS: Modeled BLLs compared well with nationally representative BLLs (0-23% relative error). Analyses revealed the relative importance of the soil and dust ingestion exposure pathways and associated Pb intake rates; water ingestion was also a main pathway, especially for infants.

CONCLUSIONS: This methodology advances scientific understanding of the relationship between lead concentrations in drinking water and BLLs in children. It can guide national health-based benchmarks for lead and related community public health decisions. https://doi.org/10.1289/EHP1605.
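At its simplest, the relative-contribution analysis above reduces to normalizing per-medium intake estimates to shares of total intake; the full SHEDS-Multimedia/IEUBK coupling is of course far richer. A sketch with invented intake values, purely for illustration:

```python
def media_contributions(intakes):
    """Relative contribution of each medium to total daily Pb intake.
    `intakes` maps medium name -> intake (e.g. µg/day); returns fractions."""
    total = sum(intakes.values())
    return {medium: value / total for medium, value in intakes.items()}

# hypothetical daily intakes (µg/day) for a young child, NOT the study's estimates
daily = {"soil/dust": 3.0, "water": 1.2, "food": 0.8, "air": 0.05}
shares = media_contributions(daily)
for medium, share in shares.items():
    print(f"{medium}: {share:.1%} of total intake")
```

In a probabilistic analysis like the study's, such shares would be computed per simulated child and then summarized across population percentiles.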

Strategies to Improve Private-Well Water Quality: A North Carolina Perspective

Gibson, Jacqueline MacDonald; Pieper, Kelsey J.

BACKGROUND: Evidence suggests that the 44.5 million U.S. residents drawing their drinking water from private wells face higher risks of waterborne contaminant exposure than those served by regulated community water supplies. Among U.S. states, North Carolina (N.C.) has the second-largest population relying on private wells, making it a useful microcosm for studying the challenges of maintaining private-well water quality.

OBJECTIVES: This paper summarizes recommendations from a two-day summit to identify options to improve drinking-water quality for N.C. residents served by private wells.

METHODS: The Research Triangle Environmental Health Collaborative invited 111 participants with knowledge of private-well water challenges to attend the Summit. Participants worked in small groups that focused on specific aspects and reconvened in plenary sessions to formulate consensus recommendations.

DISCUSSION: Summit participants highlighted four main barriers to ensuring safe water for residents currently relying on private wells: (1) a database of private well locations is unavailable; (2) racial disparities have perpetuated reliance on private wells in some urbanized areas; (3) many private well users lack information or resources to monitor and maintain their wells; and (4) private-well support programs are fragmented and lack sufficient resources. The Summit produced 10 consensus recommendations for ways to overcome these barriers.

CONCLUSIONS: The Summit recommendations, if undertaken, could improve the health of North Carolinians facing elevated risks of exposure to waterborne contaminants because of their reliance on inadequately monitored and maintained private wells. Because many of the challenges in N.C. are common nationwide, these recommendations could serve as models for other states.

Still Treating Lead Poisoning After All These Years

Twenty-five years ago, in a commentary published in Pediatrics, Drs Needleman and Jackson1 asked whether we would still be treating lead poisoning in the 21st century. Unfortunately, despite considerable progress, our public health system is still failing to prevent children from being lead-poisoned, and the specter of lead poisoning continues to cast a shadow over the country: over 500,000 American children have a blood lead level of >5 μg/dL (>50 ppb); 23 million homes have one or more lead hazards; an unknown number of Americans drink water from lead service lines; and federal standards for lead in house dust, soil, and water fail to protect children. We have understandably focused on the plight of children in Flint, Michigan, but children in hundreds of other cities have blood lead levels higher than the children of Flint.



Surveillance for Waterborne Disease Outbreaks Associated with Drinking Water — United States, 2013–2014

Benedict KM, Reses H, Vigar M, et al. Surveillance for Waterborne Disease Outbreaks Associated with Drinking Water — United States, 2013–2014. MMWR Morb Mortal Wkly Rep 2017;66:1216–1221. DOI: http://dx.doi.org/10.15585/mmwr.mm6644a3

Provision of safe water in the United States is vital to protecting public health (1). Public health agencies in the U.S. states and territories report information on waterborne disease outbreaks to CDC through the National Outbreak Reporting System (NORS) (https://www.cdc.gov/healthywater/surveillance/index.html). During 2013–2014, 42 drinking water–associated outbreaks were reported, accounting for at least 1,006 cases of illness, 124 hospitalizations, and 13 deaths. Legionella was associated with 57% of these outbreaks and all of the deaths. Sixty-nine percent of the reported illnesses occurred in four outbreaks in which the etiology was determined to be either a chemical or toxin or the parasite Cryptosporidium. Drinking water contamination events can cause disruptions in water service, large impacts on public health, and persistent community concern about drinking water quality. Effective water treatment and regulations can protect public drinking water supplies in the United States, and rapid detection, identification of the cause, and response to illness reports can reduce the transmission of infectious pathogens and harmful chemicals and toxins.


Significant racial, ethnic, income disparities in hydration found among U.S. adults

Nearly a third of U.S. adults are not hydrated enough, and poorer adults as well as Black and Hispanic adults are at higher risk for poor hydration than wealthier and white adults, according to a new study from Harvard T.H. Chan School of Public Health.

Lack of access to clean, safe drinking water—as highlighted by recent water crises in communities such as Flint, Michigan—may be one of the main reasons for the disparities, the authors suggested.

The study appeared online July 20, 2017 in the American Journal of Public Health.

Effective Immediately: Healthcare Facilities Required to Reduce Legionellosis Risks from Tap Water

Published July 2017

By Kelly A. Reynolds, MSPH, PhD

If you follow On Tap frequently, you know that the bacterium Legionella has been a repeated topic in recent years. Once again, Legionella is at the forefront of discussions due to continuing waterborne outbreaks and new directives for prevention in healthcare facilities. On June 2, the Department of Health and Human Services' Centers for Medicare and Medicaid Services (CMS) issued a memo that will undoubtedly expand awareness of Legionella risks and further drive the implementation of preventative approaches.

Nationwide reconnaissance of contaminants of emerging concern in source and treated drinking waters of the United States

Glassmeyer, S.T., et al., Science of the Total Environment, 581-582:909-922, March 2017

When chemical or microbial contaminants are assessed for potential effect or possible regulation in ambient and drinking waters, a critical first step is determining if the contaminants occur and if they are at concentrations that may cause human or ecological health concerns. To this end, source and treated drinking water samples from 29 drinking water treatment plants (DWTPs) were analyzed as part of a two-phase study to determine whether chemical and microbial constituents, many of which are considered contaminants of emerging concern, were detectable in the waters. Of the 84 chemicals monitored in the 9 Phase I DWTPs, 27 were detected at least once in the source water, and 21 were detected at least once in treated drinking water. In Phase II, which was a broader and more comprehensive assessment, 247 chemical and microbial analytes were measured in 25 DWTPs, with 148 detected at least once in the source water, and 121 detected at least once in the treated drinking water. The frequency of detection was often related to the analyte’s contaminant class, as pharmaceuticals and anthropogenic waste indicators tended to be infrequently detected and more easily removed during treatment, while per- and polyfluoroalkyl substances and inorganic constituents were both more frequently detected and, overall, more resistant to treatment. The data collected as part of this project will be used to help inform evaluation of unregulated contaminants in surface water, groundwater, and drinking water.

Characterizing pharmaceutical, personal care product, and hormone contamination in a karst aquifer of southwestern Illinois, USA, using water quality and stream flow parameters

Dodgen, L.K., et al., Science of the Total Environment, 578:281-289, February 2017

Karst aquifers are drinking water sources for 25% of the global population. However, the unique geology of karst areas facilitates rapid transfer of surficial chemicals to groundwater, potentially contaminating drinking water. Contamination of karst aquifers by nitrate, chloride, and bacteria has been previously observed, but little is known about the presence of contaminants of emerging concern (CECs), such as pharmaceuticals. Over a 17-month period, 58 water samples were collected from 13 sites in the Salem Plateau, a karst region in southwestern Illinois, United States. Water was analyzed for 12 pharmaceutical and personal care products (PPCPs), 7 natural and synthetic hormones, and 49 typical water quality parameters (e.g., nutrients and bacteria). Hormones were detected in only 23% of samples, with concentrations of 2.2–9.1 ng/L. In contrast, PPCPs were quantified in 89% of groundwater samples. The two most commonly detected PPCPs were the antimicrobial triclocarban, in 81% of samples, and the cardiovascular drug gemfibrozil, in 57%. Analytical results were combined with data on local stream flow, weather, and land use to 1) characterize the extent of aquifer contamination by CECs, 2) cluster sites with similar PPCP contamination profiles, and 3) develop models to describe PPCP contamination. Median detection in karst groundwater was 3 PPCPs at a summed concentration of 4.6 ng/L. Sites clustered into 3 subsets with unique contamination models. PPCP contamination at Cluster I sites was related to stream height, manganese, boron, and heterotrophic bacteria. Cluster II sites were characterized by groundwater temperature, specific conductivity, sodium, and calcium. Cluster III sites were characterized by dissolved oxygen and barium. Across all sites, no single or small set of water quality factors was significantly predictive of PPCP contamination, although gemfibrozil concentrations were strongly related to the sum of PPCPs in karst groundwater.
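The site-clustering step described above can be illustrated with a plain k-means over per-site contamination features. The features, data, and fixed initial centroids below are assumptions for the example, not the study's variables:

```python
# Illustrative sketch: cluster sites by a PPCP contamination profile.
# Features per site are hypothetical (number of PPCPs detected, summed ng/L);
# initial centroids are fixed so the run is deterministic.

def kmeans(points, centroids, iters=20):
    """Lloyd's algorithm with fixed initial centroids."""
    for _ in range(iters):
        groups = [[] for _ in centroids]
        for p in points:
            # assign each site to the nearest centroid (squared Euclidean distance)
            j = min(range(len(centroids)),
                    key=lambda k: sum((a - b) ** 2 for a, b in zip(p, centroids[k])))
            groups[j].append(p)
        # move each centroid to the mean of its assigned sites
        centroids = [tuple(sum(v) / len(g) for v in zip(*g)) if g else c
                     for g, c in zip(groups, centroids)]
    return centroids, groups

# (n PPCPs detected, summed concentration ng/L) per site -- hypothetical values
sites = [(1, 1.2), (2, 3.0), (3, 4.6), (6, 20.0), (7, 25.0)]
centroids, groups = kmeans(sites, [(2.0, 3.0), (6.0, 20.0)])
```

With these made-up sites the low-contamination and high-contamination groups separate immediately; the study's actual clustering used many more water quality variables per site.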

A decision analysis framework for estimating the potential hazards for drinking water resources of chemicals used in hydraulic fracturing fluids

Yost, E.E., Science of the Total Environment, 574:1544-1558, January 2017

Despite growing concerns over the potential for hydraulic fracturing to impact drinking water resources, there are limited data available to identify chemicals used in hydraulic fracturing fluids that may pose public health concerns. In an effort to explore these potential hazards, a multi-criteria decision analysis (MCDA) framework was employed to analyze and rank selected subsets of these chemicals by integrating data on toxicity, frequency of use, and physicochemical properties that describe transport in water. Data used in this analysis were obtained from publicly available databases compiled by the United States Environmental Protection Agency (EPA) as part of a larger study on the potential impacts of hydraulic fracturing on drinking water. Starting with nationwide hydraulic fracturing chemical usage data from EPA’s analysis of the FracFocus Chemical Disclosure Registry 1.0, MCDAs were performed on chemicals that had either noncancer toxicity values (n = 37) or cancer-specific toxicity values (n = 10). The noncancer MCDA was then repeated for subsets of chemicals reported in three representative states (Texas, n = 31; Pennsylvania, n = 18; and North Dakota, n = 20). Within each MCDA, chemicals received scores based on relative toxicity, relative frequency of use, and physicochemical properties (mobility in water, volatility, persistence). Results show a relative ranking of these chemicals based on hazard potential, and provide preliminary insight into chemicals that may be more likely than others to impact drinking water resources. Comparison of nationwide versus state-specific analyses indicates regional differences in the chemicals that may be of more concern to drinking water resources, although many chemicals were commonly used and received similar overall hazard rankings. Several chemicals highlighted by these MCDAs have been reported in groundwater near areas of hydraulic fracturing activity. 
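The ranking mechanics behind an MCDA of this kind can be sketched in a few lines: each chemical receives a normalized score per criterion, and the criterion scores are combined into an overall hazard ranking. The chemicals, criterion values, and equal weighting below are illustrative assumptions, not values from the EPA dataset:

```python
# Minimal MCDA sketch: min-max normalize each criterion so criteria are
# comparable, sum the normalized scores, and rank. All inputs are placeholders.

def normalize(values):
    """Scale raw criterion values to the 0-1 range (higher = more hazardous)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def mcda_rank(chemicals, criteria):
    """criteria maps a criterion name to one raw value per chemical."""
    scores = [0.0] * len(chemicals)
    for raw in criteria.values():
        for i, s in enumerate(normalize(raw)):
            scores[i] += s  # equal criterion weights, an assumption of this sketch
    return sorted(zip(chemicals, scores), key=lambda x: -x[1])

chemicals = ["chem_A", "chem_B", "chem_C"]
criteria = {
    "toxicity":  [10.0, 2.0, 5.0],  # e.g. inverse of a reference dose
    "frequency": [0.8, 0.9, 0.1],   # fraction of disclosures using the chemical
    "mobility":  [0.7, 0.3, 0.9],   # normalized transport-in-water score
}
ranking = mcda_rank(chemicals, criteria)
```

A weighted variant (multiplying each normalized score by a criterion weight before summing) reproduces the common generalization; the EPA analysis also treated volatility and persistence as separate criteria.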
This approach is intended as a preliminary analysis, and represents one possible method for integrating data to explore potential public health impacts.

A national reconnaissance of trace organic compounds (TOCs) in United States lotic ecosystems

Bernot, M.J., et al., Science of the Total Environment, 572:422-433, December 2016

We collaborated with 26 groups from universities across the United States to sample 42 sites for 33 trace organic compounds (TOCs) in water and sediments of lotic ecosystems. Our goals were 1) to further develop a national database of TOC abundance in United States lotic ecosystems that can be a foundation for future research and management, and 2) to identify factors related to compound abundance. Trace organic compounds were found in 93% of water samples and 56% of sediment samples. Dissolved concentrations were 10–1000× higher than sediment concentrations. The ten most common compounds in water samples, with detection frequency and maximum concentration, were sucralose (87.5%, 12,000 ng/L), caffeine (77.5%, 420 ng/L), sulfamethoxazole (70%, 340 ng/L), cotinine (65%, 130 ng/L), venlafaxine (65%, 1800 ng/L), carbamazepine (62.5%, 320 ng/L), triclosan (55%, 6800 ng/L), azithromycin (15%, 970 ng/L), diphenhydramine (40%, 350 ng/L), and desvenlafaxine (35%, 4600 ng/L). In sediment, the most common compounds were venlafaxine (32.5%, 19 ng/g), diphenhydramine (25%, 41 ng/g), azithromycin (15%, 11 ng/g), fluoxetine (12.5%, 29 ng/g) and sucralose (12.5%, 16 ng/g). Refractory compounds such as sucralose may be good indicators of TOC contamination in lotic ecosystems, as there was a correlation between dissolved sucralose concentrations and the total number of compounds detected in water. Discharge and human demographic (population size) characteristics were not good predictors of compound abundance in water samples. This study further confirms the ubiquity of TOCs in lotic ecosystems. Although measured concentrations rarely approached acute aquatic-life criteria, the chronic effects, bioaccumulative potential, and potential mixture effects of multiple compounds are relatively unknown.

Atrazine in Kentucky drinking water: intermethod comparison of U.S. Environmental Protection Agency analytical methods 507 and 508.1

Suhl, J., et al., Journal of Environmental Health, 79(5):E1-E6, December 2016

This study examines the analytical methods used to test drinking water for atrazine, along with the seasonal variation of atrazine in drinking water. Samples from 117 counties throughout Kentucky from January 2000 to December 2008 were analyzed. Methods 507 and 508.1 were compared using the Mann-Whitney U test; median values of these methods were similar (p = .7421). To examine seasonal variation, data from each year and from the entire period were analyzed using one-way ANOVA; pairwise multiple comparisons were made for years with significant differences. All years except 2001, 2005, 2006, and 2007 had significantly different atrazine concentrations between seasons. The Seasonal Kendall Test for Trend was used to identify trends in atrazine over time. Yearly means ranged from 0.000043 mg/L (± 0.000011 mg/L) to 0.000995 mg/L (± 0.000510 mg/L). The highest levels were observed during spring in most years. A significant (p = .000092) decreasing trend of -7.6 × 10^-6 mg/L/year was found, and decreasing trends were also present in all five regions of the state during this period. This study illustrates the need for changes in current sampling methodology so that effective assessments of the public’s exposure to atrazine in drinking water can be conducted.
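The trend analysis described here rests on rank-based statistics; a simplified (non-seasonal) Mann-Kendall S statistic and a Theil-Sen slope can be sketched with the standard library alone. The yearly means below are illustrative placeholders spanning roughly the reported range, not the Kentucky data:

```python
from itertools import combinations

def mann_kendall_s(series):
    """Mann-Kendall S statistic: count of increasing minus decreasing pairs.
    Negative S indicates a downward trend (here, declining atrazine)."""
    sign = lambda x: (x > 0) - (x < 0)
    return sum(sign(b - a) for a, b in combinations(series, 2))

def theil_sen_slope(series):
    """Median of all pairwise slopes: a robust trend estimate per time step."""
    slopes = sorted((b - a) / (j - i)
                    for (i, a), (j, b) in combinations(enumerate(series), 2))
    n = len(slopes)
    mid = n // 2
    return slopes[mid] if n % 2 else (slopes[mid - 1] + slopes[mid]) / 2

# Illustrative yearly mean atrazine concentrations (mg/L), not the study data
yearly_means = [9.9e-4, 8.0e-4, 6.5e-4, 5.1e-4, 4.0e-4,
                3.2e-4, 2.4e-4, 1.5e-4, 4.3e-5]
s_stat = mann_kendall_s(yearly_means)
slope = theil_sen_slope(yearly_means)
```

The seasonal variant used in the study computes S within each season and sums across seasons before testing significance; the sketch above omits that blocking.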

Widespread copper and lead contamination of household drinking water, New South Wales, Australia

Harvey, P.J., et al., Environmental Research, 151:275-285, November 2016

This study examines arsenic, copper, lead and manganese drinking water contamination at the domestic consumer’s kitchen tap in homes of New South Wales, Australia. Analysis of 212 first-draw drinking water samples shows that almost 100% and 56% of samples contain detectable concentrations of copper and lead, respectively. Of these detectable concentrations, copper exceeds the Australian Drinking Water Guidelines (ADWG) in 5% of samples and lead in 8%. By contrast, no samples contained arsenic or manganese concentrations in excess of the ADWG. Analysis of household plumbing fittings (taps and connecting pipework) shows that these are a significant source of drinking water lead contamination. Water lead concentrations derived from plumbing components range from 108 µg/L to 1440 µg/L (n=28, mean 328 µg/L, median 225 µg/L). Analysis of kitchen tap fittings demonstrates that these are a primary source of drinking water lead contamination (n=9, mean 63.4 µg/L, median 59.0 µg/L). The results of this study demonstrate that, along with other potential sources of contamination in households, plumbing products containing up to 2.84% lead are contributing to contamination of household drinking water. Given that both copper and lead are known to cause significant health detriments, products for use in contact with drinking water should be manufactured free of copper and lead.

Pb-Sr isotopic and geochemical constraints on sources and processes of lead contamination in well waters and soil from former fruit orchards, Pennsylvania, USA: A legacy of anthropogenic activities

Ayuso, R.A., and Foley, N.K., Journal of Geochemical Exploration, 170:125-147, November 2016

Isotopic discrimination can be an effective tool in establishing a direct link between sources of Pb contamination and the presence of anomalously high concentrations of Pb in waters, soils, and organisms. Residential wells supplying water containing up to 1600 ppb Pb to houses built on the former Mohr orchards commercial site, near Allentown, PA, were evaluated to discern anthropogenic from geogenic sources. Pb (n = 144) and Sr (n = 40) isotopic data and REE (n = 29) data were determined for waters from residential wells, test wells (drilled for this study), and surface waters from pond and creeks. Local soils, sediments, bedrock, Zn-Pb mineralization and coal were also analyzed (n = 94), together with locally used Pb-As pesticides (n = 5). Waters from residential and test wells show overlapping values of 206Pb/207Pb, 208Pb/207Pb and 87Sr/86Sr. Larger negative Ce anomalies (Ce/Ce*) distinguish residential wells from test wells. Results show that residential and test well waters, sediments from residential water filters in water tanks, and surface waters display broad linear trends in Pb isotope plots. Pb isotope data for soils, bedrock, and pesticides have contrasting ranges and overlapping trends. Contributions of Pb from soils to residential well waters are limited and implicated primarily in wells having shallow water-bearing zones and carrying high sediment contents. Pb isotope data for residential wells, test wells, and surface waters show substantial overlap with Pb data reflecting anthropogenic actions (e.g., burning fossil fuels, industrial and urban processing activities). Limited contributions of Pb from bedrock, soils, and pesticides are evident. High Pb concentrations in the residential waters are likely related to sediment buildup in residential water tanks.
Redox reactions, triggered by influx of groundwater via wells into the residential water systems and leading to subtle changes in pH, are implicated in precipitation of Fe oxyhydroxides, oxidative scavenging of Ce(IV), and desorption and release of Pb into the residential water systems. The Pb isotope features in the residences and the region are best interpreted as reflecting a legacy of industrial Pb present in underlying aquifers that currently supply the drinking water wells.
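The source-attribution logic of Pb isotope studies like this one often reduces to two-endmember mixing: a sample's 206Pb/207Pb ratio is read as a blend of an anthropogenic and a geogenic endmember. A minimal sketch, with illustrative ratios rather than the study's values:

```python
# Two-endmember isotope mixing sketch. Valid as written only when the Pb
# concentrations of both endmembers are comparable; a full treatment weights
# the ratios by concentration. All ratios below are illustrative.

def mixing_fraction(r_sample, r_end_a, r_end_b):
    """Fraction of endmember A implied by a measured isotope ratio."""
    return (r_sample - r_end_b) / (r_end_a - r_end_b)

# Hypothetical 206Pb/207Pb endmember ratios, not values from the study
anthropogenic = 1.18  # industrial / gasoline-era Pb
geogenic = 1.22       # local bedrock
f_anthro = mixing_fraction(1.19, anthropogenic, geogenic)
```

A sample ratio close to the anthropogenic endmember thus yields a mixing fraction near 1, mirroring the paper's conclusion that the well-water Pb largely reflects legacy industrial sources.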

Malodorous volatile organic sulfur compounds: Sources, sinks and significance in inland waters

Watson, S.B., and Jüttner, F., Critical Reviews in Microbiology, 43(2):210-237, November 2016

Volatile Organic Sulfur Compounds (VOSCs) are instrumental in global S-cycling and greenhouse gas production. VOSCs occur across a diversity of inland waters and, with widespread eutrophication and climate change, are increasingly linked with malodours in organic-rich waterbodies and drinking-water supplies. Compared with marine systems, the role of VOSCs in biogeochemical processes is far less well characterized for inland waters, and often involves different physicochemical and biological processes. This review provides an updated synthesis of VOSCs in inland waters, focusing on compounds known to cause malodours. We examine the major limnological and biochemical processes involved in the formation and degradation of alkylthiols, dialkylsulfides, dialkylpolysulfides, and other organosulfur compounds under different oxygen, salinity and mixing regimes, and the key phototrophic and heterotrophic microbial producers and degraders (bacteria, cyanobacteria, and algae) in these environs. The data show VOSC levels that vary significantly, sometimes far exceeding human odor thresholds, generated by a diversity of biota, biochemical pathways, enzymes and precursors. We also draw attention to major issues with sampling and analytical artifacts that bias and preclude comparisons among studies, and highlight significant knowledge gaps that need addressing with careful, appropriate methods to provide a more robust understanding of the potential effects of continued global development.

Occurrence of DBPs in Drinking Water of European Regions for Epidemiology Studies

Krasner, S.J., et al., American Water Works Association Journal, 108(10):501-512, October 2016

A three-year study was conducted on the occurrence of disinfection by-products (DBPs) – trihalomethanes (THMs), haloacetic acids (HAAs), and haloacetonitriles – in drinking water of regions of Europe where epidemiology studies were being carried out. Thirteen systems in six countries (Italy, France, Greece, Lithuania, Spain, United Kingdom) were sampled. Typically, chlorinated DBPs dominated; however, in most of Catalonia (Spain) and in Heraklion (Greece), brominated DBPs dominated. The degree of bromine incorporation was in general similar among the DBP classes. This is important, as brominated DBPs are a greater health concern. In parts of Catalonia, the reported levels of tribromoacetic acid were higher than in other parts of the world. In some regions, HAA levels tended to peak in a different time period than THM levels. In most epidemiology studies, THMs are used as a surrogate for other halogenated DBPs. This study provides exposure assessment information for epidemiology studies.
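The "degree of bromine incorporation" the study compares is commonly expressed as a mole-weighted mean number of bromine atoms across the four THM species (0 for pure chloroform, 3 for pure bromoform). A sketch with hypothetical speciation data:

```python
# Bromine incorporation factor sketch for THM4. Input maps the number of Br
# atoms in each species (0 = chloroform ... 3 = bromoform) to its molar
# concentration. The concentrations below are illustrative, not study data.

def bromine_incorporation(molar_conc_by_br_count):
    """Mole-weighted mean number of Br atoms across the THM species."""
    total = sum(molar_conc_by_br_count.values())
    return sum(n * c for n, c in molar_conc_by_br_count.items()) / total

# Mostly chlorinated speciation (mol/L), as in most of the sampled systems
chlorinated_dominated = bromine_incorporation({0: 8e-8, 1: 1.5e-8, 2: 4e-9, 3: 1e-9})
# Mostly brominated speciation, as reported for Catalonia and Heraklion
brominated_dominated = bromine_incorporation({0: 1e-9, 1: 4e-9, 2: 1.5e-8, 3: 8e-8})
```

Values near 0 indicate chlorine-dominated speciation and values near 3 bromine-dominated speciation, which is the contrast the abstract draws between regions.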

Origin of Hexavalent Chromium in Drinking Water Wells from the Piedmont Aquifers of North Carolina

Vengosh, A., et al., Environmental Science & Technology Letters, 3(12):409-414, October 2016

Hexavalent chromium [Cr(VI)] is a known pulmonary carcinogen. Recent detection of Cr(VI) in drinking water wells in North Carolina has raised public concern about contamination of drinking water wells by nearby coal ash ponds. Here we report, for the first time, the prevalence of Cr and Cr(VI) in drinking water wells from the Piedmont region of central North Carolina, combined with a geochemical analysis to determine the source of the elevated Cr(VI) levels. We show that Cr(VI) is the predominant species of dissolved Cr in groundwater and that elevated levels of Cr and Cr(VI) are found in wells located both near and far (>30 km) from coal ash ponds. The geochemical characteristics, including the overall chemistry, boron to chromium ratios, and strontium isotope (87Sr/86Sr) variations in groundwater with elevated Cr(VI) levels, are different from those of coal ash leachates. Instead, the groundwater chemistry and Sr isotope variations are consistent with water–rock interactions as the major source of Cr(VI) in groundwater. Our results indicate that Cr(VI) is most likely naturally occurring and ubiquitous in groundwater from the Piedmont region in the eastern United States, which could pose health risks to residents in the region who consume well water as a major drinking water source.

The precautionary principle and chemicals management: The example of perfluoroalkyl acids in groundwater

Cousins, I.T., et al., Environment International, 94:331-340, September 2016

As early as the late 1990s, microgram-per-liter levels of perfluorooctane sulfonate (PFOS) were measured in water samples from areas where fire-fighting foams were used or spilled. Despite these early warnings, the problems of groundwater, and thus drinking water, contaminated with perfluoroalkyl and polyfluoroalkyl substances (PFASs), including PFOS, are only beginning to be addressed. It is clear that this PFAS contamination is poorly reversible and that the societal costs of clean-up will be high. This inability to reverse exposure in a reasonable timeframe is a major motivation for application of the precautionary principle in chemicals management. We conclude that exposure can be poorly reversible 1) due to slow elimination kinetics in organisms, or 2) due to poorly reversible environmental contamination that leads to continuous exposure. In the second case, which is relevant for contaminated groundwater, the reversibility of exposure is not related to the magnitude of a chemical’s bioaccumulation potential. We argue therefore that all PFASs entering groundwater, irrespective of their perfluoroalkyl chain length and bioaccumulation potential, will result in poorly reversible exposures and risks, as well as further clean-up costs for society. To protect groundwater resources for future generations, society should consider a precautionary approach to chemicals management and prevent the use and release of highly persistent and mobile chemicals such as PFASs.

Drinking water lead regulations: impact on the brass value chain

Estelle, A.A., Materials Science and Technology, 32(17):1763-1770, August 2016

A detailed review of regulations restricting the use of lead in potable water systems is provided for several regions, including the United States (U.S.), Canada, the European Union (E.U.) and Japan, to assess the impact on the brass value chain. Covered topics include the chronology of regulations, governing bodies, compliance requirements, enforcement mechanisms and other aspects relevant to metal suppliers, original equipment manufacturers, designers, specifiers, end-users and recyclers of brass. The development and use of lead-free brass alloys, and how these materials have impacted manufacturing and recycling processes, are also addressed.

Temporal variation in groundwater quality in the Permian Basin of Texas, a region of increasing unconventional oil and gas development

Hildenbrand, Z.L., et al., Science of the Total Environment, 562:906-913, August 2016

The recent expansion of natural gas and oil extraction using unconventional oil and gas development (UD) practices such as horizontal drilling and hydraulic fracturing has raised questions about the potential for environmental impacts. Prior research has focused on evaluations of air and water quality in particular regions without explicitly considering temporal variation; thus, little is known about the potential effects of UD activity on the environment over longer periods of time. Here, we present an assessment of private well water quality in an area of increasing UD activity over a period of 13 months. We analyzed samples from 42 private water wells located in three contiguous counties on the Eastern Shelf of the Permian Basin in Texas. This area has experienced a rise in UD activity in the last few years, and we analyzed samples at four separate time points to assess variation in groundwater quality over time as UD activities increased. We monitored general water quality parameters as well as several compounds used in UD activities. We found that some constituents remained stable over time, but others experienced significant variation over the period of study. Notable findings include significant changes in total organic carbon and pH, along with ephemeral detections of ethanol, bromide, and dichloromethane after the initial sampling phase. These data provide insight into the potentially transient nature of compounds associated with groundwater contamination in areas experiencing UD activity.

Detection of Poly- and Perfluoroalkyl Substances (PFASs) in U.S. Drinking Water Linked to Industrial Sites, Military Fire Training Areas, and Wastewater Treatment Plants

Andrews, D.Q., et al., Environmental Science & Technology Letters, August 2016

Drinking water contamination with poly- and perfluoroalkyl substances (PFASs) poses risks to the developmental, immune, metabolic, and endocrine health of consumers. We present a spatial analysis of 2013–2015 national drinking water PFAS concentrations from the U.S. Environmental Protection Agency’s (US EPA) third Unregulated Contaminant Monitoring Rule (UCMR3) program. The number of industrial sites that manufacture or use these compounds, the number of military fire training areas, and the number of wastewater treatment plants are all significant predictors of PFAS detection frequencies and concentrations in public water supplies. Among samples with detectable PFAS levels, each additional military site within a watershed’s eight-digit hydrologic unit is associated with a 20% increase in PFHxS, a 10% increase in both PFHpA and PFOA, and a 35% increase in PFOS. The number of civilian airports with personnel trained in the use of aqueous film-forming foams is significantly associated with the detection of PFASs above the minimal reporting level. We find that drinking water supplies for 6 million U.S. residents exceed US EPA’s lifetime health advisory (70 ng/L) for PFOS and PFOA. Lower analytical reporting limits and additional sampling of smaller utilities serving <10,000 individuals, as well as of private wells, would greatly assist in further identifying PFAS contamination sources.
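Percentage effects of this kind come from a regression on log-transformed concentrations, so each additional source site acts multiplicatively on the expected concentration. A sketch of that interpretation, with an assumed (not reported) baseline:

```python
# Multiplicative interpretation of a log-linear regression coefficient:
# a 35% increase per site means each additional site multiplies the expected
# concentration by 1.35. The baseline concentration below is a hypothetical
# placeholder; the per-site percentage is the one quoted in the abstract.

def expected_multiplier(pct_increase_per_site, n_sites):
    """Expected concentration multiplier for n additional source sites."""
    return (1 + pct_increase_per_site) ** n_sites

baseline_pfos = 10.0  # ng/L, hypothetical watershed with no military sites
pfos_2_sites = baseline_pfos * expected_multiplier(0.35, 2)
```

With zero additional sites the multiplier is 1 (no change), and the effect compounds rather than adds as sites accumulate within a hydrologic unit.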

Cyto- and genotoxic profile of groundwater used as drinking water supply before and after disinfection

Pellacani, C., et al., Journal of Water and Health, 14(6):901-913, July 2016

The assessment of the toxicological properties of raw groundwater may be useful to predict the type and quality of tap water. Contaminants in groundwater are known to affect the disinfection process, resulting in the formation of substances that are cytotoxic and/or genotoxic. Though the European directive 98/83/EC, which establishes maximum levels for contaminants in raw water (RW), provides threshold levels for acute exposure to toxic compounds, the law does not take into account chronic exposure to low doses of pollutants present in complex mixtures. The purpose of this study was to evaluate the cyto- and genotoxic load in groundwater of two water treatment plants in Northern Italy. Water samples induced cytotoxic effects, mainly observed when human cells were treated with RW. Moreover, results indicated that the disinfection process reduced cell toxicity, independent of the biocide used. Genotoxic effects were found, in particular, when the micronucleus assay was carried out on raw groundwater. These results suggest that it is important to include bio-toxicological assays as additional parameters in water quality monitoring programs, as their use would allow the evaluation of the potential risk of groundwater to humans.

Emerging contaminant uncertainties and policy: The chicken or the egg conundrum

Naidu, R., et al., Chemosphere, 154:385-390, July 2016

Best practice in regulating contaminants of emerging concern (CECs) must involve the integration of science and policy, and be defensible and accepted by diverse stakeholders. Key elements of CEC frameworks include identification and prioritisation of emerging contaminants; evaluation of health and environmental impacts from key matrices such as soil, groundwater, surface waters and sediment; assessments of available data, methods and technologies (and their limitations); and mechanisms to take cognisance of diverse interests. This paper discusses one of the few frameworks designed for emerging contaminants, the Minnesota Department of Health (MDH) Drinking Water Contaminants of Emerging Concern (CEC) program. Further review of mechanisms for CECs in other jurisdictions reveals only a small number of regulatory and guidance regimes globally. There is also merit in a formal mechanism for the global exchange of knowledge and outcomes associated with CECs of global interest.

Emerging contaminants in the environment: Risk-based analysis for better management.

Naidu, R., et al., Chemosphere, 154:350-357, July 2016

Emerging contaminants (ECs) are chemicals of synthetic origin, or derived from natural sources, that have only recently been discovered and for which environmental or public health risks are yet to be established. This is due to limited available information on their interactions and toxicological impacts on receptors. Several types of ECs exist, such as antibiotics, pesticides, pharmaceuticals, personal care products, effluents, certain naturally occurring contaminants and, more recently, nanomaterials. ECs may derive from a known source, for example released directly to the aquatic environment in discharges such as those from wastewater treatment plants. Although in most instances the direct source cannot be identified, ECs have been detected in virtually every country’s natural environment, and as a consequence they represent a global problem. There is very limited information on the fate and transport of ECs in the environment and their toxicological impact. This lack of information can be attributed to limited financial resources and the lack of analytical techniques for detecting their effects on ecosystems and human health, on their own or as mixtures. We do not know how ECs interact with each other or with other contaminants. This paper presents an overview of existing knowledge on ECs, their fate and transport, and a risk-based analysis for EC management and complementary strategies.

Potential corrosivity of untreated groundwater in the United States

Belitz, K., et al., U.S. Geological Survey Scientific Investigations Report 2016-5092, July 2016

Corrosive groundwater, if untreated, can dissolve lead and other metals from pipes and other components in water distribution systems. Two indicators of potential corrosivity – the Langelier Saturation Index (LSI) and the Potential to Promote Galvanic Corrosion (PPGC) – were used to identify which areas in the United States might be more or less susceptible to elevated concentrations of metals in household drinking water. On the basis of the LSI, about one-third of the samples collected from about 21,000 groundwater sites are classified as potentially corrosive. On the basis of the PPGC, about two-thirds of the samples collected from about 27,000 groundwater sites are classified as moderate PPGC, and about one-tenth as high PPGC. Potentially corrosive groundwater occurs in all 50 states and the District of Columbia, and national maps have been prepared to identify its occurrence. Eleven states and the District of Columbia were classified as having a very high prevalence of potentially corrosive groundwater, 14 states a high prevalence, 19 states a moderate prevalence, and 6 states a low prevalence. These findings have the greatest implication for people dependent on untreated groundwater for drinking water, such as the 44 million people who are self-supplied and depend on domestic wells or springs for their water supply.
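The LSI compares a sample's measured pH with the pH at which it would be saturated with calcium carbonate; negative values suggest water that tends to dissolve CaCO3 and is potentially corrosive. A sketch using the common textbook approximation, with illustrative inputs rather than USGS data:

```python
import math

def langelier_index(ph, temp_c, tds_mg_l, ca_hardness_caco3, alkalinity_caco3):
    """Approximate Langelier Saturation Index (textbook form).
    LSI = pH - pHs, where pHs is the pH of CaCO3 saturation.
    Calcium hardness and alkalinity are in mg/L as CaCO3."""
    a = (math.log10(tds_mg_l) - 1) / 10
    b = -13.12 * math.log10(temp_c + 273) + 34.55   # temperature term
    c = math.log10(ca_hardness_caco3) - 0.4
    d = math.log10(alkalinity_caco3)
    ph_s = (9.3 + a + b) - (c + d)
    return ph - ph_s

# Hypothetical groundwater samples differing only in pH
corrosive = langelier_index(6.8, 25, 400, 240, 200)      # LSI < 0: corrosive tendency
scale_forming = langelier_index(7.5, 25, 400, 240, 200)  # LSI > 0: scale-forming
```

The USGS report pairs this index with the PPGC, which instead considers the chloride-to-sulfate mass ratio relevant to galvanic corrosion of lead solder.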

Human Health Risk Assessment of Chromium in Drinking Water: A Case Study of Sukinda Chromite Mine, Odisha, India

Naz, A., et al., Exposure and Health, 8(2):253-264, June 2016

The present study aims to evaluate the human health risk of Cr(VI) and Cr(III) via oral and dermal exposure to drinking water in groundwater samples near the Sukinda chromite mine. The risk assessment for each location was carried out using mathematical models per IRIS guidelines, with input parameters chosen for the Indian context. The concentrations of TCr and Cr(VI) were found in the ranges of 48.7–250.2 and 21.4–115.2 μg/l, respectively. At most locations, TCr and Cr(VI) concentrations were 2.3–6 times and 2.1–11.5 times higher, respectively, than the permissible limits set by standard statutory bodies. The total cumulative average cancer risk and non-cancer risk (Hazard Quotient) were 2.04E−03 and 1.37 in the male population and 1.73E−03 and 1.16 in the female population, respectively, which indicated ‘very high’ cancer risk and ‘medium’ non-cancer risk per USEPA guidelines. The male population showed 1.2 times higher cancer and non-cancer risk than the female population because of males’ higher water ingestion rate. Health risk via the dermal route was 6 times lower than via oral ingestion, owing to the very short dermal exposure time (0.58 h/day). Notably, ‘high’ cancer risk was also recorded at one location where the TCr concentration was within the permissible limit, because of a higher proportion of bioavailable Cr(VI). Sensitivity analysis of input parameters towards cancer and non-cancer risk revealed that Cr(VI) and Cr(III) concentrations were the predominant parameters, followed by exposure duration, body weight, averaging time, and dermal slope factor.
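IRIS-style models of this kind combine a chronic daily intake (CDI) with a slope factor for cancer risk and a reference dose for the hazard quotient. A sketch of the oral-ingestion pathway; the concentration, exposure parameters, slope factor, and reference dose below are illustrative assumptions, not the study's site data:

```python
# USEPA-style oral ingestion risk sketch.
# CDI (mg/kg-day) = (C * IR * EF * ED) / (BW * AT)
# cancer risk = CDI * slope factor; hazard quotient = CDI / reference dose.

def chronic_daily_intake(c_mg_l, ir_l_day, ef_days_yr, ed_years, bw_kg, at_days):
    """Chronic daily intake from drinking water ingestion, mg/kg-day."""
    return (c_mg_l * ir_l_day * ef_days_yr * ed_years) / (bw_kg * at_days)

# Illustrative inputs: Cr(VI) at 0.1 mg/L, adult drinking 2.5 L/day,
# 30-year exposure averaged over a 70-year lifetime (cancer convention)
cdi = chronic_daily_intake(0.1, 2.5, 365, 30, 65, 70 * 365)
cancer_risk = cdi * 0.5        # assumed oral slope factor, (mg/kg-day)^-1
hazard_quotient = cdi / 0.003  # assumed oral reference dose, mg/kg-day
```

Under these made-up inputs the cancer risk exceeds the common 1E−04 screening level while the hazard quotient stays below 1, illustrating how the same dose can flag cancer risk without flagging non-cancer risk.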

Vulnerability of drinking water supplies to engineered nanoparticles

Troester, M., et al., Water Research, 96:255-279, June 2016

The production and use of engineered nanoparticles (ENPs) inevitably leads to their release into aquatic environments, with the quantities involved expected to increase significantly in the future. Concerns therefore arise over the possibility that ENPs might pose a threat to drinking water supplies. Investigations into the vulnerability of drinking water supplies to ENPs are hampered by the absence of suitable analytical methods that are capable of detecting and quantifying ENPs in complex aqueous matrices. Analytical data concerning the presence of ENPs in drinking water supplies are therefore scarce. The eventual fate of ENPs in the natural environment and in processes that are important for drinking water production is currently being investigated through laboratory-based experiments and modelling. Although the information obtained from these studies may not, as yet, be sufficient to allow comprehensive assessment of the complete life-cycle of ENPs, it does provide a valuable starting point for predicting the significance of ENPs to drinking water supplies. This review therefore addresses the vulnerability of drinking water supplies to ENPs. The risk of ENPs entering drinking water is discussed and predicted for drinking water produced from groundwater and from surface water. Our evaluation is based on reviewing published data concerning ENP production amounts and release patterns, the occurrence and behaviour of ENPs in aquatic systems relevant for drinking water supply, and ENP removability in drinking water purification processes. Quantitative predictions are made based on realistic high-input case scenarios. The results of our synthesis of current knowledge suggest that the risk probability of ENPs being present in surface water resources is generally limited, but that particular local conditions may increase the probability of raw water contamination by ENPs.
Drinking water extracted from porous-media aquifers is not generally considered to be prone to ENP contamination. In karstic aquifers, however, there is an increased probability that if any ENPs enter the groundwater system they will reach the extraction point of a drinking water treatment plant (DWTP). The ability to remove ENPs during water treatment depends on the specific design of the treatment process. In conventional DWTPs with no flocculation step, a proportion of ENPs, if present in the raw water, may reach the final drinking water. The use of ultrafiltration techniques improves drinking water safety with respect to ENP contamination.

The Flint Water Crisis Confirms That U.S. Drinking Water Needs Improved Risk Management

Baum, R., et al., Environmental Science & Technology, 50(11):5436-5437, May 2016

This article focuses on existing public health concerns that the current regulatory system has repeatedly failed to address in order to protect US residents. Water system failures have been extensively analyzed, leading to the conclusion that most could have been prevented with better risk management. Recent research shows that the most commonly reported causes are time and money constraints, but these may also reflect a lack of policy priority shared by the utility and the regulator. U.S. public drinking water systems are focused on meeting nationally defined regulations that target certain maximum contaminant levels (MCLs) and specific treatment techniques.

Assessing clarity of message communication for mandated USEPA drinking water quality reports

Davy, B.M., et al., Journal of Water and Health, 14(2):223-235, April 2016

The United States Environmental Protection Agency mandates that community water systems (CWSs), or drinking water utilities, provide annual consumer confidence reports (CCRs) reporting on water quality, compliance with regulations, source water, and consumer education. While certain report formats are prescribed, there are no criteria ensuring that consumers understand messages in these reports. To assess clarity of message, trained raters evaluated a national sample of 30 CCRs using the Centers for Disease Control Clear Communication Index (Index) indices: (1) Main Message/Call to Action; (2) Language; (3) Information Design; (4) State of the Science; (5) Behavioral Recommendations; (6) Numbers; and (7) Risk. Communication materials are considered qualifying if they achieve a 90% Index score. Overall mean score across CCRs was 50 ± 14% and none scored 90% or higher. CCRs did not differ significantly by water system size. State of the Science (3 ± 15%) and Behavioral Recommendations (77 ± 36%) indices were the lowest and highest, respectively. Only 63% of CCRs explicitly stated if the water was safe to drink according to federal and state standards and regulations. None of the CCRs had passing Index scores, signaling that CWSs are not effectively communicating with their consumers; thus, the Index can serve as an evaluation tool for CCR effectiveness and a guide to improve water quality communications.

Inactivation Kinetics and Replication Cycle Inhibition of Adenovirus by Monochloramine

Gall, A.M., et al., Environmental Science & Technology Letters, 3(4):185-189, April 2016

Monochloramine is commonly used as a secondary disinfectant to maintain a residual in drinking water distribution systems in the United States. The mechanism by which waterborne viruses become inactivated by monochloramine remains largely unknown. A more fundamental understanding of how viruses become inactivated is necessary for better detection and control of viruses in drinking water. Human adenovirus (HAdV) is known to be the waterborne virus most resistant to monochloramine disinfection, and this study presents inactivation kinetics over a range of environmental conditions. Several steps in the HAdV replication cycle were investigated to determine which steps become inhibited by monochloramine disinfection. Interestingly, monochloramine-inactivated HAdV could still bind to host cells, but genome replication and early and late mRNA transcription were inhibited. We conclude that monochloramine exposure inhibited a replication cycle event after binding but prior to early viral protein synthesis.

Determination of dimethyl selenide and dimethyl sulphide compounds causing off-flavours in bottled mineral waters

Guadayol, M., et al., Water Research, 92:149-155, April 2016

Sales of bottled drinking water have shown large growth during the last two decades due to the general belief that this kind of water is healthier, its flavour is better, and its consumption risk is lower than that of tap water. As a result, consumers are more demanding with bottled mineral water, especially regarding its organoleptic properties, such as taste and odour. This work studies the compounds that can generate obnoxious smells, which consumers have described as swampy, rotten eggs, sulphurous, cooked vegetable or cabbage. Closed loop stripping analysis (CLSA) has been used as a pre-concentration method for the analysis of off-flavour compounds in water, followed by identification and quantification by means of GC-MS. Several bottled waters with the aforementioned smells showed the presence of volatile dimethyl selenides and dimethyl sulphides, whose concentrations ranged, respectively, from 4 to 20 ng/L and from 1 to 63 ng/L. The low odour threshold concentrations (OTCs) of both organic selenide and sulphide derivatives prove that several objectionable odours in bottled waters arise from them. Microbial loads inherent to water sources, along with some critical conditions in water processing, could contribute to the formation of these compounds. There are few studies about volatile organic compounds in bottled drinking water and, to the best of our knowledge, this is the first study reporting the presence of dimethyl selenides and dimethyl sulphides causing odour problems in bottled waters.
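The link between trace concentrations and perceptible odour is usually expressed as an odour activity value (OAV): the measured concentration divided by the compound's OTC, with OAV > 1 indicating a perceptible contribution. A minimal sketch using the study's upper measured concentrations and hypothetical OTC values (the real thresholds are compound-specific and not quoted in the abstract):

```python
def odour_activity_value(conc_ng_l, otc_ng_l):
    """OAV = concentration / odour threshold concentration.
    Values above 1 indicate a perceptible contribution to off-flavour."""
    return conc_ng_l / otc_ng_l

# (measured ng/L from the study, assumed OTC ng/L for illustration)
samples = {
    "dimethyl selenide": (20, 4),
    "dimethyl sulphide": (63, 10),
}
for name, (conc, otc) in samples.items():
    oav = odour_activity_value(conc, otc)
    status = "odour-active" if oav > 1 else "below threshold"
    print(f"{name}: OAV = {oav:.1f} -> {status}")
```

Because the assumed OTCs are in the low ng/L range, even these trace levels yield OAVs above 1, which is the arithmetic behind the abstract's conclusion.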

Multimedia exposures to arsenic and lead for children near an inactive mine tailings and smelter site

Loh, M.M., et al., Environmental Research, 146:331-339, April 2016

Children living near contaminated mining waste areas may have high exposures to metals from the environment. This study investigates whether exposure to arsenic and lead is higher in children in a community near a legacy mine and smelter site in Arizona compared to children in other parts of the United States, and the relationship of that exposure to the site. Arsenic and lead were measured in residential soil, house dust, tap water, urine, and toenail samples from 70 children in 34 households up to 7 miles from the site. Soil and house dust were sieved, digested, and analyzed via ICP-MS. Tap water and urine were analyzed without digestion, while toenails were washed, digested and analyzed. Blood lead was analyzed by an independent, certified laboratory. Spearman correlation coefficients were calculated between each environmental medium and urine and toenails for arsenic and lead. Geometric mean arsenic (standard deviation) concentrations were 22.1 (2.59) ppm and 12.4 (2.27) ppm for soil and house dust, respectively.

Reduction in horizontal transfer of conjugative plasmid by UV irradiation and low-level chlorination

Lin, W., et al., Water Research, 91:331-338, March 2016

The widespread presence of antibiotic resistance genes (ARGs) and antibiotic resistant bacteria (ARB) in the drinking water system facilitates their horizontal gene transfer among microbiota. In this study, the conjugative transfer of the RP4 plasmid after disinfection, including ultraviolet (UV) irradiation and low-level chlorine treatment, was investigated. Both UV irradiation and low-level chlorine treatment reduced the conjugative transfer frequency. The transfer frequency gradually decreased from 2.75 × 10⁻³ to 2.44 × 10⁻⁵ after exposure to UV doses ranging from 5 to 20 mJ/cm². At higher UV doses of 50 and 100 mJ/cm², the transfer frequency was reduced to 1.77 × 10⁻⁶ and 2.44 × 10⁻⁸. The RP4 plasmid transfer frequency was not significantly affected by chlorine treatment at dosages ranging from 0.05 to 0.2 mg/l, but treatment with 0.3-0.5 mg/l chlorine reduced conjugative transfer to 4.40 × 10⁻⁵ or below the detection limit. The mechanisms underlying these phenomena were also explored, and the results demonstrated that UV irradiation and chlorine treatment (0.3 and 0.5 mg/l) significantly reduced the viability of bacteria, thereby lowering the conjugative transfer frequency. Although the lower chlorine concentrations tested (0.05-0.2 mg/l) were not sufficient to damage the cells, exposure to these concentrations may still depress the expression of a flagellar gene (FlgC), an outer membrane porin gene (ompF), and a DNA transport-related gene (TraG). Additionally, fewer pili were observed on the bacteria after chlorine treatment. These findings are important in assessing and controlling the risk of ARG transfer and dissemination in the drinking water system.
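Conjugative transfer frequency is conventionally reported as transconjugants per recipient, and disinfection effects as log10 reductions of that frequency. A small sketch of both calculations, using the endpoint UV frequencies quoted above (any underlying plate counts would be hypothetical):

```python
import math

def transfer_frequency(transconjugants_cfu, recipients_cfu):
    """Conjugative transfer frequency = transconjugant CFU per recipient CFU."""
    return transconjugants_cfu / recipients_cfu

def log_reduction(freq_before, freq_after):
    """log10 reduction in transfer frequency after treatment."""
    return math.log10(freq_before / freq_after)

# Frequencies reported in the study: untreated vs. 100 mJ/cm2 UV.
untreated = 2.75e-3
uv_100 = 2.44e-8

print(f"{log_reduction(untreated, uv_100):.2f} log10 reduction")
```

The quoted figures correspond to roughly a 5-log drop in transfer frequency at the highest UV dose.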

Water Disinfection Byproducts Induce Antibiotic Resistance-Role of Environmental Pollutants in Resistance Phenomena

Li, D., et al., Environmental Science & Technology, 50(6):3193-3201, March 2016

The spread of antibiotic resistance represents a global threat to public health, and has been traditionally attributed to extensive antibiotic use in clinical and agricultural applications. As a result, researchers have mostly focused on clinically relevant high-level resistance enriched by antibiotics above the minimal inhibitory concentrations (MICs). Here, we report that two common water disinfection byproducts (chlorite and iodoacetic acid) had antibiotic-like effects that led to the evolution of resistant E. coli strains under both high (near-MIC) and low (sub-MIC) exposure concentrations. The subinhibitory concentrations of DBPs selected strains with resistance higher than those evolved under above-MIC exposure concentrations. In addition, whole-genome analysis revealed distinct mutations in small sets of genes known to be involved in multiple-drug and drug-specific resistance, as well as in genes not yet identified as playing a role in antibiotic resistance. The number and identities of genetic mutations were distinct for the high versus the sub-MIC exposure scenarios. This study provides evidence and mechanistic insight into the sub-MIC selection of antibiotic resistance by antibiotic-like environmental pollutants such as disinfection byproducts in water, which may be important contributors to the spread of global antibiotic resistance. The results of this study open an intriguing and profound question about the roles that the large quantities and wide variety of environmental contaminants play in selecting for and spreading antibiotic resistance in the environment.

Viral persistence in surface and drinking water: Suitability of PCR pre-treatment with intercalating dyes

Prevost, B., et al., Water Research, March 2016

After many outbreaks of enteric virus associated with consumption of drinking water, the study of enteric viruses in water has increased significantly in recent years. In order to better understand the dynamics of enteric viruses in environmental water and the associated viral risk, it is necessary to estimate viral persistence in different conditions. In this study, two representative models of human enteric viruses, adenovirus 41 (AdV 41) and coxsackievirus B2 (CV-B2), were used to evaluate the persistence of enteric viruses in environmental water. The persistence of infectious particles, encapsidated genomes and free nucleic acids of AdV 41 and CV-B2 was evaluated in drinking water and surface water at different temperatures (4 °C, 20 °C and 37 °C). The infectivity of AdV 41 and CV-B2 persisted for at least 25 days, whatever the water temperature, and for more than 70 days at 4 °C and 20 °C, in both drinking and surface water. Encapsidated genomes persisted beyond 70 days, whatever the water temperature. Free nucleic acids (i.e. without capsid) also were able to persist for at least 16 days in drinking and surface water. The usefulness of a detection method based on an intercalating dye pre-treatment, which specifically targets preserved particles, was investigated for the discrimination of free and encapsidated genomes and it was compared to virus infectivity. Further, the resistance of AdV 41 and CV-B2 against two major disinfection treatments applied in drinking water plants (UV and chlorination) was evaluated. Even after the application of UV rays and chlorine at high doses (400 mJ/cm² and 10 mg·min/L, respectively), viral genomes were still detected with molecular biology methods. Although the intercalating dye pre-treatment had little use for the detection of the effects of UV treatment, it was useful in the case of treatment by chlorination and less than 1 log10 difference in the results was found as compared to the infectivity measurements.
Finally, for the first time, the suitability of intercalating dye pre-treatment for the estimation of the quality of the water produced by treatment plants was demonstrated using samples from four drinking-water plants and two rivers. Although 55% (27/49) of drinking water samples were positive for enteric viruses using molecular detection, none of the samples were positive when the intercalating dye pre-treatment method was used. This could indicate that the viruses that were detected are not infectious.

Using flow cytometry and Bacteroidales 16S rRNA markers to study the hygienic quality of source water

Baumgartner, A., et al., Journal für Verbraucherschutz und Lebensmittelsicherheit, 11(1):83-88, March 2016

Six source water fountains in the community of Berne, Switzerland were sampled monthly over the period of 1 year. The samples were tested for total counts by flow cytometry, and for fecal contamination by using the Bacteroidales 16S rRNA markers HF183, BacR and AllBac. The total counts varied considerably between the different fountains with a minimal value of 5115 counts/L and with a maximal count of 198,508 counts/L. The long-term patterns of total counts over 1 year were typical for each fountain. Comparison of rainfall data and data for the non-specific fecal marker AllBac was shown to be a suitable approach to highlight the vulnerability of sources to environmental influences. HF183, indicating contamination of human origin, occurred only sporadically and in insignificant amounts. Furthermore, as indicated by BacR, the studied fountains showed no evidence of contamination by ruminant feces. Further work is suggested in order to establish threshold values for molecular Bacteroidales markers, which could in future replace the currently used criteria for fecal indicator bacteria.

Contrasting regional and national mechanisms for predicting elevated arsenic in private wells across the United States using classification and regression trees

Frederick, L., et al., Water Research, March 2016

Arsenic contamination in groundwater is a public health and environmental concern in the United States (U.S.), particularly where monitoring is not required under the Safe Drinking Water Act. Previous studies suggest the influence of regional mechanisms for arsenic mobilization into groundwater; however, no study has examined how influencing parameters change at a continental scale spanning multiple regions. We herein examine covariates for groundwater in the western, central and eastern U.S. regions representing mechanisms associated with arsenic concentrations exceeding the U.S. Environmental Protection Agency maximum contaminant level (MCL) of 10 parts per billion (ppb). Statistically significant covariates were identified via classification and regression tree (CART) analysis, and included hydrometeorological and groundwater chemical parameters. The CART analyses were performed at two scales: national and regional; for which three physiographic regions located in the western (Payette Section and the Snake River Plain), central (Osage Plains of the Central Lowlands), and eastern (Embayed Section of the Coastal Plains) U.S. were examined. Validity of each of the three regional CART models was indicated by values >85% for the area under the receiver-operating characteristic curve. Aridity (precipitation minus potential evapotranspiration) was identified as the primary covariate associated with elevated arsenic at the national scale. At the regional scale, aridity and pH were the major covariates in the arid to semi-arid (western) region; whereas dissolved iron (taken to represent chemically reducing conditions) and pH were major covariates in the temperate (eastern) region, although additional important covariates emerged, including elevated phosphate. Analysis in the central U.S. region indicated that elevated arsenic concentrations were driven by a mixture of those observed in the western and eastern regions.
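At its core, CART grows a tree by repeatedly picking the covariate threshold that most reduces Gini impurity. A self-contained sketch of that single splitting step on invented aridity data (one covariate, one split; a real CART model recurses over many covariates such as pH and dissolved iron, and the data below are illustrative, not from the study):

```python
def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(x, y):
    """Find the threshold on one covariate that minimizes the
    size-weighted Gini impurity of the two resulting branches."""
    best_t, best_score = None, float("inf")
    for t in sorted(set(x)):
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# Invented data: aridity (precipitation minus PET, mm/yr) and whether
# arsenic exceeded the 10 ppb MCL (1 = exceedance).
aridity = [-800, -600, -400, -300, -100, 50, 200, 400, 600, 900]
exceeds = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
t, score = best_split(aridity, exceeds)
print(f"split at aridity <= {t}, weighted Gini = {score:.3f}")
```

On this toy data the best split isolates the most arid sites, mirroring the study's finding that aridity is the dominant national-scale covariate.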

Elevated Blood Lead Levels in Children Associated With the Flint Drinking Water Crisis: A Spatial Analysis of Risk and Public Health Response

Hanna-Attisha, M., et al., American Journal of Public Health, 106(2), February 2016

We analyzed differences in pediatric elevated blood lead level incidence before and after Flint, Michigan, introduced a more corrosive water source into an aging water system without adequate corrosion control. We reviewed blood lead levels for children younger than 5 years before (2013) and after (2015) the water source change in Greater Flint, Michigan. We assessed the percentage of elevated blood lead levels in both time periods, and identified geographical locations through spatial analysis. Incidence of elevated blood lead levels increased from 2.4% to 4.9% (P < .05) after the water source change, and neighborhoods with the highest water lead levels experienced a 6.6% increase. No significant change was seen outside the city. Geospatial analysis identified disadvantaged neighborhoods as having the greatest elevated blood lead level increases and informed response prioritization during the now-declared public health emergency. It was concluded that the percentage of children with elevated blood lead levels increased after the water source change, particularly in socioeconomically disadvantaged neighborhoods. Water is a growing source of childhood lead exposure because of aging infrastructure.
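The significance of a rise from 2.4% to 4.9% can be checked with a standard two-proportion z-test. A sketch using hypothetical sample sizes (the study's actual denominators are not quoted in this abstract, so the n values below are assumptions for illustration):

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test with pooled standard error.
    Returns (z, two-sided p-value) under the normal approximation."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 2.4% elevated BLL pre-switch vs 4.9% post-switch; sample sizes hypothetical.
z, p = two_proportion_z(0.024, 1400, 0.049, 700)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With samples of this size, the difference clears the 1.96 critical value comfortably, consistent with the reported P < .05.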

DVC-FISH and PMA-qPCR techniques to assess the survival of Helicobacter pylori inside Acanthamoeba castellanii

Moreno-Mesonoro, L., et al., Research in Microbiology, 167(1):29-34, January 2016

Free-living amoebae (FLA) are ubiquitous microorganisms commonly found in water. They can act as Trojan horses for some amoeba-resistant bacteria (ARB). Helicobacter pylori is a pathogenic bacterium, suggested to be transmitted through water, which could belong to the ARB group. In this work, a co-culture assay of H. pylori and Acanthamoeba castellanii, one of the most common FLA, was carried out to identify the presence and survival of viable and potentially infective forms of the bacterium internalized by the amoeba. Molecular techniques including FISH, DVC-FISH, qPCR and PMA-qPCR were used to detect the presence of internalized and viable H. pylori. After 24 h in co-culture and disinfection treatment to kill extra-amoebic bacteria, viable H. pylori cells were observed inside A. castellanii. When PMA-qPCR was applied to the co-culture samples, only DNA from internalized H. pylori cells was detected, whereas qPCR amplified total DNA from the sample. By the combined DVC-FISH method, the viability of H. pylori cells in A. castellanii was observed. Both specific techniques provided evidence, for the first time, that the pathogen is able to survive chlorination treatment when associated with A. castellanii, and both could be very useful methods for further studies of environmental samples.

Variability in the chemistry of private drinking water supplies and the impact of domestic treatment systems on water quality

Ander, E.L., et al., Environmental Geochemistry and Health, 38(6):1313-1332, January 2016

Tap water from 497 properties using private water supplies, in an area of metalliferous and arsenic mineralisation (Cornwall, UK), was measured to assess the extent of compliance with chemical drinking water quality standards, and how this is influenced by householder water treatment decisions. The proportion of analyses exceeding water quality standards was high, with 65 % of tap water samples exceeding one or more chemical standards. The highest exceedances for health-based standards were nitrate (11 %) and arsenic (5 %). Arsenic had a maximum observed concentration of 440 µg/L. Exceedances were also high for pH (47 %), manganese (12 %) and aluminium (7 %), for which standards are set primarily on aesthetic grounds. However, the highest observed concentrations of manganese and aluminium also exceeded relevant health-based guidelines. Significant reductions in concentrations of aluminium, cadmium, copper, lead and/or nickel were found in tap waters where households were successfully treating low-pH groundwaters, and similar adventitious results were found for arsenic and nickel where treatment was installed for iron and/or manganese removal; successful treatment specifically to decrease tap water arsenic concentrations was observed at two properties where it was installed. However, 31 % of samples where pH treatment was reported had pH < 6.5 (the minimum value in the drinking water regulations), suggesting widespread problems with system maintenance. Other examples of ineffectual treatment are seen in failed responses post-treatment, including for nitrate. This demonstrates that even where tap waters are considered to be treated, they may still fail one or more drinking water quality standards. We find that the degree of drinking water standard exceedances warrants further work to understand environmental controls and the location of high concentrations.
We also found that residents were more willing to accept drinking water with high metal (iron and manganese) concentrations than international guidelines assume. These findings point to the need for regulators to reinforce the guidance on drinking water quality standards to private water supply users, and the benefits to long-term health of complying with these, even in areas where treated mains water is widely available.

Human exposure to thallium through tap water: A study from Valdicastello Carducci and Pietrasanta (northern Tuscany, Italy)

Campanella, B., et al., Science of the Total Environment, January 2016

A geological study evidenced the presence of thallium (Tl) at concentrations of concern in groundwaters near Valdicastello Carducci (Tuscany, Italy). The source of contamination has been identified as the Tl-bearing pyrite ores occurring in the abandoned mining sites of the area. The strongly acidic internal waters flowing in the mining tunnels can reach exceptional Tl concentrations, up to 9000 μg/L. In September 2014, Tl contamination was also found in the tap water distributed in the same area (from 2 to 10 μg/L). On October 3, 2014 the local authorities imposed a Do Not Drink order on the population. Here we report the results of the exposure study carried out from October 2014 to October 2015, aimed at quantifying Tl levels in 150 urine and 318 hair samples from the population of Valdicastello Carducci and Pietrasanta. Thallium was quantified by inductively coupled plasma mass spectrometry (ICP-MS). Urine and hair were chosen as model matrices indicative of different periods of exposure (short-term and long-term, respectively). Thallium values found in biological samples were correlated with Tl concentrations found in tap water in the living area of each citizen, and with his/her habits. The Tl concentration ranges found in hair and urine were 1–498 ng/g (values in unexposed subjects 0.1–6 ng/g) and 0.046–5.44 μg/L (reference value for the European population 0.006 μg/L), respectively. Results show that Tl levels in biological samples were significantly associated with residency in zones containing elevated water Tl levels. The kinetics of decay of Tl concentration in urine samples was also investigated. To the best of our knowledge, this is the first study on human contamination by Tl through water involving such a high number of samples.

Prevalence and characterization of extended-spectrum beta-lactamase-producing Enterobacteriaceae in spring waters

Li, S., et al., Letters in Applied Microbiology, 61(6):544-548, December 2015

The purpose of this study was to investigate the prevalence and characterization of extended-spectrum beta-lactamase (ESBL)-producing Enterobacteriaceae from spring waters in Mountain Tai of China. ESBL-producing Enterobacteriaceae were found in four out of 50 sampled spring waters (4/50, 8.0%) and a total of 16 non-duplicate ESBL-producing Enterobacteriaceae were obtained, including 13 Escherichia coli (E. coli) and three Klebsiella pneumoniae (Kl. pneumoniae). All 16 non-duplicate ESBL-producing Enterobacteriaceae isolates harboured genes encoding CTX-M ESBLs, among which six expressed CTX-M-15, five produced CTX-M-14, three produced CTX-M-55 and two expressed CTX-M-27. Four multilocus sequence types (ST) were found and ST131 was the dominant type (8/16, 50.0%). Taken together, ESBL-producing Enterobacteriaceae were present in the spring waters of Mountain Tai. The results indicated that spring waters could become a reservoir of antibiotic resistant bacteria and contribute to the spread of antimicrobial-resistant bacteria via drinking water or the food chain. In addition, wastewater discharge from restaurants or hotels may be an important source of antibiotic resistant bacteria in spring waters.

Qualitative analysis of water quality deterioration and infection by Helicobacter pylori in a community with high risk of stomach cancer (Cauca, Colombia)

Acosta, C.P., et al., Salud Colectiva, 11(4):575-590, December 2015

This study looks at aspects of the environmental health of the rural population in Timbío (Cauca, Colombia) in relation to the deterioration of water quality. The information was obtained through participatory research methods exploring the management and use of water, the sources of pollution and the perception of water quality and its relation to Helicobacter pylori infection. The results are part of the qualitative analysis of a first research phase characterizing water and sanitation problems and their relation to emerging infectious diseases as well as possible solutions, which was carried out between November 2013 and August 2014. The results of this research are discussed from an ecosystemic approach to human health, recognizing the complexity of environmental conflicts related to water resources and their impacts on the health of populations. Through the methodology used, it is possible to detect and visualize the most urgent problems as well as frequent causes of contamination of water resources so as to propose solutions within a joint agenda of multiple social actors.

Evaluation of alternative DNA extraction processes and real-time PCR for detecting Cryptosporidium parvum in drinking water

Kimble, G.H., Water Science and Technology: Water Supply, 15(6):1295-1303, December 2015

USEPA Method 1623 is the standard method in the United States for the detection of Cryptosporidium in water samples, but quantitative real-time polymerase chain reaction (qPCR) is an alternative technique that has been successfully used to detect Cryptosporidium in aqueous matrices. This study examined various modifications to a commercial nucleic acid extraction procedure in order to enhance PCR detection sensitivity for Cryptosporidium. An alternative DNA extraction buffer allowed for qPCR detection at lower seed levels than a commercial extraction kit buffer. In addition, the use of a second spin column cycle produced significantly better detection (P = 0.031), and the volume of Tris–EDTA buffer significantly affected crossing threshold values (P = 0.001). The improved extraction procedure was evaluated using 10 L of tap water samples processed by ultrafiltration, centrifugation and immunomagnetic separation. Mean recovery for the sample processing method was determined to be 41% using microscopy and 49% by real-time PCR (P = 0.013). The results of this study demonstrate that real-time PCR can be an effective alternative for detecting and quantifying Cryptosporidium parvum in drinking water samples.

Potential applications of next generation DNA sequencing of 16S rRNA gene amplicons in microbial water quality monitoring

Vierheilig, J., et al., Water Science and Technology, 72(11):1962-1972, December 2015

The applicability of next generation DNA sequencing (NGS) methods for water quality assessment has so far not been broadly investigated. This study set out to evaluate the potential of an NGS-based approach in a complex catchment with importance for drinking water abstraction. In this multi-compartment investigation, total bacterial communities in water, faeces, soil, and sediment samples were investigated by 454 pyrosequencing of bacterial 16S rRNA gene amplicons to assess the capabilities of this NGS method for (i) the development and evaluation of environmental molecular diagnostics, (ii) direct screening of the bulk bacterial communities, and (iii) the detection of faecal pollution in water. Results indicate that NGS methods can highlight potential target populations for diagnostics and will prove useful for the evaluation of existing DNA-based detection methods, and the development of novel ones, in the field of water microbiology. The approach revealed dominant bacterial populations but failed to detect populations with low abundances, such as faecal indicators in surface waters. In combination with metadata, NGS data will also allow the identification of drivers of bacterial community composition during water treatment and distribution, highlighting the power of this approach for monitoring of bacterial regrowth and contamination in technical systems.

Qualitative analysis of water quality deterioration and infection by Helicobacter pylori in a community with high risk of stomach cancer (Cauca, Colombia)

Acosta, C.P., et al., Salud Colectiva, 11(4):575-590, December 2015

This study looks at aspects of the environmental health of the rural population in Timbio (Cauca, Colombia) in relation to the deterioration of water quality. The information was obtained through participatory research methods exploring the management and use of water, the sources of pollution, and the perception of water quality and its relation to Helicobacter pylori infection. The results are part of the qualitative analysis of a first research phase, carried out between November 2013 and August 2014, characterizing water and sanitation problems, their relation to emerging infectious diseases, and possible solutions. The results are discussed from an ecosystemic approach to human health, recognizing the complexity of environmental conflicts related to water resources and their impacts on the health of populations. The methodology used makes it possible to detect and visualize the most urgent problems as well as frequent causes of contamination of water resources, so as to propose solutions within a joint agenda of multiple social actors.

Impacts of hydraulic fracturing on water quality: a review of literature, regulatory frameworks and an analysis of information gaps

Gagnon, G.A., et al., Environmental Reviews, 24(2):122-131, November 2015

This paper reviews the available literature and current governance approaches related to the potential impacts of hydraulic fracturing on water quality (including drinking water). It identifies gaps in the literature and/or current governance approaches that should be addressed to guide decision-makers in developing appropriate regulatory regimes for assessing the impacts of hydraulic fracturing on water quality. The lack of credible and comprehensive data is shown to have been a major setback to properly investigating and monitoring hydraulic fracturing activities and their potential risks to the environment and water quality. A review of current governance approaches demonstrates that some jurisdictions have implemented baseline and post-operation water quality monitoring requirements; however, there are large variations in site-specific monitoring requirements across Canada and the United States. In light of recent information, a targeted approach is suggested based on risk priorities, which can prioritize sample collection and frequency, target contaminants, and the needed duration of sampling. The steps outlined in this review help to address public concerns associated with water quality and to ensure that public health is protected through appropriate water safety planning.

Estimating Potential Increased Bladder Cancer Risk Due to Increased Bromide Concentrations in Sources of Disinfected Drinking Waters

Regli, S., et al., Environmental Science & Technology, 49(22):13094-13102, November 2015

Public water systems are increasingly facing higher bromide levels in their source waters from anthropogenic contamination through coal-fired power plants, conventional oil and gas extraction, textile mills, and hydraulic fracturing. Climate change is likely to exacerbate this in coming years. We estimate bladder cancer risk from potential increased bromide levels in source waters of disinfecting public drinking water systems in the United States. Bladder cancer is the health end point used by the United States Environmental Protection Agency (EPA) in its benefits analysis for regulating disinfection byproducts in drinking water. We use estimated increases in the mass of the four regulated trihalomethanes (THM4) concentrations (due to increased bromide incorporation) as the surrogate disinfection byproduct (DBP) occurrence metric for informing potential bladder cancer risk. We estimate potential increased excess lifetime bladder cancer risk as a function of increased source water bromide levels. Results based on data from 201 drinking water treatment plants indicate that a bromide increase of 50 μg/L could result in a potential increase in excess lifetime bladder cancer risk of between 10(-4) and 10(-3) in populations served by roughly 90% of these plants.

Solar Disinfection of Viruses in Polyethylene Terephthalate Bottles

Carratala, A., et al., Applied and Environmental Microbiology, 82(1):279-288, October 2015

Solar disinfection (SODIS) of drinking water in polyethylene terephthalate (PET) bottles is a simple, efficient point-of-use technique for the inactivation of many bacterial pathogens. In contrast, the efficiency of SODIS against viruses is not well known. In this work, we studied the inactivation of bacteriophages (MS2 and ϕX174) and human viruses (echovirus 11 and adenovirus type 2) by SODIS. We conducted experiments in PET bottles exposed to (simulated) sunlight at different temperatures (15, 22, 26, and 40°C) and in water sources of diverse compositions and origins (India and Switzerland). Good inactivation of MS2 (>6-log inactivation after exposure to a total fluence of 1.34 kJ/cm(2)) was achieved in Swiss tap water at 22°C, while less-efficient inactivation was observed in Indian waters and for echovirus (1.5-log inactivation at the same fluence). The DNA viruses studied, ϕX174 and adenovirus, were resistant to SODIS, and the inactivation observed was equivalent to that occurring in the dark. High temperatures enhanced MS2 inactivation substantially; at 40°C, 3-log inactivation was achieved in Swiss tap water after exposure to a fluence of only 0.18 kJ/cm(2). Overall, our findings demonstrate that SODIS may reduce the load of single-stranded RNA (ssRNA) viruses, such as echoviruses, particularly at high temperatures and in photoreactive matrices. In contrast, complementary measures may be needed to ensure efficient inactivation during SODIS of DNA viruses resistant to oxidation.

Regulation of non-relevant metabolites of plant protection products in drinking and groundwater in the EU: Current status and way forward

Laabs, V., et al., Regulatory Toxicology and Pharmacology, 73(1):276-286, October 2015

Non-relevant metabolites are defined in the EU regulation for plant protection product authorization, and a detailed definition is given in an EU Commission DG Sanco (now DG SANTE, Health and Food Safety) guidance document. However, in water legislation at EU and member state level, non-relevant metabolites of pesticides are either not specifically regulated or diverse threshold values are applied. Based on their inherent properties, non-relevant metabolites should be regulated in drinking and groundwater through substance-specific, toxicity-based limit values, like other anthropogenic chemicals. Yet, if a general limit value for non-relevant metabolites in drinking and groundwater is favored, applying the Threshold of Toxicological Concern (TTC) concept for Cramer class III compounds leads to a threshold value of 4.5 μg L(-1). This general value is shown, by way of example, to be protective for non-relevant metabolites, based on individual drinking water limit values derived for a set of 56 non-relevant metabolites. A consistent definition of non-relevant metabolites of plant protection products, as well as their uniform regulation in drinking and groundwater in the EU, is important to achieve legal clarity for all stakeholders and to establish planning security for the development of plant protection products for the European market.
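
The 4.5 μg L(-1) figure can be reproduced with standard TTC arithmetic. A minimal sketch follows; the 60 kg body weight, 2 L/day water consumption and 10% allocation factor are conventional defaults assumed here for illustration, not parameters quoted in the abstract:

```python
# Sketch: deriving a general drinking water limit from the TTC concept.
# Cramer class III TTC: 1.5 ug per kg body weight per day (conventional value).
ttc_ug_per_kg_day = 1.5

body_weight_kg = 60.0   # assumed default adult body weight
water_intake_l = 2.0    # assumed daily drinking water consumption
allocation = 0.10       # assumed share of the tolerable intake allotted to water

tolerable_intake_ug_day = ttc_ug_per_kg_day * body_weight_kg  # 90 ug/day
limit_ug_per_l = tolerable_intake_ug_day * allocation / water_intake_l

print(f"General limit value: {limit_ug_per_l:.1f} ug/L")
```

If the 10% allocation assumption is dropped, the same arithmetic gives 45 μg/L, so the allocation factor is the dominant assumption behind a general limit of this kind.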

Incidence of waterborne lead in private drinking water systems in Virginia

Pieper, K.J., et al., Journal of Water and Health, 13(3):897-908, September 2015

Although recent studies suggest contamination by bacteria and nitrate in private drinking water systems is of increasing concern, data describing contaminants associated with the corrosion of onsite plumbing are scarce. This study reports on the analysis of 2,146 samples submitted by private system homeowners. Almost 20% of first draw samples submitted contained lead concentrations above the United States Environmental Protection Agency action level of 15 μg/L, suggesting that corrosion may be a significant public health problem. Correlations between lead, copper, and zinc suggested brass components as a likely lead source, and dug/bored wells had significantly higher lead concentrations as compared to drilled wells. A random subset of samples selected to quantify particulate lead indicated that, on average, 47% of lead in the first draws was in the particulate form, although the occurrence was highly variable. While flushing the tap reduced lead below 15 μg/L for most systems, some systems experienced an increase, perhaps attributable to particulate lead or lead-bearing components upstream of the faucet (e.g., valves, pumps). Results suggest that without including a focus on private as well as municipal systems it will be very difficult to meet the existing national public health goal to eliminate elevated blood lead levels in children.

Presence of antibiotic resistant bacteria and antibiotic resistance genes in raw source water and treated drinking water

Bergeron, S., et al., International Biodeterioration & Biodegradation, 102:370-374, August 2015

Antibiotic resistance is becoming a very large problem throughout the world, and the spread of antibiotic resistant bacteria (ARB) and antibiotic resistance genes (ARGs) in the environment is a major public health issue. Aquatic ecosystems are a significant source of ARB and ARGs. Drinking water treatment systems are designed specifically to eliminate bacteria and pathogens, so the presence of ARB and ARGs in source water and drinking water may affect public health and is an emerging issue in the drinking water industry. This study therefore examined the presence of ARB and ARGs in source water, treated drinking water (finished water), and the distribution line (tap water) of a rural water treatment plant in Louisiana. The results showed the presence of several ARB in the source water, including Enterobacter cloacae, Klebsiella pneumoniae, Escherichia coli, Pseudomonas, Enterococcus, Staphylococcus and Bacillus spp. However, the treatment plant effectively removed these bacteria: none were found in the finished water at the plant or in the tap water. Bacterial DNA, including 16S rRNA genes and ARGs for sulfonamide and tetracycline antibiotics, was observed in raw water, and 16S rRNA was found consistently in every month of sampling in raw water, finished water, and tap water. This suggests that the filtration system at the treatment plant was ineffective in removing small fragments of bacterial DNA. Biofilms may also be present in the water pipeline, and these may develop antibiotic resistance under the selective pressure of chlorination in drinking water.

Surveillance of perchlorate in ground water, surface water and bottled water in Kerala, India

Nadaraja, A.V., et al., Journal of Environmental Health Science and Engineering, July 2015

Perchlorate is an emerging water contaminant that disrupts normal functioning of the human thyroid gland and poses a serious threat to health, especially for pregnant women, fetuses and children. This study reports high levels of perchlorate contamination in freshwater sources near places where ammonium perchlorate (a rocket fuel) is handled in bulk. Of 160 ground water samples analyzed from 27 locations in the state of Kerala, 58% had perchlorate above the detection limit (2 μg/L); the highest concentration observed, 7270 μg/L at Ernakulam district, is ~480 times the USEPA drinking water equivalent level (15 μg/L). Perchlorate was detected in all surface water samples analyzed (n = 10), with the highest value, 355 μg/L, observed in the Periyar river (a major river in the State). The bottled drinking waters tested (n = 5) were free of perchlorate. The present study underlines the need for frequent screening of water sources for perchlorate around places where the chemical is handled in bulk, which will help to avoid human exposure to high levels of perchlorate.

Waterborne outbreaks in the Nordic countries, 1998 to 2012

Guzman-Herrador, B., et al., Eurosurveillance, 20(24), June 2015

A total of 175 waterborne outbreaks affecting 85,995 individuals were notified to the national outbreak surveillance systems in Denmark, Finland and Norway from 1998 to 2012, and in Sweden from 1998 to 2011. Between 4 and 18 outbreaks were reported each year during this period. Outbreaks occurred throughout the countries in all seasons, but were most common (n = 75/169, 44%) between June and August. Viruses belonging to the Caliciviridae family and Campylobacter were the pathogens most frequently involved, comprising n = 51 (41%) and n = 36 (29%) of all 123 outbreaks with known aetiology, respectively. Although only a few outbreaks were caused by parasites (Giardia and/or Cryptosporidium), they accounted for the largest outbreaks reported during the study period, affecting up to 53,000 persons. Most outbreaks, 124 (76%) of those with a known water source (n = 163), were linked to groundwater. A large proportion of the outbreaks (n = 130/170, 76%) affected a small number of people (fewer than 100 per outbreak) and were linked to single-household water supplies. However, in 11 (6%) of the outbreaks, more than 1,000 people became ill. Although outbreaks of this size are rare, they highlight the need for increased awareness, particularly of parasites, correct water treatment regimens, and vigilant management and maintenance of the water supply and distribution systems.

Microbial Health Risks of Regulated Drinking Waters in the United States — A Comparative Microbial Safety Assessment of Public Water Supplies and Bottled Water

Edberg, S.C., Topics in Public Health, June 2015

The quality of drinking water in the United States (U.S.) is extensively monitored and regulated by federal, state and local agencies, yet there is increasing public concern and confusion about the safety and quality of drinking water –– both from public water systems and from bottled water products. In the U.S., tap water and bottled water are regulated by two different agencies: the Environmental Protection Agency (EPA) regulates public water system water (tap water) and the Food and Drug Administration (FDA) regulates bottled water. Federal law requires that the FDA’s regulations for bottled water must be at least as protective of public health as EPA standards for tap water.

Performance Evaluation of an Italian Reference Method, the ISO Reference Method and a Chromogenic Rapid Method for the Detection of E. coli and Coliforms in Bottled Water

Di Pasquale, S. and De Medici, D., Food Analytical Methods, 8(10):2417-2426, April 2015

Bottled water can be contaminated by coliforms and/or Escherichia coli (E. coli). These bacteria are considered indicators of faecal pollution, and their detection in bottled water indicates potential contamination by pathogenic enteric microorganisms. In recent decades, different methods have been developed for the detection of coliforms and E. coli in drinking water and in bottled water, including mineral water. Since 1976, Italian regulation has defined microbiological methods to evaluate the microbiological characteristics of mineral waters. Three different methods for the detection of coliforms and E. coli in bottled water were compared in this study: the Italian reference method according to the “Italian Ministerial Rule,” the ISO 9308-1:2002 method, and a new rapid method. The results demonstrated that the ISO 9308-1:2002 method and the new rapid method are as sensitive and specific as the Italian reference method, and that both could be used to evaluate coliform and E. coli contamination in drinking water and in bottled water, including mineral water.

Microbial diversity and dynamics of a groundwater and a still bottled natural mineral water

Franca, L., et al., Environmental Microbiology, 17(3):577-593, March 2015

The microbial abundance and diversity at the source, after bottling, and through 6 months of storage of a commercial still natural mineral water were assessed by culture-dependent and culture-independent methods. The results revealed clear shifts in the dominant communities present at the three different stages. The borehole waters displayed low cell densities that increased 1.5-fold upon bottling and storage, reaching a maximum (6.2 × 10⁸ cells l⁻¹) within 15 days after bottling, but diversity decreased significantly. In all cases, communities were largely dominated by Bacteria. The culturable heterotrophic community was characterized by recovering 3626 isolates, primarily affiliated with the Alphaproteobacteria, Betaproteobacteria and Gammaproteobacteria. This study indicates that bottling and storage induce quantitative and qualitative changes in the microbial assemblages, and these changes appeared similar in the two sample batches collected in 2 consecutive years. To our knowledge, this is the first study combining culture-independent with culture-dependent methods, and repeated tests, to reveal the microbial dynamics occurring from source to stored bottled water.

Molecular detection of Helicobacter pylori in a large Mediterranean river, by direct viable count fluorescent in situ hybridization (DVC-FISH)

Tirodimos, L., et al., Journal of Water and Health, 12(4):868-873, December 2014

Although the precise route and mode of transmission of Helicobacter pylori are still unclear, molecular methods have been applied for the detection of H. pylori in environmental samples. In this study, we used the direct viable count fluorescent in situ hybridization (DVC-FISH) method to detect viable cells of H. pylori in the River Aliakmon, Greece. This is the longest river in Greece, and provides potable water in metropolitan areas. H. pylori showed positive detection for 23 out of 48 water samples (47.9%), while no seasonal variation was found and no correlation was observed between the presence of H. pylori and indicators of fecal contamination. Our findings strengthen the evidence that H. pylori is waterborne while its presence adds to the potential health hazards of the River Aliakmon.

Chromium in drinking water: association with biomarkers of exposure and effect

Sazakli, E., et al., International Journal of Environmental Research and Public Health, 11(10):10125-10145, October 2014

An epidemiological cross-sectional study was conducted in Greece to investigate health outcomes associated with long-term exposure to chromium via drinking water. The study population consisted of 304 participants. Socio-demographics, lifestyle, drinking water intake, dietary habits, occupational and medical history data were recorded through a personal interview. Physical examination and a motor test were carried out on the individuals. Total chromium concentrations were measured in blood and hair of the study subjects. Hematological, biochemical and inflammatory parameters were determined in blood. Chromium in drinking water ranged from <0.5 to 90 μg·L⁻¹ in all samples but one (220 μg·L⁻¹), with a median concentration of 21.2 μg·L⁻¹. Chromium levels in blood (median 0.32 μg·L⁻¹, range <0.18-0.92 μg·L⁻¹) and hair (median 0.22 μg·g⁻¹, range 0.03-1.26 μg·g⁻¹) were within the “normal range” according to the literature. Personal lifetime chromium exposure dose via drinking water, calculated from the results of the water analyses and the questionnaire data, showed associations with blood and hair chromium levels and certain hematological and biochemical parameters. Membership in groups of subjects whose hematological or biochemical parameters were outside the normal range was not correlated with chromium exposure dose, except for the groups with high triglycerides or low sodium. Motor impairment score was not associated with exposure to chromium.
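
A "personal lifetime exposure dose via drinking water" of this kind is, in essence, a time-weighted sum of concentration × daily intake over a subject's residence history. A minimal sketch of that bookkeeping follows; the function shape and the example residence periods are illustrative assumptions, not the authors' protocol:

```python
# Sketch: lifetime ingested dose as a sum over residence/exposure periods.
#   dose (mg) = sum_i  C_i (ug/L) * intake_i (L/day) * 365 * years_i / 1000

def lifetime_dose_mg(periods):
    """periods: iterable of (chromium_ug_per_l, litres_per_day, years)."""
    return sum(c * litres * 365 * years for c, litres, years in periods) / 1000.0

# Hypothetical subject: 20 years at the study's median of 21.2 ug/L drinking
# 1.5 L/day, then 10 years at 5 ug/L drinking 2 L/day.
dose = lifetime_dose_mg([(21.2, 1.5, 20), (5.0, 2.0, 10)])
print(f"Lifetime ingested chromium: {dose:.1f} mg")
```

Estimates of this kind are what the questionnaire data (intake, residence duration) and the water analyses (concentration) feed into before being correlated with blood and hair levels.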

Naegleria fowleri: An emerging drinking water pathogen

Bartrand, T., et al., American Water Works Association Journal, 106(10):418-432, October 2014

Naegleria fowleri (N. fowleri) is a free-living, trophic amoeba that is nearly ubiquitous in the environment and can be present in high numbers in warm waters. It is the causative agent of primary amoebic meningoencephalitis (PAM), a rare but particularly lethal disease with a very low survival rate. Although N. fowleri was isolated from drinking water supplies in Australia in the 1980s, it was not considered a drinking water threat in the United States until recent cases were associated with a groundwater system in Arizona and surface water systems in Louisiana. N. fowleri in drinking water treatment and distribution systems can be managed using disinfectant concentrations typically encountered in well-run plants, although nitrification and attendant low disinfectant residuals may pose a challenge for some systems. The greatest challenge for N. fowleri control is in premise plumbing systems, where conditions are largely outside the control of utilities, residuals might be low or nonexistent, and water temperatures could be high enough to support rapid growth of the amoebae. This article reviews published studies describing the environmental occurrence, survival, pathogenicity, and disinfection of N. fowleri. In addition, this article provides information about this little known and poorly understood organism with respect to its occurrence in the environment; how the amoeba amplifies in water systems such that it can cause infection; how N. fowleri has been successfully controlled for decades in Australian water systems through treatment and distribution system management; and the knowledge gaps and information needed to address N. fowleri as an emerging pathogen in US water supplies.

Emerging Trends in Groundwater Pollution and Quality

Kurwadkar, S., Water Environment Research, 86(10):1677-1691, October 2014

Groundwater pollution due to anthropogenic activities may impact overall groundwater quality. Organic and inorganic pollutants have been routinely detected at unsafe levels in groundwater, rendering this important drinking water resource practically unusable. The vulnerability of groundwater to pollution, and the subsequent impacts, have been documented in various studies across the globe. Field studies as well as mathematical models have demonstrated increasing levels of pollutants in both shallow and deep aquifer systems. New emerging pollutants such as organic micro-pollutants have also been detected in industrialized as well as developing countries. Increased vulnerability coupled with ever-growing demand for groundwater may pose a greater threat of pollution due to induced recharge and the lack of environmental safeguards to protect groundwater sources. This review paper documents a comprehensive assessment of groundwater quality impacts due to human activities, such as improper management of organic and inorganic waste, and due to natural sources. A detailed review of published reports and peer-reviewed journal papers from across the world clearly demonstrates that groundwater quality is declining over time. A proactive approach is needed to prevent human health and ecological consequences due to ingestion of contaminated groundwater.

Evaluation of long-term (1960-2010) groundwater fluoride contamination in Texas

Chaudhuri, S. and Ale, S., Journal of Environmental Quality, 43(4):1404-1416, August 2014

Groundwater quality degradation is a major threat to sustainable development in Texas. The aim of this study was to elucidate spatiotemporal patterns of groundwater fluoride (F) contamination in different water use classes in 16 groundwater management areas in Texas between 1960 and 2010. Groundwater F concentration data were obtained from the Texas Water Development Board and aggregated over a decadal scale. Our results indicate that observations exceeding the World Health Organization drinking water quality threshold (1.5 mg F L⁻¹) and the USEPA secondary maximum contaminant level (SMCL; 2 mg F L⁻¹) increased from 26 and 19% in the 1960s to 37 and 23%, respectively, in the 2000s. In the 2000s, F observations > SMCL among different water use classes followed the order: irrigation (39%) > domestic (20%) > public supply (17%). The extent and mode of interaction between F and other water quality parameters varied regionally. In western Texas, high F concentrations were prevalent at shallower depths (<50 m) and were positively correlated with bicarbonate (HCO₃) and sulfate anions. In contrast, in southern and southeastern Texas, higher F concentrations occurred at greater depths (>50 m) and were correlated with HCO₃ and chloride anions. A spatial pattern has become apparent, marked by “excess” F in western Texas groundwaters as compared with “inadequate” F contents in the rest of the state. Groundwater F contamination in western Texas was largely influenced by groundwater mixing and evaporative enrichment, as compared with water-rock interaction and mineral dissolution in the rest of the state.

Ground water contamination with (238)U, (234)U, (235)U, (226)Ra and (210)Pb from past uranium mining: cove wash, Arizona

da Cunha, K.M.D., et al., Environmental Geochemistry and Health, 36(3):477-487, June 2014

The objectives of the study are to present a critical review of the (238)U, (234)U, (235)U, (226)Ra and (210)Pb levels in water samples from the EPA studies (U.S. EPA: Abandoned uranium mines and the Navajo Nation: Red Valley chapter screening assessment report, Region 9 Superfund Program, San Francisco, 2004; Abandoned uranium mines and the Navajo Nation: Northern AUM region screening assessment report, Region 9 Superfund Program, San Francisco, 2006; Health and environmental impacts of uranium contamination, 5-year plan, Region 9 Superfund Program, San Francisco, 2008) and a dose assessment for the population due to ingestion of water containing (238)U and (234)U. The water quality data were taken from the “Data analysis” section of the published report titled Abandoned Uranium Mines Project Arizona, New Mexico, Utah-Navajo Lands 1994-2000, Project Atlas. Total uranium concentration was above the maximum concentration level for drinking water (7.4 × 10(-1) Bq/L) in 19% of the water samples, while (238)U and (234)U concentrations were above it in 14 and 17% of the water samples, respectively. (226)Ra and (210)Pb concentrations in water samples were in the ranges of 3.7 × 10(-1) to 5.55 × 10(2) Bq/L and 1.11 to 4.33 × 10(2) Bq/L, respectively. For only two samples did the (226)Ra concentrations exceed the MCL for total Ra in drinking water (0.185 Bq/L). However, the (210)Pb/(226)Ra ratios varied from 0.11 to 47.00, and ratios above 1.00 were observed in 71% of the samples. Secular equilibrium of the natural uranium series was not observed in the data record for most of the water samples. Moreover, the (235)U/(total)U mass ratios ranged from 0.06 to 5.9%, and the natural mass ratio of (235)U to (total)U (0.72%) was observed in only 16% of the water samples; ratios above or below the natural ratio could not be explained based on the data reported by U.S. EPA.
In addition, statistical evaluations showed no correlations among the distributions of the radionuclide concentrations in the majority of the water samples, indicating that more than one source of contamination could contribute to the sampled sources. The effective doses due to ingestion of even the minimum uranium concentrations in the water samples exceed the average dose from inhalation and ingestion of a regular diet for other populations around the world (1 μSv/year). The maximum doses due to ingestion of (238)U or (234)U were above the international limit on effective dose for members of the public (1 mSv/year), except for inhabitants of two chapters. The highest effective dose was estimated for inhabitants of Cove, and it was almost 20 times the international limit for members of the public. These results indicate that ingestion of water from some of the sampled sources poses health risks.
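
Ingestion dose estimates in assessments like this follow a simple chain: concentration × annual water intake × radionuclide-specific dose coefficient. A generic sketch, assuming ICRP Publication 72 adult ingestion dose coefficients and a nominal 2 L/day intake (the abstract does not give the authors' exact parameters):

```python
# Sketch: annual effective dose from radionuclides ingested in drinking water.
#   dose (Sv/yr) = C (Bq/L) * intake (L/yr) * dose coefficient (Sv/Bq)

# ICRP Publication 72 ingestion dose coefficients for adults (Sv/Bq).
DOSE_COEFF_SV_PER_BQ = {"U-238": 4.5e-8, "U-234": 4.9e-8}

INTAKE_L_PER_YEAR = 2.0 * 365  # assumed nominal consumption of 2 L/day

def annual_dose_usv(concentrations_bq_per_l):
    """Annual effective dose (microsieverts) for the given concentrations."""
    dose_sv = sum(c * INTAKE_L_PER_YEAR * DOSE_COEFF_SV_PER_BQ[nuclide]
                  for nuclide, c in concentrations_bq_per_l.items())
    return dose_sv * 1e6

# Hypothetical example: water at the total-uranium maximum concentration level
# cited above (0.74 Bq/L), attributed here entirely to U-238.
print(f"{annual_dose_usv({'U-238': 0.74}):.1f} uSv/yr")
```

Under these assumptions the example yields roughly 24 μSv/year, consistent with the abstract's observation that even minimum measured concentrations exceed the ~1 μSv/year typical of a regular diet elsewhere.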

Contamination of Groundwater Systems in the US and Canada by Enteric Pathogens, 1990–2013: A Review and Pooled-Analysis

Hynds, P.D., Thomas, M.K. and Pintar, K.D.M., PLOS ONE, 9(5):e93301, May 2014

A combined review and pooled-analysis approach was used to investigate groundwater contamination in Canada and the US from 1990 to 2013; fifty-five studies met eligibility criteria. Four study types were identified. It was found that study location affects study design, sample rate and studied pathogen category. Approximately 15% (316/2210) of samples from Canadian and US groundwater sources were positive for enteric pathogens, with no difference observed based on system type. Knowledge gaps exist, particularly in exposure assessment for attributing disease to groundwater supplies. Furthermore, there is a lack of consistency in risk factor reporting (local hydrogeology, well type, well use, etc.). The widespread use of fecal indicator organisms in reported studies does not inform the assessment of human health risks associated with groundwater supplies. This review illustrates how groundwater study design and location are critical for subsequent data interpretation and use. Knowledge gaps exist related to data on bacterial, viral and protozoan pathogen prevalence in Canadian and US groundwater systems, as well as a need for standardized approaches for reporting study design and results. Fecal indicators are examined as a surrogate for health risk assessments; caution is advised in their widespread use. Study findings may be useful during suspected waterborne outbreaks linked with a groundwater supply to identify the likely etiological agent and potential transport pathway.

Large Outbreak of Cryptosporidium hominis Infection Transmitted through the Public Water Supply, Sweden

Widerström, M., et al., Emerging Infectious Diseases, 20(4), April 2014

In November 2010, ≈27,000 (≈45%) inhabitants of Östersund, Sweden, were affected by a waterborne outbreak of cryptosporidiosis. The outbreak was characterized by a rapid onset and high attack rate, especially among young and middle-aged persons. Young age, number of infected family members, amount of water consumed daily, and gluten intolerance were identified as risk factors for acquiring cryptosporidiosis. Also, chronic intestinal disease and young age were significantly associated with prolonged diarrhea. Identification of Cryptosporidium hominis subtype IbA10G2 in human and environmental samples and consistently low numbers of oocysts in drinking water confirmed insufficient reduction of parasites by the municipal water treatment plant. The current outbreak shows that use of inadequate microbial barriers at water treatment plants can have serious consequences for public health. This risk can be minimized by optimizing control of raw water quality and employing multiple barriers that remove or inactivate all groups of pathogens.

Microbial Contamination Detection in Water Resources: Interest of Current Optical Methods, Trends and Needs in the Context of Climate Change

Jung, A.V., et al., International Journal of Environmental Research and Public Health, 11(4), 4292-4310, April 2014

Microbial pollution in aquatic environments is one of the crucial issues with regard to the sanitary state of water bodies used for drinking water supply, recreational activities and harvesting seafood, due to potential contamination by pathogenic bacteria, protozoa or viruses. To address this risk, microbial contamination monitoring is usually assessed by turbidity measurements performed at drinking water plants. Some recent studies have shown significant correlations between microbial contamination and the risk of endemic gastroenteritis. However, the relevance of turbidimetry may be limited, since the presence of colloids in water creates interference with the nephelometric response. Thus there is a need for a more relevant, simple and fast indicator of microbial contamination in water, especially in the perspective of climate change and the increase of heavy rainfall events. This review focuses, on the one hand, on the sources, fate and behavior of microorganisms in water and the factors influencing pathogens’ presence, transport and mobilization, and, on the other hand, on the existing optical methods used for monitoring microbiological risks. Finally, this paper proposes new avenues of research.

Assessing Exposure and Health Consequences of Chemicals in Drinking Water: Current State of Knowledge and Research Needs

Villanueva, C.M., et al., Environmental Health Perspectives, 122(3):213-221, March 2014

Safe drinking water is essential for well-being. Although microbiological contamination remains the largest cause of water-related morbidity and mortality globally, chemicals in water supplies may also cause disease, and evidence of the human health consequences is limited or lacking for many of them. We aimed to summarize the state of knowledge, identify gaps in understanding, and provide recommendations for epidemiological research relating to chemicals occurring in drinking water. Assessing exposure and the health consequences of chemicals in drinking water is challenging. Exposures are typically at low concentrations, measurements in water are frequently insufficient, chemicals are present in mixtures, exposure periods are usually long, multiple exposure routes may be involved, and valid biomarkers reflecting the relevant exposure period are scarce. In addition, the magnitude of the relative risks tends to be small. Research should include well-designed epidemiological studies covering regions with contrasting contaminant levels and sufficient sample size; comprehensive evaluation of contaminant occurrence in combination with bioassays integrating the effect of complex mixtures; sufficient numbers of measurements in water to evaluate geographical and temporal variability; detailed information on personal habits resulting in exposure (e.g., ingestion, showering, swimming, diet); collection of biological samples to measure relevant biomarkers; and advanced statistical models to estimate exposure and relative risks, considering methods to address measurement error. Last, the incorporation of molecular markers of early biological effects and genetic susceptibility is essential to understand the mechanisms of action. There is a particular knowledge gap and need to evaluate human exposure and the risks of a wide range of emerging contaminants.

Spatial analysis of boil water advisories issued during an extreme weather event in the Hudson River Watershed, USA

Vedachalam, S., et al., Applied Geography, 48:112-121, March 2014

Water infrastructure in the United States is aging and vulnerable to extreme weather. In August 2011, Tropical Storm Irene hit the eastern part of New York and surrounding states, causing great damage to public drinking water systems. Several water supply districts issued boil water advisories (BWAs) to their customers as a result of the storm. This study seeks to identify the major factors that lead water supply systems to issue BWAs by assessing watershed characteristics, water supply system characteristics and treatment plant parameters of water districts in the Mohawk-Hudson River watershed in New York. A logistic regression model suggests that the probability of a BWA being issued by a water supply district is increased by higher precipitation during the storm, a high density of septic systems, lack of recent maintenance and low population density. Interviews with water treatment plant operators suggested that physical damage to water distribution systems was the main cause of boil water advisories during storms. BWAs result in additional costs to residents and communities, and public compliance with the advisory instructions is low, so efforts must be made to minimize their occurrence. Prior investments in infrastructure management can proactively address municipal water supply and quality issues.

Epidemiology and estimated costs of a large waterborne outbreak of norovirus infection in Sweden

Larsson, C., et al., Epidemiology and Infection, 142(3):592-600, March 2014

A large outbreak of norovirus (NoV) gastroenteritis caused by contaminated municipal drinking water occurred in Lilla Edet, Sweden, in 2008. Epidemiological investigations performed using a questionnaire survey showed an association between consumption of municipal drinking water and illness (odds ratio 4.73, 95% confidence interval 3.53-6.32), and a strong correlation between the risk of being sick and the number of glasses of municipal water consumed. Diverse NoV strains were detected in stool samples from patients, with NoV genotype I strains predominating. Although NoVs were not detected in water samples, coliphages were identified as a marker of viral contamination. About 2,400 (18.5%) of the 13,000 inhabitants in Lilla Edet became ill. Costs associated with the outbreak were collected via a questionnaire survey given to organizations and municipalities involved in or affected by the outbreak. Total costs, including sick leave, were estimated to be ∼8,700,000 Swedish kronor (∼€0.87 million).

Methyl Tertiary Butyl Ether (MTBE) and Other Volatile Organic Compounds (VOCs) in Public Water Systems, Private Wells, and Ambient Groundwater Wells in New Jersey Compared to Regulatory and Human-Health Benchmarks

Williams, P.R.D., Environmental Forensics, Volume 15, Issue 1, February 2014

Potential threats to drinking water and water quality continue to be a major concern in many regions of the United States. New Jersey, in particular, has been at the forefront of assessing and managing potential contamination of its drinking water supplies from hazardous substances. The purpose of the current analysis is to provide an up-to-date evaluation of the occurrence and detected concentrations of methyl tertiary butyl ether (MTBE) and several other volatile organic compounds (VOCs) in public water systems, private wells, and ambient groundwater wells in New Jersey based on the best available data, and to put these results into context with federal and state regulatory and human-health benchmarks. Analyses are based on the following three databases that contain water quality monitoring data for New Jersey: Safe Drinking Water Information System (SDWIS), Private Well Testing Act (PWTA), and National Water Information System (NWIS). For public water systems served by groundwater in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 30 (2%), 21 (1.4%), and five (0.3%) of sampled systems from 1997 to 2011, respectively. For private wells in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 385 (0.5%), 183 (0.2%), and 46 (0.05%) of sampled wells from 2001 to 2011, respectively. For ambient groundwater wells in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 14 (2.1%), 9 (1.3%), and 4 (0.6%) of sampled wells from 1993 to 2012, respectively. Average detected concentrations of MTBE, as well as detected concentrations at upper-end percentiles, were less than corresponding benchmarks for all three datasets. 
The available data show that MTBE is rarely detected in various source waters in New Jersey at a concentration that exceeds the State’s health-based drinking water standard or other published benchmarks, and there is no evidence of an increasing trend in the detection frequency of MTBE. Other VOCs, such as tetrachloroethylene (PCE), trichloroethylene (TCE), and benzene, are detected more often above corresponding regulatory or human-health benchmarks due to their higher detected concentrations in water and/or greater toxicity values. The current analysis provides useful data for evaluating the nature and extent of historical and current contamination of water supplies in New Jersey and potential opportunities for public exposures and health risks due to MTBE and other VOCs on a statewide basis. Additional forensic or forecasting analyses are required to identify the sources or timing of releases of individual contaminants at specific locations or to predict potential future water contamination in New Jersey.

Widespread Molecular Detection of Legionella pneumophila Serogroup 1 in Cold Water Taps across the United States

Donohue, M.J., Environmental Science and Technology, 48 (6), pp 3145–3152, February 2014

In the United States, 6,868 cases of legionellosis were reported to the Centers for Disease Control and Prevention in 2009–2010. Of these reports, it is estimated that 84% were caused by the microorganism Legionella pneumophila Serogroup (Sg) 1. Legionella spp. have been isolated and recovered from a variety of natural freshwater environments. Human exposure to L. pneumophila Sg1 may occur through aerosolization and subsequent inhalation of household and facility water. In this study, two primer/probe sets (one able to detect L. pneumophila and the other L. pneumophila Sg1) were determined to be highly sensitive and selective for their respective targets. In total, 272 water samples, collected in 2009 and 2010 from 68 public and private water taps across the United States, were analyzed using the two qPCR assays to evaluate the incidence of L. pneumophila Sg1. Nearly half of the taps showed the presence of L. pneumophila Sg1 in one sampling event, and 16% of taps were positive in more than one sampling event. This study is the first United States survey to document the occurrence and colonization of L. pneumophila Sg1 in cold water delivered from point-of-use taps.

Perspectives on drinking water monitoring for small scale water systems

Roig, B., Baures, E., Thomas, O., Water Science & Technology: Water Supply, Vol. 14, Issue 1, p. 1, January 2014

Drinking water (DW) is increasingly subject to environmental and human threats that alter the quality of the resource and, potentially, of the distributed water. These threats can be both biological and chemical in nature, and are often combined. The expansion of the technical framework for water quality monitoring, following the evolution of water quality standards, generally guarantees regulatory compliance, but is not sufficient to assess the efficiency of small-scale water systems. Existing monitoring is not well suited to ensuring good quality of distributed water, especially in the event of a sudden change in quality. This article proposes alternative solutions, drawn from an examination of monitoring practices, in a bid to limit the risk of deterioration of DW quality.

Drinking Water Microbial Myths

Martin, J.A., et al., Critical Reviews in Microbiology, November 2013

Accounts of drinking water-borne disease outbreaks have always captured the interest of the public, elected and health officials, and the media. During the twentieth century, the drinking water community and public health organizations endeavored to craft regulations and guidelines on treatment and management practices that reduce risks from drinking water, specifically human pathogens. During this period there also evolved misunderstandings as to the potential health risks associated with microorganisms that may be present in drinking waters. These misunderstandings, or “myths”, have led to confusion among the many stakeholders. The purpose of this article is to provide a scientifically and clinically based discussion of these “myths” and recommendations for better ensuring the microbial safety of drinking water and valid public health decisions.

Assessing the impact of chlorinated-solvent sites on metropolitan groundwater resources

Brusseau, M.L. and Narter, M., Ground Water, November 2013

Chlorinated-solvent compounds are among the most common groundwater contaminants in the United States. A majority of the many sites contaminated by chlorinated-solvent compounds are located in metropolitan areas, and most such areas have one or more chlorinated-solvent contaminated sites. Thus, contamination of groundwater by chlorinated-solvent compounds may pose a potential risk to the sustainability of potable water supplies for many metropolitan areas. The impact of chlorinated-solvent sites on metropolitan water resources was assessed for Tucson, Arizona, by comparing the aggregate volume of extracted groundwater for all pump-and-treat systems associated with contaminated sites in the region to the total regional groundwater withdrawal. The analysis revealed that the aggregate volume of groundwater withdrawn for the pump-and-treat systems operating in Tucson, all of which are located at chlorinated-solvent contaminated sites, was 20% of the total groundwater withdrawal in the city for the study period. The treated groundwater was used primarily for direct delivery to local water supply systems or for reinjection as part of the pump-and-treat system. The volume of the treated groundwater used for potable water represented approximately 13% of the total potable water supply sourced from groundwater, and approximately 6% of the total potable water supply. This case study illustrates the significant impact chlorinated-solvent contaminated sites can have on groundwater resources and regional potable water supplies.

Radon-contaminated drinking water from private wells: an environmental health assessment examining a rural Colorado mountain community’s exposure

Cappello, M.A., et al., Journal of Environmental Health, November 2013

In the study discussed in this article, 27 private drinking water wells located in a rural Colorado mountain community were sampled for radon contamination and compared against (a) the U.S. Environmental Protection Agency’s (U.S. EPA’s) proposed maximum contaminant level (MCL), (b) the U.S. EPA’s proposed alternative maximum contaminant level (AMCL), and (c) the average radon level measured in the local municipal drinking water system. The data from the authors’ study found that 100% of the wells within the study population had radon levels in excess of the U.S. EPA MCL, 37% were in excess of the U.S. EPA AMCL, and 100% of wells had radon levels greater than that found in the local municipal drinking water system. Radon contamination in one well was found to be 715 times greater than the U.S. EPA MCL, 54 times greater than the U.S. EPA AMCL, and 36,983 times greater than that found in the local municipal drinking water system. According to the research data and the reviewed literature, the results indicate that this population has a unique and elevated contamination profile, and suggest that radon-contaminated drinking water from private wells can present a significant public health concern.

Microbial Health Risks of Regulated Drinking Water in the United States

Edberg, S.C., DWRF, September 2013

Drinking water regulations are designed to protect the public health. In the United States, the Environmental Protection Agency (EPA) is tasked with developing and maintaining drinking water regulations for the 276,607,387 people served by the country’s 54,293 community water systems. The Food and Drug Administration (FDA) regulates bottled water as a food product. By federal law, the FDA’s regulations for bottled water must be at least as protective of public health as the EPA’s regulations for public water system drinking water. Despite many similarities in EPA and FDA regulations, consumer perception regarding the safety of drinking waters varies widely. This paper examines and compares the microbial health risks of tap water and bottled water, specifically examining differences in quality monitoring, regulatory standards violations, advisories, and distribution system conditions. It also includes comparison data on the number of waterborne illness outbreaks caused by both tap and bottled water. Based on a review of existing research, it is clear that as a consequence of the differences in regulations, distribution systems, operating (manufacturing) practices, and microbial standards of quality, public drinking water supplies present a substantially higher human risk than do bottled waters for illness due to waterborne organisms.

The mineral content of tap water in United States households

Patterson, K.Y., et al., Journal of Food Composition and Analysis, August 2013

The composition of tap water contributes to dietary intake of minerals. The Nutrient Data Laboratory (NDL) of the United States Department of Agriculture (USDA) conducted a study of the mineral content of residential tap water, to generate current data for the USDA National Nutrient Database. Sodium, potassium, calcium, magnesium, iron, copper, manganese, phosphorus, and zinc content of drinking water were determined in a nationally representative sampling. The statistically designed sampling method identified 144 locations for water collection in winter and spring from home taps. Assuming a daily consumption of 1 L of tap water, only four minerals (Cu, Ca, Mg, and Na), on average, provided more than 1% of the US dietary reference intake. Significant decreases in calcium were observed with chemical water softeners, and between seasonal pickups for Mg and Ca. The variance of sodium was significantly different among regions (p < 0.05), but no differences were observed as a result of collection time, water source or treatment. Based on the weighted mixed model results, there were no significant differences in overall mineral content between municipal and well water. These results, which form a nationally representative dataset of mineral values for drinking water available from home taps, provide valuable additional information for the assessment of dietary mineral intake.
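
The “more than 1% of the dietary reference intake” screen described above is simple arithmetic: multiply the tap-water concentration by daily intake and divide by the dietary reference intake (DRI). A minimal sketch, using illustrative concentrations and DRI values rather than the study’s measurements:

```python
# Percent of the dietary reference intake (DRI) supplied by 1 L/day of
# tap water; all numbers below are illustrative examples only.
dri_mg = {"Ca": 1000.0, "Mg": 400.0, "Na": 1500.0, "Cu": 0.9}   # mg/day
tap_mg_per_l = {"Ca": 30.0, "Mg": 8.0, "Na": 20.0, "Cu": 0.05}  # mg/L

litres_per_day = 1.0
pct_of_dri = {m: 100 * tap_mg_per_l[m] * litres_per_day / dri_mg[m]
              for m in dri_mg}
above_1pct = sorted(m for m, v in pct_of_dri.items() if v > 1.0)
print(pct_of_dri)
print(above_1pct)  # → ['Ca', 'Cu', 'Mg', 'Na']
```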

Quantitative analysis of microbial contamination in private drinking water supply systems

Allevi, R.P., et al., Journal of Water and Health, June 2013

Over one million households rely on private water supplies (e.g. well, spring, cistern) in the Commonwealth of Virginia, USA. The present study tested 538 private wells and springs in 20 Virginia counties for total coliforms (TCs) and Escherichia coli along with a suite of chemical contaminants. A logistic regression analysis was used to investigate potential correlations between TC contamination and chemical parameters (e.g. NO3(-), turbidity), as well as homeowner-provided survey data describing system characteristics and perceived water quality. Of the 538 samples collected, 41% (n = 221) were positive for TCs and 10% (n = 53) for E. coli. Chemical parameters were not statistically predictive of microbial contamination. Well depth, water treatment, and farm location proximate to the water supply were factors in a regression model that predicted presence/absence of TCs with 74% accuracy. Microbial and chemical source tracking techniques (Bacteroides gene Bac32F and HF183 detection via polymerase chain reaction and optical brightener detection via fluorometry) identified four samples as likely contaminated with human wastewater.
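
The regression step described above can be sketched in a few lines. Everything below is hypothetical: the predictor names, coefficients and data are simulated stand-ins for the survey variables, and the model is fit by plain gradient descent rather than the authors’ statistical software.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical predictors: well depth (standardized), presence of any
# treatment device (0/1), farm located near the water supply (0/1).
depth = rng.normal(0.0, 1.0, n)
treated = rng.integers(0, 2, n).astype(float)
farm = rng.integers(0, 2, n).astype(float)

# Simulated truth: shallow, untreated wells near farms are more often
# positive for total coliforms (TCs).
logit = 0.2 - 1.0 * depth - 0.8 * treated + 1.0 * farm
tc_positive = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit logistic regression by gradient descent on the mean log-loss.
X = np.column_stack([np.ones(n), depth, treated, farm])
beta = np.zeros(4)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta -= 0.5 * X.T @ (p - tc_positive) / n

# Presence/absence accuracy, analogous to the study's 74% figure.
accuracy = np.mean(((X @ beta) > 0) == (tc_positive == 1))
print(f"presence/absence accuracy: {accuracy:.2f}")
```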

Strontium Concentrations in Corrosion Products from Residential Drinking Water Distribution Systems

Gerke et al., Environmental Science and Technology, April 22, 2013

The United States Environmental Protection Agency (US EPA) will require some U.S. drinking water distribution systems (DWDS) to monitor nonradioactive strontium (Sr2+) in drinking water in 2013. Iron corrosion products from four DWDS were examined to assess the potential for Sr2+ binding and release. Average Sr2+ concentrations in the outermost layer of the corrosion products ranged from 3 to 54 mg kg–1, and the Sr2+ drinking water concentrations were all ≤0.3 mg L–1. Micro-X-ray absorption near edge structure spectroscopy and linear combination fitting determined that Sr2+ was principally associated with CaCO3. Sr2+ was also detected as a surface complex associated with α-FeOOH. Iron particulates deposited on a filter inside a home had an average Sr2+ concentration of 40.3 mg kg–1, and the associated drinking water at a tap was 210 μg L–1. The data suggest that elevated Sr2+ concentrations may be associated with iron corrosion products that, if disturbed, could increase Sr2+ concentrations above the 0.3 μg L–1 US EPA reporting threshold. Disassociation of very small particulates could result in drinking water Sr2+ concentrations that exceed the US EPA health reference limit (4.20 mg kg–1 body weight).

Evaluating violations of drinking water regulations

Rubin, S.J., Journal, American Water Works Association, March 2013

US Environmental Protection Agency data were analyzed for violations by community water systems (CWSs). Several characteristics were evaluated, including size, source water, and violation type. The data show that: (1) 55% of CWSs, collectively serving more than 95 million people, violated at least one regulation under the Safe Drinking Water Act; (2) the presence of violations was no different for groundwater and surface water systems; (3) fewer than 20% of CWSs with violations exceeded an allowable level of a contaminant in drinking water; (4) smaller water systems are no more likely than larger systems, except very large systems, to violate health-related requirements; and (5) smaller CWSs appear more likely than larger systems to violate monitoring, reporting, and notification requirements. An evaluation was also conducted of four contaminants that had health-related violations by more than 1% of CWSs: total coliform, stage 1 disinfection by-products, arsenic, and lead and copper.

Lead (Pb) quantification in potable water samples: implications for regulatory compliance and assessment of human exposure

Triantafyllidou, S., et al., Environmental Monitoring and Assessment, February 2013

Assessing the health risk from lead (Pb) in potable water requires accurate quantification of the Pb concentration. Under worst-case scenarios of highly contaminated water samples, representative of public health concerns, up to 71-98% of the total Pb was not quantified if water samples were not mixed thoroughly after standard preservation (i.e., addition of 0.15% (v/v) HNO3). Thorough mixing after standard preservation improved recovery in all samples, but 35-81% of the total Pb still went unquantified in some samples. Transfer of samples from one bottle to another also created high errors (40-100% of the total Pb was unquantified in transferred samples). Although the United States Environmental Protection Agency’s standard protocol avoids most of these errors, certain methods considered EPA-equivalent allow these errors for regulatory compliance sampling. Moreover, routine monitoring for assessment of human Pb exposure in the USA has no standardized protocols for water sample handling and pre-treatment. Overall, while there is no reason to believe that sample handling and pre-treatment dramatically skew regulatory compliance with the US Pb action level, slight variations from one approved protocol to another may cause Pb-in-water health risks to be significantly underestimated, especially in unusual situations of “worst case” individual exposure to highly contaminated water.

The need for congressional action to finance arsenic reductions in drinking water

Levine, R.L., Journal of Environmental Health, November 2012

Many public water systems in the U.S. are unsafe because the communities cannot afford to comply with the current 10 parts per billion (ppb) federal arsenic standard for drinking water. Communities unable to afford improvements remain vulnerable to adverse health effects associated with higher levels of arsenic exposure. Scientific and bipartisan political consensus exists that the arsenic standard should not be less stringent than 10 ppb, and new data suggest additional adverse health effects related to arsenic exposure through drinking water. Congress has failed to reauthorize the Drinking Water State Revolving Fund program to provide reliable funding to promote compliance and reduce the risk of adverse health effects. Congress’s recent ad hoc appropriations do not allow long-term planning and ongoing monitoring and maintenance. Investing in water infrastructure will lower health care costs and create American jobs. Delaying necessary upgrades will only increase the costs of improvements over time.

Direct healthcare costs of selected diseases primarily or partially transmitted by water

Collier, S.A., et al., Epidemiology and Infection, November 2012

Despite US sanitation advancements, millions of waterborne disease cases occur annually, although the precise burden of disease is not well quantified. Estimating the direct healthcare cost of specific infections would be useful in prioritizing waterborne disease prevention activities. Hospitalization and outpatient visit costs per case and total US hospitalization costs for ten waterborne diseases were calculated using large healthcare claims and hospital discharge databases. The five primarily waterborne diseases in this analysis (giardiasis, cryptosporidiosis, Legionnaires’ disease, otitis externa, and non-tuberculous mycobacterial infection) were responsible for over 40,000 hospitalizations at a cost of $970 million per year, including at least $430 million in hospitalization costs for Medicaid and Medicare patients. An additional 50,000 hospitalizations for campylobacteriosis, salmonellosis, shigellosis, haemolytic uraemic syndrome, and toxoplasmosis cost $860 million annually ($390 million in payments for Medicaid and Medicare patients), a portion of which can be assumed to be due to waterborne transmission.

Arcobacter in Lake Erie Beach Waters: an Emerging Gastrointestinal Pathogen Linked with Human-Associated Fecal Contamination

Lee, C., et al., Applied and Environmental Microbiology, September 2012

The genus Arcobacter has been associated with human illness and fecal contamination by humans and animals. To better characterize the health risk posed by this emerging waterborne pathogen, we investigated the occurrence of Arcobacter spp. in Lake Erie beach waters. During the summer of 2010, water samples were collected 35 times from the Euclid, Villa Angela, and Headlands (East and West) beaches, located along Ohio’s Lake Erie coast. After sample concentration, Arcobacter was quantified by real-time PCR targeting the Arcobacter 23S rRNA gene. Other fecal genetic markers (Bacteroides 16S rRNA gene [HuBac], Escherichia coli uidA gene, Enterococcus 23S rRNA gene, and tetracycline resistance genes) were also assessed. Arcobacter was detected frequently at all beaches, and both the occurrence and densities of Arcobacter spp. were higher at the Euclid and Villa Angela beaches (with higher levels of fecal contamination) than at the East and West Headlands beaches. The Arcobacter density in Lake Erie beach water was significantly correlated with the human-specific fecal marker HuBac according to Spearman’s correlation analysis (r = 0.592; P < 0.001). Phylogenetic analysis demonstrated that most of the identified Arcobacter sequences were closely related to Arcobacter cryaerophilus, which is known to cause gastrointestinal diseases in humans. Since human-pathogenic Arcobacter spp. are linked to human-associated fecal sources, it is important to identify and manage the human-associated contamination sources for the prevention of Arcobacter-associated public health risks at Lake Erie beaches.
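
Spearman’s correlation, used above to relate Arcobacter density to the HuBac marker (r = 0.592), is simply the Pearson correlation of the data’s ranks. A minimal sketch on made-up values, ignoring tie handling, which real analyses should include:

```python
import numpy as np

def spearman_r(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.
    Ties are not averaged here; use scipy.stats.spearmanr on real data."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return float(np.corrcoef(rank(x), rank(y))[0, 1])

# A perfectly monotone (even if nonlinear) relationship gives r = 1.0.
density = np.array([2.0, 5.0, 9.0, 40.0, 300.0])   # illustrative values
marker = np.array([0.1, 0.3, 0.9, 2.7, 8.1])
print(spearman_r(density, marker))  # → 1.0
```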

The Quality of Drinking Water in North Carolina Farmworker Camps

Bischoff, W.E., MD, PhD, et al., American Journal of Public Health, August 2012

The purpose of this study was to assess water quality in migrant farmworker camps in North Carolina and to determine associations of water quality with migrant farmworker housing characteristics. Researchers collected data from 181 farmworker camps in eastern North Carolina during the 2010 agricultural season. Water samples were tested using the Total Coliform Rule (TCR), and housing characteristics were assessed using North Carolina Department of Labor standards. A total of 61 (34%) of 181 camps failed the TCR. Total coliform bacteria were found in all 61 camps, with Escherichia coli also being detected in 2. Water quality was not associated with farmworker housing characteristics or with access to registered public water supplies. Multiple official violations of water quality standards had been reported for the registered public water supplies. The authors concluded that water supplied to farmworker camps often does not comply with current standards and poses a great risk to the physical health of farmworkers and surrounding communities. Expansion of water monitoring to more camps and changes to the regulations, such as testing during occupancy and stronger enforcement, are needed to secure water safety.

Chemical mixtures in untreated water from public-supply wells in the U.S. — Occurrence, composition, and potential toxicity

Toccalino, P.L., Norman, J.E., Scott, J.C., Science of The Total Environment, August 2012

Chemical mixtures are prevalent in groundwater used for public water supply, but little is known about their potential health effects. As part of a large-scale ambient groundwater study, we evaluated chemical mixtures across multiple chemical classes, and included more chemical contaminants than in previous studies of mixtures in public-supply wells. We (1) assessed the occurrence of chemical mixtures in untreated source-water samples from public-supply wells, (2) determined the composition of the most frequently occurring mixtures, and (3) characterized the potential toxicity of mixtures using a new screening approach. The U.S. Geological Survey collected one untreated water sample from each of 383 public wells distributed across 35 states, and analyzed the samples for as many as 91 chemical contaminants. Concentrations of mixture components were compared to individual human-health benchmarks; the potential toxicity of mixtures was characterized by addition of benchmark-normalized component concentrations. Most samples (84%) contained mixtures of two or more contaminants, each at concentrations greater than one-tenth of individual benchmarks. The chemical mixtures that most frequently occurred and had the greatest potential toxicity primarily were composed of trace elements (including arsenic, strontium, or uranium), radon, or nitrate. Herbicides, disinfection by-products, and solvents were the most common organic contaminants in mixtures. The sum of benchmark-normalized concentrations was greater than 1 for 58% of samples, suggesting that there could be potential for mixtures toxicity in more than half of the public-well samples. Our findings can be used to help set priorities for groundwater monitoring and suggest future research directions for drinking-water treatment studies and for toxicity assessments of chemical mixtures in water resources.
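
The screening approach described, summing benchmark-normalized component concentrations, is a hazard-index-style calculation. A minimal sketch with illustrative concentrations and benchmarks (not values from the study):

```python
# Each component concentration is divided by its human-health benchmark;
# components above one-tenth of their benchmark count toward a mixture,
# and a benchmark-normalized sum > 1 flags potential mixture toxicity.
# All concentration/benchmark pairs below are illustrative, each pair
# in matching units.
sample = {
    "arsenic": (6.0, 10.0),     # (concentration, benchmark), ug/L
    "nitrate": (7.0, 10.0),     # mg/L as N
    "radon":   (150.0, 300.0),  # pCi/L
}
normalized = {k: conc / bench for k, (conc, bench) in sample.items()}
mixture = sorted(k for k, v in normalized.items() if v > 0.1)
hazard_index = sum(normalized.values())
print(mixture, round(hazard_index, 2))  # → ['arsenic', 'nitrate', 'radon'] 1.8
```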

Risk of Viral Acute Gastrointestinal Illness from Nondisinfected Drinking Water Distribution Systems

Lambertini, E., et al., Environmental Science and Technology, July 2012

Acute gastrointestinal illness (AGI) resulting from pathogens directly entering the piping of drinking water distribution systems is insufficiently understood. Here, we estimate AGI incidence from virus intrusions into the distribution systems of 14 nondisinfecting, groundwater-source, community water systems. Water samples for virus quantification were collected monthly at wells and households during four 12-week periods in 2006–2007. Ultraviolet (UV) disinfection was installed on the communities’ wellheads during one study year; UV was absent the other year. UV was intended to eliminate virus contributions from the wells; without residual disinfectant present in these systems, any increase in virus concentration downstream at household taps represented virus contributions from the distribution system (Approach 1). During no-UV periods, distribution system viruses were estimated by the difference between well water and household tap virus concentrations (Approach 2). For both approaches, a Monte Carlo risk assessment framework was used to estimate AGI risk from distribution systems using study-specific exposure–response relationships. Depending on the exposure–response relationship selected, AGI risk from the distribution systems was 0.0180–0.0661 and 0.001–0.1047 episodes/person-year estimated by Approaches 1 and 2, respectively. These values represented 0.1–4.9% of AGI risk from all exposure routes, and 1.6–67.8% of risk related to drinking water exposure. Virus intrusions into nondisinfected drinking water distribution systems can contribute to sporadic AGI.

Methodological Aspects of Fluid Intake Records and Surveys

Vergne, S. PhD, Nutrition Today, July/August 2012

Assessing the fluid intake level of different populations has, to date, attracted very little interest. The comparison of existing data based on food surveys reveals notable differences between countries and within different surveys in 1 country. Methodological issues seem to account to a large extent for these differences. Recent studies conducted using specifically designed diaries to record fluid and water intake over a 7-day period tend to give more accurate results. These recent studies could potentially lead to the revision of the values of adequate intakes of water in numerous countries.


Screening-Level Risk Assessment of Coxiella burnetii (Q Fever) Transmission via Aeration of Drinking Water

Sales-Ortells, H., Medema, G., Environmental Science and Technology, April 2012

A screening-level risk assessment of Q fever transmission through drinking water produced from groundwater in the vicinity of infected goat barnyards that employed aeration of the water was performed. Quantitative data from scientific literature were collected and a Quantitative Microbial Risk Assessment approach was followed. An exposure model was developed to calculate the dose to which consumers of aerated groundwater are exposed through aerosol inhalation during showering. The exposure assessment and hazard characterization were integrated in a screening-level risk characterization using a dose-response model for inhalation to determine the risk of Q fever through tap water. A nominal range sensitivity analysis was performed. The estimated risk of disease was lower than 10⁻⁴ per person per year (pppy); hence, the risk of transmission of C. burnetii through inhalation of drinking water aerosols is very low. The sensitivity analysis shows that the most uncertain parameters are the aeration process, the transport of C. burnetii in bioaerosols via the air, the aerosolization of C. burnetii in the shower, and the air filtration efficiency. The risk was compared to direct airborne exposure of persons in the vicinity of infected goat farms; the relative risk of exposure through inhalation of drinking water aerosols was 0.002%.
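The Monte Carlo structure of a screening-level QMRA like this one can be illustrated with a toy model: sample an uncertain water concentration, convert it to a per-event dose, apply a dose-response model, and compound to an annual risk. All parameter values below (the concentration distribution, the aerosolized volume per shower, the dose-response parameter) are invented for illustration and are not the paper's:

```python
import math
import random

random.seed(1)

def annual_risk(conc_per_L, aerosol_L_per_event, r, events_per_year=365):
    """Exponential dose-response QMRA: per-event dose -> per-event infection
    probability -> annual probability over repeated independent exposures."""
    dose = conc_per_L * aerosol_L_per_event
    p_event = 1.0 - math.exp(-r * dose)
    return 1.0 - (1.0 - p_event) ** events_per_year

# Monte Carlo over an uncertain water concentration (lognormal, hypothetical),
# with a hypothetical aerosolized volume per shower and dose-response slope.
risks = sorted(annual_risk(random.lognormvariate(-2.0, 1.0), 1e-6, 0.01)
               for _ in range(10_000))
median_risk = risks[len(risks) // 2]
print(median_risk < 1e-4)  # compare against the 10^-4 pppy screening benchmark
```

The sorted list of simulated risks also gives the percentiles a nominal range sensitivity analysis would examine parameter by parameter.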

Waterborne Pathogens: Emerging Issues in Monitoring, Treatment and Control

Reynolds, K.A., MSPH, Ph.D., Water Conditioning & Purification, March 2012

Microbial threats to water quality continue to emerge; however, technologies for monitoring, treating and controlling emerging waterborne pathogens are also evolving. Understanding the range of factors that lead to the contamination of water is important for developing appropriate tools to manage human health risks.

Health Risks of Limited-Contact Water Recreation

Dorevitch, S., et al., Environmental Health Perspectives, February 2012

Wastewater-impacted waters that do not support swimming are often used for boating, canoeing, fishing, kayaking, and rowing. Little is known about the health risks of these limited-contact water recreation activities. We evaluated the incidence of illness, severity of illness, associations between water exposure and illness, and risk of illness attributable to limited-contact water recreation on waters dominated by wastewater effluent and on waters approved for general use recreation (such as swimming). The Chicago Health, Environmental Exposure, and Recreation Study was a prospective cohort study that evaluated five health outcomes among three groups of people: those who engaged in limited-contact water recreation on effluent-dominated waters, those who engaged in limited-contact recreation on general-use waters, and those who engaged in non–water recreation. Data analysis included survival analysis, logistic regression, and estimates of risk for counterfactual exposure scenarios using G-computation. Telephone follow-up data were available for 11,297 participants. With non–water recreation as the reference group, we found that limited-contact water recreation was associated with the development of acute gastrointestinal illness in the first 3 days after water recreation at both effluent-dominated waters [adjusted odds ratio (AOR) 1.46; 95% confidence interval (CI): 1.08, 1.96] and general-use waters (1.50; 95% CI: 1.09, 2.07). For every 1,000 recreators, 13.7 (95% CI: 3.1, 24.9) and 15.1 (95% CI: 2.6, 25.7) cases of gastrointestinal illness were attributable to limited-contact recreation at effluent-dominated waters and general-use waters, respectively. Eye symptoms were associated with use of effluent-dominated waters only (AOR 1.50; 95% CI: 1.10, 2.06). Among water recreators, our results indicate that illness was associated with the amount of water exposure. 
Limited-contact recreation, both on effluent-dominated waters and on waters designated for general use, was associated with an elevated risk of gastrointestinal illness.

Planning for Sustainability: A Handbook for Water and Wastewater Utilities

U.S. Environmental Protection Agency, February 2012

This handbook is intended to provide information about how to enhance current planning processes by building in sustainability considerations. It is designed to be useful for various types and scales of planning efforts, such as: long-range integrated water resource planning; strategic planning; capital planning; system-wide planning to meet regulatory requirements (e.g., combined sewer overflow upgrades and new stormwater permitting requirements); and specific infrastructure project planning (e.g., for repair, rehabilitation, or replacement of specific infrastructure).

Atrazine Exposure in Public Drinking Water and Preterm Birth

Rinsky, J.L., et al., Public Health Reports, January/February 2012

Approximately 13% of all births occur prior to 37 weeks gestation in the U.S. Some established risk factors exist for preterm birth, but the etiology remains largely unknown. Recent studies have suggested an association with environmental exposures. We examined the relationship between preterm birth and exposure to a commonly used herbicide, atrazine, in drinking water. We reviewed Kentucky birth certificate data for 2004-2006 to collect duration of pregnancy and other individual-level covariates. We assessed existing data sources for atrazine levels in public drinking water for the years 2000-2008, classifying maternal county of residence into three atrazine exposure groups. We used logistic regression to analyze the relationship between atrazine exposure and preterm birth, controlling for maternal age, race/ethnicity, education, smoking, and prenatal care. An increase in the odds of preterm birth was found for women residing in the counties included in the highest atrazine exposure group compared with women residing in counties in the lowest exposure group, while controlling for covariates. Analyses using the three exposure assessment approaches produced odds ratios ranging from 1.20 (95% confidence interval [CI] 1.14, 1.27) to 1.26 (95% CI 1.19, 1.32), for the highest compared with the lowest exposure group. Suboptimal characterization of environmental exposure and variables of interest limited the analytical options of this study. Still, our findings suggest a positive association between atrazine and preterm birth, and illustrate the need for an improved assessment of environmental exposures to accurately address this important public health issue.

Source Water Protection Vision and Roadmap

Water Research Foundation, January 2012

In 2007, a group of source water protection experts met, under the auspices of the Water Research Foundation and the Water Environment Research Foundation, to develop a research agenda that would ultimately provide information to help drinking water suppliers design and implement effective source water protection programs. A key result of that effort identified the need for a national vision and roadmap that would guide U.S. water utilities and supporting groups with a unified strategy for coherent, consistent, cost-effective, and socially-acceptable source water protection programs. This brief document presents the vision and roadmap and focuses on how to move forward on source water protection. The roadmap is intended to serve as a feasible, focused path toward promoting source water protection for U.S. drinking water utilities. It is not intended to serve as an official directive, but rather is a collection of observations and recommendations organized to form a path to achieving the vision. The companion document Developing a Vision and Roadmap for Drinking Water Source Protection comprehensively covers the project team’s findings regarding the various building blocks to make source water protection a reality. That document includes an annotated bibliography of source water protection resources, a summation of a literature review, and helpful water utility case studies. Both documents are meant to be used in concert to help water utilities move forward with their source water protection efforts and proactively improve and/or maintain the quality of their drinking water sources. Source water protection has been discussed and promoted in an ad hoc fashion by different organizations at the national, regional, state, and local levels. It is essential to increase the awareness of source water protection at the national level. 
Education of decision makers, utility managers, stakeholders, and the general public should be the first step in moving source water protection up a path to success. Leadership is needed to make this a national priority. In order to ensure the various actions recommended in the roadmap can be carried out, it is recommended that both a top-down and a bottom-up approach be taken. A top-down approach would establish a flexible framework to guide local entities (e.g., water systems, watershed organizations, and regional planning agencies) to work together to protect source water. Due to the variability of source waters and the areas from which they are derived, along with technical, social, political, financial, and regulatory differences across jurisdictions, it is unlikely that two source water protection programs would be the same. A bottom-up approach is therefore also needed, which would use local information and broad stakeholder involvement to produce a “tailored” source water protection program that addresses unique issues at the local level.

Migration of Bisphenol-A into the Natural Spring Water Packaged in Polycarbonate Carboys

Erdem, Y.K., Furkan, A., International Journal of Applied Science and Technology, January 2012

Bisphenol-A is a chemical widely used all over the world in epoxy resins, polycarbonate packaging, and the lacquer of metal food packages. Its weak estrogenic character and possible health effects are well known. For this reason, the use of Bisphenol-A in food packaging is limited and its daily intake by humans is strictly controlled. The declared specific migration limit is 0.6 ppm, and the tolerable daily intake is 0.05 mg/kg body weight per day according to EFSA and other authorities. EFSA and other authorities banned the manufacture and use of Bisphenol-A in baby bottles in 2010. In Turkey, 70% of the population lives in 5 metropolitan cities, and drinking water consumption is mostly supplied by the packaged drinking water industry. Household and bulk usage is covered by natural spring and natural mineral water packaged in 19-liter polycarbonate carboys. We therefore decided to investigate the possible migration of Bisphenol-A into drinking water packaged in polycarbonate carboys. First, a screening test was carried out on samples supplied by two main cities. Then, water samples of 5 different trademarks were stored at 4, 25, and 35 °C for 60 days and Bisphenol-A content was determined at given intervals. BPA migration was found to be at least 450 times lower than the EFSA specific migration limit during 60 days of storage under these conditions.

Bottled Water & Tap Water: Just the Facts

Drinking Water Research Foundation, October 2011

The information presented in this report supports the fact that drinking water, whether from the tap or a bottle, is generally safe, and that regulatory requirements for both tap water and bottled water provide Americans with clean, safe drinking water. There are some differences in regulations for each, but those differences highlight the differences between drinking water delivered by a public water system and drinking water delivered to the consumer in a sealed container. Perhaps the most notable difference between tap water and bottled water is the method of delivery. Community water systems deliver water to consumers (businesses and private residences) through miles of underground iron (unlined and poly-lined), PVC, and lead service lines that can be subject to leakage with age of the system and accidental failures, resulting in the risk of post-treatment contamination of the water that is delivered to consumers. Bottled water is delivered to consumers in sanitary, sealed containers that were filled in a bottling facility under controlled conditions in a fill room.

Bromate reduction in simulated gastric juice

Cotruvo, J.A., et al., e-Journal AWWA, November 2010

This article advocates for a revised risk assessment for bromate to reflect presystemic chemistry not usually considered when low-dose risks are calculated from high-dose toxicology data. Because of high acidity and the presence of reducing agents, presystemic decomposition of bromate can begin in the stomach, which should contribute to lower-than-expected doses to target organs. In this research, bromate decomposition kinetics with simulated stomach/gastric juice were studied to determine the risk of environmentally relevant exposure to bromate. The current work is the first step in a series of studies that the authors are conducting to better estimate the hypothetical low-dose risks to humans from drinking water ingestion and thus arrive at more appropriate maximum contaminant levels (MCLs). It is the authors’ belief that additional kinetics and metabolism research will demonstrate that the human risk from ingestion of compounds in drinking water is less than originally believed and will lead to MCLs and MCL goals that are more scientifically based.

Drinking Water and Risk of Stroke

Gustavo Saposnik, MD, MSc, FAHA, Stroke, October 2010

In the present issue of Stroke, the authors investigate the association between low-level arsenic exposure in drinking water and the ischemic stroke admissions in Michigan. They found that even low exposure to arsenic is associated with an increased incident risk of stroke (relative risk, 1.03; 95% CI, 1.01 to 1.05 per µg/L increase in arsenic concentration). The authors also compared whether that exposure was associated with other nonvascular conditions (hernia, duodenal ulcer) not expected to increase their risk. Comparing zip codes in Genesee County at the 90th percentile of arsenic levels (21.6 µg/L) with those at the 10th percentile (0.30 µg/L), there was a 91% increase in risk of stroke admission (relative risk, 1.91; 95% CI, 1.27 to 2.88). The results were consistent in showing an increased risk for stroke, but not for other control medical conditions (hernia and duodenal ulcer). Moreover, they found a graded effect: a higher incident risk among those individuals exposed to higher water concentrations of arsenic (Figure 2).
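The graded-effect arithmetic can be checked directly: a relative risk of 1.03 per µg/L compounds multiplicatively over the concentration difference between the 90th- and 10th-percentile zip codes:

```python
rr_per_ug = 1.03     # relative risk per ug/L increase in arsenic concentration
delta = 21.6 - 0.30  # 90th minus 10th percentile arsenic concentration, ug/L
rr = rr_per_ug ** delta
print(round(rr, 2))  # ~1.88, close to the reported 1.91 (the published figure
                     # presumably uses the unrounded per-unit relative risk)
```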

Association between children’s blood lead levels, lead service lines, and water disinfection

Brown, M.J., Raymond, J., Homa, D., Kennedy, C., Sinks, T., Environmental Research, October 2010

Evaluate the effect of changes in the water disinfection process, and presence of lead service lines (LSLs), on children’s blood lead levels (BLLs) in Washington, DC. Three cross-sectional analyses examined the relationship of LSLs and changes in water disinfectant with BLLs in children <6 years of age. The study population was derived from the DC Childhood Lead Poisoning Prevention Program blood lead surveillance system of children who were tested and whose blood lead test results were reported to the DC Health Department. The Washington, DC Water and Sewer Authority (WASA) provided information on LSLs. The final study population consisted of 63,854 children with validated addresses. Controlling for age of housing, LSL was an independent risk factor for BLLs ≥10 µg/dL and ≥5 µg/dL, even during time periods when water levels met the US Environmental Protection Agency (EPA) action level of 15 parts per billion (ppb). When chloramine alone was used to disinfect water, the risk for BLL in the highest quartile among children in homes with LSL was greater than when either chlorine or chloramine with orthophosphate was used. For children tested after LSLs in their houses were replaced, those with partially replaced LSL were >3 times as likely to have BLLs ≥10 µg/dL versus children who never had LSLs. LSLs were a risk factor for elevated BLLs even when WASA met the EPA water action level. Changes in water disinfection can enhance the effect of LSLs and increase lead exposure. Partially replacing LSLs may not decrease the risk of elevated BLLs associated with LSL exposure.

When is the Next Boil Water Alert?

Water Technology, August 2010

A common theme we see on a daily basis relates to drinking water infrastructure. We track news throughout the world that impacts the drinking water industry, and one of the most frequent things we see are notices from agencies and organizations about the need for communities to boil water in order to combat possible contamination. In some parts of the world, boiling water is the norm due to water supply issues. Often, these areas may be limited in their ability to develop economically, as clean water is such an integral part of daily life. It is in the developed world, however, where we have been seeing a large increase in the number of such notices.

Climate Change, Water, and Risk: Current Water Demands Are Not Sustainable

www.nrdc.org, July 2010

Climate change will have a significant impact on the sustainability of water supplies in the coming decades. A new analysis, performed by consulting firm Tetra Tech for the Natural Resources Defense Council (NRDC), examined the effects of global warming on water supply and demand in the contiguous United States. The study found that more than 1,100 counties— one-third of all counties in the lower 48—will face higher risks of water shortages by mid-century as the result of global warming. More than 400 of these counties will face extremely high risks of water shortages.

Water Disinfection By-Products and the Risk of Specific Birth Defects: A Population-Based Cross-Sectional Study in Taiwan

Hwang, B.-F., Jaakkola, J., Guo, H.-R., Environmental Health, June 2008

Recent findings suggest that exposure to disinfection by-products may increase the risk of birth defects. Previous studies have focused mainly on birth defects in general or groups of defects. The objective of the present study was to assess the effect of water disinfection by-products on the risk of the most common specific birth defects. We conducted a population-based cross-sectional study of 396,049 Taiwanese births in 2001-2003 using information from the Birth Registry and Waterworks Registry. We compared the risk of the eleven most common specific defects in four disinfection by-product exposure categories based on the levels of total trihalomethanes (TTHMs), representing high (TTHMs 20+ ug/L), medium (TTHMs 10-19 ug/L), and low (TTHMs 5-9 ug/L) exposure, with 0-4 ug/L as the reference category. In addition, we conducted a meta-analysis of the results from the present and previous studies focusing on the same birth defects.
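The four TTHM exposure bands stated in the abstract map directly onto a simple classifier (a sketch using the thresholds as given):

```python
def tthm_category(tthm_ug_per_L):
    """Classify a total trihalomethane level into the study's exposure bands."""
    if tthm_ug_per_L >= 20:
        return "high"
    if tthm_ug_per_L >= 10:
        return "medium"
    if tthm_ug_per_L >= 5:
        return "low"
    return "reference"  # 0-4 ug/L

print([tthm_category(x) for x in (2, 7, 15, 25)])
# ['reference', 'low', 'medium', 'high']
```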

Maternal Exposure to Water Disinfection By-products During Gestation and Risk of Hypospadias

Luben, T.J., Nuckols, J.R., Mosley, B.S., Hobbs, C., Reif, J.S., Occupational and Environmental Medicine, June 2008

The use of chlorine for water disinfection results in the formation of numerous contaminants called disinfection by-products (DBPs), which may be associated with birth defects, including urinary tract defects. We used Arkansas birth records (1998-2002) to conduct a population-based case-control study investigating the relationship between hypospadias and two classes of DBPs, trihalomethanes (THM) and haloacetic acids (HAA). We utilised monitoring data, spline regression and geographical information systems (GIS) to link daily concentrations of these DBPs from 263 water utilities to 320 cases and 614 controls. We calculated ORs for hypospadias and exposure to DBPs between 6 and 16 weeks’ gestation, and conducted subset analyses for exposure from ingestion, and metrics incorporating consumption, showering and bathing. We found no increase in risk when women in the highest tertiles of exposure were compared to those in the lowest for any DBP. When ingestion alone was used to assess exposure among a subset of 40 cases and 243 controls, the intermediate tertiles of exposure to total THM and the five most common HAA had ORs of 2.11 (95% CI 0.89 to 5.00) and 2.45 (95% CI 1.06 to 5.67), respectively, compared to women with no exposure. When exposure to total THM from consumption, showering and bathing exposures was evaluated, we found an OR of 1.96 (95% CI 0.65 to 6.42) for the highest tertile of exposure and weak evidence of a dose-response relationship. Our results provide little evidence for a positive relationship between DBP exposure during gestation and an increased risk of hypospadias but emphasize the necessity of including individual-level data when assessing exposure to DBPs.

Formation of N-Nitrosamines from Eleven Disinfection Treatments of Seven Different Surface Waters

Zhao, Y.-Y., et al., Environmental Science & Technology, May 2008

Formation of nine N-nitrosamines has been investigated when seven different source waters representing various qualities were each treated with eleven bench-scale disinfection processes, without addition of nitrosamine precursors. These disinfection treatments included chlorine (OCl-), chloramine (NH2Cl), chlorine dioxide (ClO2), ozone (O3), ultraviolet (UV), advanced oxidation processes (AOP), and combinations. The total organic carbon (TOC) of the seven source waters ranged from 2 to 24 mg L-1. The disinfected water samples and the untreated source waters were analyzed for nine nitrosamines using a solid phase extraction and liquid chromatography-tandem mass spectrometry method. Prior to any treatment, N-nitrosodimethylamine (NDMA) was detected ranging from 0 to 53 ng L-1 in six of the seven source waters, and its concentrations increased in the disinfected water samples (0–118 ng L-1). N-nitrosodiethylamine (NDEA), N-nitrosomorpholine (NMor), and N-nitrosodiphenylamine (NDPhA) were also identified in some of the disinfected water samples. NDPhA (0.2–0.6 ng L-1) was formed after disinfection with OCl-, NH2Cl, O3, and MPUV/OCl-. NMEA was produced with OCl- and MPUV/OCl-, and NMor formation was associated with O3. In addition, UV treatment alone degraded NDMA; however, UV/OCl- and AOP/OCl- treatments produced higher amounts of NDMA compared to UV and AOP alone, respectively. These results suggest that UV degradation or AOP oxidation treatment may provide a source of NDMA precursors. This study demonstrates that environmental concentrations and mixtures of unknown nitrosamine precursors in source waters can form NDMA and other nitrosamines.

N,N-Dimethylsulfamide as Precursor for N-Nitrosodimethylamine (NDMA) Formation upon Ozonation and its Fate During Drinking Water Treatment

Schmidt, C.K., Brauch, H.-J., Environmental Science & Technology, April 2008

Application and microbial degradation of the fungicide tolylfluanid gives rise to a new decomposition product named N,N-dimethylsulfamide (DMS). In Germany, DMS was found in groundwaters and surface waters with typical concentrations in the range of 100-1000 ng/L and 50-90 ng/L, respectively. Laboratory-scale and field investigations concerning its fate during drinking water treatment showed that DMS cannot be removed via riverbank filtration, activated carbon filtration, flocculation, or oxidation and disinfection procedures based on hydrogen peroxide, potassium permanganate, chlorine dioxide, or UV irradiation. Even nanofiltration does not provide sufficient removal efficiency. During ozonation, about 30-50% of DMS is converted to the carcinogenic N-nitrosodimethylamine (NDMA). The NDMA being formed is biodegradable and can at least partially be removed by subsequent biologically active drinking water treatment steps, including sand or activated carbon filtration. Disinfection with hypochlorous acid converts DMS to as yet unknown degradation products, but not to NDMA or 1,1-dimethylhydrazine (UDMH).


Risk of Birth Defects in Australian Communities with High Brominated Disinfection By-product Levels

Chisholm, K., et al., Environmental Health Perspectives, April 2008

By international standards, water supplies in Perth, Western Australia, contain high trihalomethane (THM) levels, particularly the brominated forms. Geographic variability in these levels provided an opportunity to examine cross-city spatial relationships between THM exposure and rates of birth defects (BDs). Our goal was to examine BD rates by exposure to THMs with a highly brominated fraction in metropolitan locations in Perth, Western Australia. We collected water samples from 47 separate locations and analyzed them for total and individual THM concentrations (micrograms per liter), including separation into brominated forms. We classified collection areas by total THM (TTHM) concentration: low (< 60 microg/L), medium (> 60 to < 130 microg/L), and high (> or = 130 microg/L). We also obtained deidentified registry-based data on total births and BDs (2000-2004 inclusive) from post codes corresponding to water sample collection sites and used binomial logistic regression to compare the frequency of BDs aggregately and separately for the TTHM exposure groups, adjusting for maternal age and socioeconomic status. Total THMs ranged from 36 to 190 microg/L. A high proportion of the THMs were brominated (on average, 92%). Women living in high-TTHM areas showed an increased risk of any BD [odds ratio (OR) = 1.22; 95% confidence interval (CI), 1.01-1.48] and for the major category of any cardiovascular BD (OR = 1.62; 95% CI, 1.04-2.51), compared with women living in low-TTHM areas. Brominated forms constituted the significant fraction of THMs in all areas. Small but statistically significant increases in risks of BDs were associated with residence in areas with high THMs.

EPA – FACTOIDS: Drinking Water and Ground Water Statistics for 2007

U.S. Environmental Protection Agency, March, 2008

There are approximately 156,000 public drinking water systems in the United States. Each of these systems regularly supplies drinking water to at least 25 people or 15 service connections. Beyond their common purpose, the 156,000 systems vary widely. The following tables group water systems into categories that show their similarities and differences. For example, the first table shows that most people in the US (286 million) get their water from a community water system. There are approximately 52,000 community water systems, but just eight percent of those systems (4,048) serve 82 percent of the people. The second table shows that more water systems have groundwater than surface water as a source–but more people drink from a surface water system. Other tables break down these national numbers by state, territory, and EPA region.

This package also contains figures on the types and locations of underground injection control wells. EPA and states regulate the placement and operation of these wells to ensure that they do not threaten underground sources of drinking water. The underground injection control program statistics are based on separate reporting from the states to EPA. The drinking water system statistics on the following pages are taken from the Safe Drinking Water Information System/Federal version (SDWIS/Fed). SDWIS/Fed is the U.S. Environmental Protection Agency’s official record of public drinking water systems, their violations of state and EPA regulations, and enforcement actions taken by EPA or states as a result of those violations. EPA maintains the database using information collected and submitted by the states. Notice: Compliance statistics are based on violations reported by states to the EPA Safe Drinking Water Information System. EPA is aware of inaccuracies and underreporting of some data in this system. We are working with the states to improve the quality of the data. Read an analysis of SDWIS/Fed data quality and get more information and additional drinking water data tables.

Human Health Risk Assessment of Chlorinated Disinfection By-products in Drinking Water Using a Probabilistic Approach

Hamidin, N., Yu, Q.J., Connell, D.W., Water Research, March 2008

The presence of chlorinated disinfection by-products (DBPs) in drinking water is a public health issue, due to their possible adverse health effects on humans. To gauge the risk of chlorinated DBPs on human health, a risk assessment of chloroform (trichloromethane (TCM)), bromodichloromethane (BDCM), dibromochloromethane (DBCM), bromoform (tribromomethane (TBM)), dichloroacetic acid (DCAA) and trichloroacetic acid (TCAA) in drinking water was carried out using probabilistic techniques. Literature data on exposure concentrations from more than 15 different countries and adverse health effects on test animals as well as human epidemiological studies were used. The risk assessment showed no overlap between the highest human exposure dose (EXP(D)) and the lowest human equivalent dose (HED) from animal test data, for TCM, BDCM, DBCM, TBM, DCAA and TCAA. All the HED values were approximately 10⁴-10⁵ times higher than the 95th percentiles of EXP(D). However, from the human epidemiology data, there was a positive overlap between the highest EXP(D) and the lifetime average daily doses (LADD(H)) for TCM, BDCM, DCAA and TCAA. This suggests that there are possible adverse health risks such as a small increased incidence of cancers in males and developmental effects on infants. However, the epidemiological data comprised several risk factors and exposure classification levels which may affect the overall results.
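The no-overlap finding is, in effect, a margin-of-exposure comparison: the lowest human equivalent dose is divided by the highest human exposure dose. A minimal sketch with hypothetical doses (not the study's values):

```python
# Margin of exposure: ratio of the lowest human equivalent dose (HED, from
# animal data) to the 95th-percentile human exposure dose (EXP(D)). The
# values below are hypothetical, chosen only to illustrate a ~10^4 margin.
exp_d_95th = 2e-4  # mg/kg body weight per day (hypothetical)
lowest_hed = 5.0   # mg/kg body weight per day (hypothetical)

margin = lowest_hed / exp_d_95th
print(margin >= 1e4)  # no overlap: the HED sits far above the exposure dose
```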

Drinking Water Disinfection By-Products and Time to Pregnancy

Maclehose, R.F., Savitz, D.A., Herring, A.H., Hartmann, K.E., Singer, P.C., Weinberg, H.S., Epidemiology, March 2008

Laboratory evidence suggests tap water disinfection by-products (DBPs) could have an effect very early in pregnancy, typically before clinical detectability. Undetected early losses would be expected to increase the reported number of cycles to clinical pregnancy. We investigated the association between specific DBPs (trihalomethanes, haloacetic acids, brominated-trihalomethanes, brominated-haloacetic acids, total organic halides, and bromodichloromethane) and time to pregnancy among women who enrolled in a study of drinking water and reproductive outcomes. We quantified exposure to DBPs through concentrations in tap water, quantity ingested through drinking, quantity inhaled or absorbed while showering or bathing, and total integrated exposure. The effect of DBPs on time to pregnancy was estimated using a discrete time hazard model. Overall, we found no evidence of an increased time to pregnancy among women who were exposed to higher levels of DBPs. A modestly decreased time to pregnancy (i.e., increased fecundability) was seen among those exposed to the highest level of ingested DBPs, but not for tap water concentration, the amount absorbed while showering or bathing, or the integrated exposure. Our findings extend those of a recently published study suggesting a lack of association between DBPs and pregnancy loss.

Risk of waterborne illness via drinking water in the United States

Reynolds, K.A., Mena, K.D., Gerba, C.P., Reviews of Environmental Contamination & Toxicology, January 2008

Outbreaks of disease attributable to drinking water are not common in the U.S., but they do still occur and can lead to serious acute, chronic, or sometimes fatal health consequences, particularly in sensitive and immunocompromised populations. From 1971 to 2002, there were 764 documented waterborne outbreaks associated with drinking water, resulting in 575,457 cases of illness and 79 deaths (Blackburn et al. 2004; Calderon 2004); however, the true impact of disease is estimated to be much higher. If properly applied, current protocols in municipal water treatment are effective at eliminating pathogens from water. However, inadequate, interrupted, or intermittent treatment has repeatedly been associated with waterborne disease outbreaks. Contamination is not evenly distributed but rather affected by the number of pathogens in the source water, the age of the distribution system, the quality of the delivered water, and climatic events that can tax treatment plant operations. Private water supplies are not regulated by the USEPA and are generally not treated or monitored, although very few of the municipal systems involved in documented outbreaks exceeded the USEPA’s total coliform standard in the preceding 12 months (Craun et al. 2002). We provide here estimates of waterborne infection and illness risks in the U.S. based on the total number of water systems, source water type, and total populations exposed. Furthermore, we evaluated all possible illnesses associated with the microbial infection and not just gastroenteritis. Our results indicate that 10.7 M infections/yr and 5.4 M illnesses/yr occur in populations served by community groundwater systems; 2.2 M infections/yr and 1.1 M illnesses/yr occur in noncommunity groundwater systems; and 26.0 M infections/yr and 13.0 M illnesses/yr occur in municipal surface water systems. The total number of waterborne illnesses in the U.S. is therefore estimated to be 19.5 M/yr.
Others have recently estimated waterborne illness rates of 12 M cases/yr (Colford et al. 2006) and 16 M cases/yr (Messner et al. 2006), yet our estimate considers all health outcomes associated with exposure to pathogens in drinking water rather than only gastrointestinal illness. Drinking water outbreaks exemplify known breaches in municipal water treatment and distribution processes and the failure of regulatory requirements to ensure water that is free of human pathogens. Water purification technologies applied at the point-of-use (POU) can be effective for limiting the effects of source water contamination, treatment plant inadequacies, minor intrusions in the distribution system, or deliberate posttreatment acts (i.e., bioterrorism). Epidemiological studies are conflicting on the benefits of POU water treatment. One prospective intervention study found that consumers of reverse-osmosis (POU) filtered water had 20%-35% fewer gastrointestinal illnesses than those consuming regular tap water, with an excess of 14% of illness due to contaminants introduced in the distribution system (Payment 1991, 1997). Two other studies using randomized, blinded, controlled trials determined that the risks were equal among groups supplied with POU-treated water compared to untreated tap water (Hellard et al. 2001; Colford et al. 2003). For immunocompromised populations, POU water treatment devices are recommended by the CDC and USEPA as one treatment option for reducing risks of Cryptosporidium and other types of infectious agents transmitted by drinking water. Other populations, including those experiencing “normal” life stages such as pregnancy, or those very young or very old, might also benefit from the utilization of additional water treatment options beyond the current multibarrier approach of municipal water treatment.
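As a quick arithmetic check, the per-system-type illness figures quoted in the abstract do sum to the 19.5 M/yr total:

```python
# Illness estimates from the review, in millions of cases per year.
illnesses_m_per_yr = {
    "community groundwater systems": 5.4,
    "noncommunity groundwater systems": 1.1,
    "municipal surface water systems": 13.0,
}

total = sum(illnesses_m_per_yr.values())
print(f"Total waterborne illnesses: {total:.1f} M/yr")  # 19.5 M/yr
```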

Massive Microbiological Groundwater Contamination Associated with a Waterborne Outbreak in Lake Erie, South Bass Island, Ohio

Fong, T.-T., et al., Environmental Health Perspectives, June 2007

A groundwater-associated outbreak affected approximately 1,450 residents and visitors of South Bass Island, Ohio, between July and September 2004. To examine the microbiological quality of groundwater wells located on South Bass Island, we sampled 16 wells that provide potable water to public water systems during 15–21 September 2004. We tested groundwater wells for fecal indicators, enteric viruses and bacteria, and protozoa (Cryptosporidium and Giardia). The hydrodynamics of Lake Erie were examined to explore the possible surface water–groundwater interactions. All wells were positive for both total coliform and Escherichia coli. Seven wells tested positive for enterococci and Arcobacter (an emerging bacterial pathogen), and F+-specific coliphage was present in four wells. Three wells were positive for all three bacterial indicators, coliphages, and Arcobacter; adenovirus DNA was recovered from two of these wells. We found a cluster of the most contaminated wells at the southeast side of the island. Conclusions: Massive groundwater contamination on the island was likely caused by transport of microbiological contaminants from wastewater treatment facilities and septic tanks to the lake and the subsurface, after extreme precipitation events in May–July 2004. This likely raised the water table, saturated the subsurface, and along with very strong Lake Erie currents on 24 July, forced a surge in water levels and rapid surface water–groundwater interchange throughout the island. Landsat images showed massive influx of organic material and turbidity surrounding the island before the peak of the outbreak. These combinations of factors and information can be used to examine vulnerabilities in other coastal systems. Both wastewater and drinking water issues are now being addressed by the Ohio Environmental Protection Agency and the Ohio Department of Health.

Analysis of Compliance and Characterization of Violations of the Total Coliform Rule

U.S. Environmental Protection Agency, April 2007

Total coliforms have long been used in drinking water regulations as an indicator of the adequacy of water treatment and the integrity of the distribution system. Total coliforms are a group of closely related bacteria that are generally harmless. In drinking water systems, total coliforms react to treatment in a manner similar to most bacterial pathogens and many viral pathogens. Thus, the presence of total coliforms in the distribution system can indicate that the system is also vulnerable to the presence of pathogens in the system. (EPA, June 2001, page 7) Total coliforms are the indicators used in the existing Total Coliform Rule (TCR). EPA is undertaking “a rulemaking process to initiate possible revisions to the TCR. As part of this process, EPA believes it may be appropriate to include this rulemaking in a wider effort to review and address broader issues associated with drinking water distribution systems.” (see Federal Register 68 FR 19030 and 68 FR 42907). Since the promulgation of the TCR, EPA has received stakeholder feedback suggesting modifications to the TCR to reduce the implementation burden. The purpose of this paper is to provide information on the number and frequency of violations of the TCR and to further characterize the frequency with which different types and sizes of systems incur violations. Although EPA explores some statistical testing in this paper, the paper concentrates on presenting the data, as it is, in SDWIS/Fed. Information on these frequencies will be useful in supporting several EPA initiatives, particularly the effort to review and possibly revise the TCR. This paper has been undertaken as part of the review of the TCR.

Drowning in Disinfection Byproducts? Assessing Swimming Pool Water

Zwiener, C., et al., Environmental Science & Technology, January 2007

Disinfection is mandatory for swimming pools: public pools are usually disinfected by gaseous chlorine or sodium hypochlorite and cartridge filters; home pools typically use stabilized chlorine. These methods produce a variety of disinfection byproducts (DBPs), such as trihalomethanes (THMs), which are regulated carcinogenic DBPs in drinking water that have been detected in the blood and breath of swimmers and of nonswimmers at indoor pools. Also produced are halogenated acetic acids (HAAs) and haloketones, which irritate the eyes, skin, and mucous membranes; trichloramine, which is linked with swimming-pool-associated asthma; and halogenated derivatives of UV sun screens, some of which show endocrine effects. Precursors of DBPs include human body substances, chemicals used in cosmetics and sun screens, and natural organic matter. Analytical research has focused also on the identification of an additional portion of unknown DBPs using gas chromatography (GC)/mass spectrometry (MS) and liquid chromatography (LC)/MS/MS with derivatization. Children swimmers have an increased risk of developing asthma and infections of the respiratory tract and ear. A 1.6-2.0-fold increased risk for bladder cancer has been associated with swimming or showering/bathing with chlorinated water. Bladder cancer risk from THM exposure (all routes combined) was greatest among those with the GSTT1-1 gene. This suggests a mechanism involving distribution of THMs to the bladder by dermal/inhalation exposure and activation there by GSTT1-1 to mutagens. DBPs may be reduced by engineering and behavioral means, such as applying new oxidation and filtration methods, reducing bromide and iodide in the source water, increasing air circulation in indoor pools, and assuring the cleanliness of swimmers. The positive health effects gained by swimming can be increased by reducing the potential adverse health risks.

An approach for developing a national estimate of waterborne disease due to drinking water and a national estimate model application

Messner, M., Shaw, S., Regli, S., Rotert, K., Blank, V., Soller, J., Journal of Water and Health, July 2006, 4(Suppl. 2):201–240

In this paper, the US Environmental Protection Agency (EPA) presents an approach and a national estimate of drinking water related endemic acute gastrointestinal illness (AGI) that uses information from epidemiologic studies. There have been a limited number of epidemiologic studies that have measured waterborne disease occurrence in the United States. For this analysis, we assume that a certain unknown incidence of AGI in each public drinking water system is due to drinking water and that a statistical distribution of the different incidence rates for the population served by each system can be estimated to inform a mean national estimate of AGI illness due to drinking water. Data from public water systems suggest that the incidence rate of AGI due to drinking water may vary by several orders of magnitude. In addition, data from epidemiologic studies show AGI incidence due to drinking water ranging from essentially none (or less than the study detection level) to a rate of 0.26 cases per person-year. Considering these two perspectives collectively, and associated uncertainties, EPA has developed an analytical approach and model for generating a national estimate of annual AGI illness due to drinking water. EPA developed a national estimate of waterborne disease to address, in part, the 1996 Safe Drinking Water Act Amendments. The national estimate uses best available science, but also recognizes gaps in the data to support some of the model assumptions and uncertainties in the estimate. Based on the model presented, EPA estimates a mean incidence of AGI attributable to drinking water of 0.06 cases per person-year (with a 95% credible interval of 0.02–0.12). The mean estimate represents approximately 8.5% of cases of AGI illness due to all causes among the population served by community water systems. The estimated incidence translates to 16.4 million cases/year among the same population.
The estimate illustrates the potential usefulness and challenges of the approach, and provides a focus for discussions of data needs and future study designs. Areas of major uncertainty that currently limit the usefulness of the approach are discussed in the context of the estimate analysis.
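The headline numbers in the EPA abstract are internally consistent, which a back-of-envelope calculation confirms. Note that the implied population served is an inference from the quoted figures, not a number reported in the abstract:

```python
# Figures quoted in the EPA estimate.
incidence = 0.06          # AGI cases per person-year attributable to drinking water
total_cases = 16.4e6      # cases/year among the population served
share_of_all_agi = 0.085  # drinking water share of AGI from all causes

# Implied population served by community water systems (~273 M).
population_served = total_cases / incidence

# Implied total all-cause AGI in that population (~193 M cases/yr).
all_cause_cases = total_cases / share_of_all_agi

print(f"Implied population served: {population_served/1e6:.0f} M")
print(f"Implied all-cause AGI: {all_cause_cases/1e6:.0f} M cases/yr")
```

An implied service population of roughly 273 million is plausible for U.S. community water systems, which is a useful sanity check on the quoted incidence and case totals.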

Tap Water Linked to Increase in Bladder Cancer

Reynolds, K.A., Water Conditioning & Purification, July 2006

As water treatment professionals, maybe you’ve been alerted to news stories suggesting a connection between tap water consumption and bladder cancer, but are these headlines true or just media hype? Although the most recently reported association of tap water consumption with bladder cancer is indeed based on numerous epidemiological studies with an international scope, all scientific research must be carefully evaluated; not just in terms of the data found, but also for the information possibly missed. The study that has everyone talking again about tap water consumption and its relationship to bladder cancer was published in the International Journal of Cancer (April 2006). Looking at data from six epidemiological studies, conducted in five countries worldwide (Canada, Finland, France, Italy and two in the United States), a significant association was found between tap water consumption and bladder cancer among men. The risk increased with consumption of greater volumes, suggesting that carcinogenic chemicals in tap water were responsible for the increased risk. While the information presented appears to be sound, it is important to understand the limitations of the study approach so that the data can be appropriately analyzed with respect to public health significance.

Despite a gender bias and inconsistent reports in the historical literature, this study seems to have sturdy legs to stand on or to at least justify continued research. As mentioned earlier, epidemiology is not a very sensitive science and is complicated by unknown confounders. In addition, this study provides no evidence as to what specific factors related to tap water are causing an increase in cancer, where other drinking water sources (i.e., bottled water) show no association. Water is clearly a heterogeneous mix of contaminants, with vast geographical and temporal fluctuations. Little is known about the combined effects of multiple contaminants found in drinking water, thus a study of single contaminants and their association with cancer risks would not provide a complete picture of overall exposures.

Volatile Organic Compounds in the Nation’s Drinking-Water Supply Wells – What Findings May Mean to Human Health

U.S. Geological Survey, June 2006

When volatile organic compounds (VOCs) are detected in samples from drinking-water supply wells, it is important to understand what these results may mean to human health. As a first step toward understanding VOC occurrence in the context of human health, a screening-level assessment was conducted by comparing VOC concentrations to human-health benchmarks. One sample from each of 3,497 domestic and public wells was analyzed for 55 VOCs; samples were collected prior to treatment or blending. At least one VOC was detected in 623 well samples (about 18 percent of all well samples) at a threshold of 0.2 part per billion. Eight of the 55 VOCs had concentrations greater than human-health benchmarks in 45 well samples (about 1 percent of all well samples); these concentrations may be of potential human-health concern if the water were to be ingested without treatment for many years. VOC concentrations were less than human-health benchmarks in most well samples with VOC detections, indicating that adverse effects are unlikely to occur, even if water with such concentrations were to be ingested over a lifetime. Seventeen VOCs may warrant further investigation because their concentrations were greater than, or approached, human-health benchmarks.
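The "about 18 percent" and "about 1 percent" figures in the USGS assessment follow directly from the reported well counts:

```python
# Screening-level counts from the USGS assessment.
n_wells = 3497
n_with_detection = 623   # at least one VOC detected at >= 0.2 ppb
n_above_benchmark = 45   # at least one VOC above a human-health benchmark

pct_detect = 100 * n_with_detection / n_wells
pct_above = 100 * n_above_benchmark / n_wells
print(f"Detections: {pct_detect:.1f}% of wells; above benchmark: {pct_above:.1f}%")
```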

Analysis of Bromate and Bromide in Blood

Quinones, O., Snyder, S.A., Cotruvo, J.A., Fisher, J.W., Toxicology, April 2006

Bromate is a regulated disinfection byproduct primarily associated with the ozonation of water containing bromide, but also is a byproduct of hypochlorite used to disinfect water. To study the pharmacokinetics of bromate, it is necessary to develop a robust and sensitive analytical method for the identification and quantitation of bromate in blood. A critical issue is the extent to which bromate is degraded presystemically and in blood at low (environmentally relevant) doses of ingested bromate as it is delivered to target tissue. A simple isolation procedure was developed using blood plasma spiked with various levels of bromate and bromide. Blood proteins and lipids were precipitated from plasma using acetonitrile. The resulting extracts were analyzed by ion-chromatography with inductively-coupled plasma mass spectrometry (IC-ICP/MS), with a method reporting limit of 5 ng/mL plasma for both bromate and bromide. Plasma samples purchased commercially were spiked with bromate and stored up to 7 days. Over the 7 day storage period, bromate decay remained under 20% for two spike doses. Decay studies in plasma samples from spiked blood drawn from live rats showed significant bromate decay within short periods of time preceding sample freezing, although samples which were spiked, centrifuged and frozen immediately after drawing yielded excellent analytical recoveries.

Research Strategy for Developing Key Information on Bromate’s Mode of Action

Bull, R.J. and Cotruvo, J.A., Toxicology, April 2006

Bromate is produced when ozone is used to treat waters that contain trace amounts of bromide ion. It is also a contaminant of hypochlorite solutions produced by electrolysis of salt that contains bromide. Both ozone and hypochlorite are extensively used to disinfect drinking water, a process that is credited with reducing the incidence of waterborne infectious diseases around the world. In studies on experimental animals, bromate has been consistently demonstrated to induce cancer, although there is evidence of substantial species differences in sensitivity (rat > mouse > hamster). There are no data to indicate bromate is carcinogenic in humans. The continued use of ozone as a disinfectant for drinking water in bromide-containing waters depends heavily on whether current predictions of carcinogenic risk, based on carcinogenic responses in male rats treated with bromate, are accurate at the much lower exposure levels of humans. Thiol-dependent oxidative damage to guanine in DNA is a plausible mode of action for bromate-induced cancer. However, other mechanisms may contribute to the response, including the accumulation of α2u-globulin in the kidney of the male rat. To provide direction to institutions that have an interest in clarifying the toxicological risks that bromate in drinking water might pose, a workshop funded by the Awwa Research Foundation was convened to lay out a research strategy that, if implemented, could clarify this important public health issue. The technical issues that underlie the deliberations of the workshop are provided in a series of technical papers. The present manuscript summarizes the conclusions of the workgroup with respect to the type and timing of research that should be conducted. The research approach is outlined in four distinct phases that lay out alternative directions as the research plan is implemented.
Phase I is designed to quantify pre-systemic degradation, absorption, distribution, and metabolism of bromate and to associate these with key events for the induction of cancer and develop an initial pharmacokinetic (PK) model based on preliminary studies. Phase II will be implemented if it appears that there is a linear relationship between external dose and key event responses and is designed to gather carcinogenesis data in female rats in the absence of α2u-globulin-induced nephropathy, which the workgroup concluded was a probable contributor to the responses observed in the male rats for which detailed dose–response data were collected. If the key events and external dosimetry are found not to be linear in Phase I, Phase III is initiated with a screening study of the auditory toxicity of bromate to determine if it is likely to be exacerbated by chronic exposure. If this occurs, auditory toxicity will be further evaluated in Phase IV. If auditory toxicity is determined unlikely to occur, an alternative chronic study in female rats to the one identified in Phase II will be implemented to include exposure in utero. This was recommended to address the possibility that the fetus may be more susceptible. One of three options is to be implemented in Phase IV, depending upon whether preliminary data indicate that chronic auditory toxicity, reproductive and/or developmental toxicities, or a combination of these outcomes is necessary to characterize the toxicology of low dose exposures to bromate. Each phase of the research will be accompanied by further development of pharmacokinetic models to guide collection of appropriate data to meet the needs of the more sophisticated studies. It is suggested that a Bayesian approach be utilized to develop a final risk model based upon measurement of prior observations from the Phase I studies and the set of posterior observations that would be obtained from whichever chronic study is conducted.

Keywords: Bromate; Research to improve risk assessment; Drinking water

Experimental Results from the Reaction of Bromate Ion with Synthetic and Real Gastric Juices

Keith, J.D., Pacey, G.E., Cotruvo, J.A., Gordon,G., Toxicology, February 2006

This study was designed to identify and quantify the effects of reducing agents on the rate of bromate ion reduction in real and synthetic gastric juice. This could be the first element in the sequence of a pharmacokinetic description of the fate of bromate ion entering the organism, being metabolized, and subsequently being tracked through the system to the target cell or eliminated. Synthetic gastric juice containing H+ and Cl− did exhibit reduced bromate ion levels, but at a rate that was too slow for a significant amount of bromate to be reduced under typical stomach retention time conditions. The reaction orders for Cl− and H+ were 1.50 and 2.0, respectively. Addition of the reducing agents hydrogen sulfide (which was shown to be present and quantified in real gastric juice), glutathione, and/or cysteine increased the rate of bromate ion loss. All of the reactions showed significant pH effects. Half-lives as short as 2 min were measured for bromate ion reduction in 0.17 M H+ and Cl− and 10⁻⁴ M H2S. Therefore, the lifetime of bromate ion in solutions containing typical gastric juice concentrations of H+, Cl−, and H2S is 20–30 min. This rate should result in as much as a 99% reduction of bromate ion during its residence in the stomach. Bromate ion reduction in real gastric juice occurred at a rapid rate. A comparison of real and synthetic gastric juice containing H+, Cl−, cysteine, glutathione, and hydrogen sulfide showed that the component most responsible for the considerable decrease of the concentration of bromate ion in the stomach is hydrogen sulfide.

Keywords: Bromate; Gastric juice; Ion chromatography; Hydrogen sulfide
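The 99% stomach-reduction figure in the Keith et al. abstract is ordinary decay arithmetic. A minimal sketch, assuming first-order kinetics and the shortest measured half-life of about 2 minutes:

```python
# Fraction of bromate remaining after first-order decay for time t_min
# with half-life t_half_min (both in minutes): f = 0.5 ** (t / t_half).
def fraction_remaining(t_min: float, t_half_min: float) -> float:
    return 0.5 ** (t_min / t_half_min)

# With a ~2 min half-life, a 20-min stomach residence leaves ~0.1%
# of the ingested bromate, i.e. more than a 99% reduction.
remaining = fraction_remaining(20, 2)
print(f"Fraction remaining after 20 min: {remaining:.4%}")
```

Under these assumptions even the lower end of the 20–30 min residence range corresponds to ten half-lives, comfortably exceeding the 99% reduction the abstract cites.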