Effective Immediately: Healthcare Facilities Required to Reduce Legionellosis Risks from Tap Water

Published July 2017

By Kelly A. Reynolds, MSPH, PhD

If you follow On Tap regularly, you know that the bacterium Legionella has been a recurring topic in recent years. Once again, Legionella is at the forefront of discussion due to continuing waterborne outbreaks and new directives for prevention in healthcare facilities. On June 2, the Department of Health and Human Services' Centers for Medicare and Medicaid Services (CMS) issued a memo that will undoubtedly expand awareness of Legionella risks and further drive the implementation of preventive approaches.

Nationwide reconnaissance of contaminants of emerging concern in source and treated drinking waters of the United States

Glassmeyer, S.T., et al., Science of the Total Environment, 581-582:909-922, March 2017

When chemical or microbial contaminants are assessed for potential effect or possible regulation in ambient and drinking waters, a critical first step is determining if the contaminants occur and if they are at concentrations that may cause human or ecological health concerns. To this end, source and treated drinking water samples from 29 drinking water treatment plants (DWTPs) were analyzed as part of a two-phase study to determine whether chemical and microbial constituents, many of which are considered contaminants of emerging concern, were detectable in the waters. Of the 84 chemicals monitored in the 9 Phase I DWTPs, 27 were detected at least once in the source water, and 21 were detected at least once in treated drinking water. In Phase II, which was a broader and more comprehensive assessment, 247 chemical and microbial analytes were measured in 25 DWTPs, with 148 detected at least once in the source water, and 121 detected at least once in the treated drinking water. The frequency of detection was often related to the analyte's contaminant class, as pharmaceuticals and anthropogenic waste indicators tended to be infrequently detected and more easily removed during treatment, while per- and polyfluoroalkyl substances and inorganic constituents were both more frequently detected and, overall, more resistant to treatment. The data collected as part of this project will be used to help inform evaluation of unregulated contaminants in surface water, groundwater, and drinking water.

Characterizing pharmaceutical, personal care product, and hormone contamination in a karst aquifer of southwestern Illinois, USA, using water quality and stream flow parameters

Dodgen, L.K., et al., Science of the Total Environment, 578:281-289, February 2017

Karst aquifers are drinking water sources for 25% of the global population. However, the unique geology of karst areas facilitates rapid transfer of surficial chemicals to groundwater, potentially contaminating drinking water. Contamination of karst aquifers by nitrate, chloride, and bacteria has been previously observed, but little is known about the presence of contaminants of emerging concern (CECs), such as pharmaceuticals. Over a 17-month period, 58 water samples were collected from 13 sites in the Salem Plateau, a karst region in southwestern Illinois, United States. Water was analyzed for 12 pharmaceutical and personal care products (PPCPs), 7 natural and synthetic hormones, and 49 typical water quality parameters (e.g., nutrients and bacteria). Hormones were detected in only 23% of samples, with concentrations of 2.2–9.1 ng/L. In contrast, PPCPs were quantified in 89% of groundwater samples. The two most commonly detected PPCPs were the antimicrobial triclocarban, in 81% of samples, and the cardiovascular drug gemfibrozil, in 57%. Analytical results were combined with data on local stream flow, weather, and land use to 1) characterize the extent of aquifer contamination by CECs, 2) cluster sites with similar PPCP contamination profiles, and 3) develop models to describe PPCP contamination. Median detection in karst groundwater was 3 PPCPs at a summed concentration of 4.6 ng/L. Sites clustered into 3 subsets with unique contamination models. PPCP contamination in Cluster I sites was related to stream height, manganese, boron, and heterotrophic bacteria. Cluster II sites were characterized by groundwater temperature, specific conductivity, sodium, and calcium. Cluster III sites were characterized by dissolved oxygen and barium. Across all sites, no single or small set of water quality factors was significantly predictive of PPCP contamination, although gemfibrozil concentrations were strongly related to the sum of PPCPs in karst groundwater.

A decision analysis framework for estimating the potential hazards for drinking water resources of chemicals used in hydraulic fracturing fluids

Yost, E.E., Science of the Total Environment, 574:1544-1558, January 2017

Despite growing concerns over the potential for hydraulic fracturing to impact drinking water resources, there are limited data available to identify chemicals used in hydraulic fracturing fluids that may pose public health concerns. In an effort to explore these potential hazards, a multi-criteria decision analysis (MCDA) framework was employed to analyze and rank selected subsets of these chemicals by integrating data on toxicity, frequency of use, and physicochemical properties that describe transport in water. Data used in this analysis were obtained from publicly available databases compiled by the United States Environmental Protection Agency (EPA) as part of a larger study on the potential impacts of hydraulic fracturing on drinking water. Starting with nationwide hydraulic fracturing chemical usage data from EPA’s analysis of the FracFocus Chemical Disclosure Registry 1.0, MCDAs were performed on chemicals that had either noncancer toxicity values (n = 37) or cancer-specific toxicity values (n = 10). The noncancer MCDA was then repeated for subsets of chemicals reported in three representative states (Texas, n = 31; Pennsylvania, n = 18; and North Dakota, n = 20). Within each MCDA, chemicals received scores based on relative toxicity, relative frequency of use, and physicochemical properties (mobility in water, volatility, persistence). Results show a relative ranking of these chemicals based on hazard potential, and provide preliminary insight into chemicals that may be more likely than others to impact drinking water resources. Comparison of nationwide versus state-specific analyses indicates regional differences in the chemicals that may be of more concern to drinking water resources, although many chemicals were commonly used and received similar overall hazard rankings. Several chemicals highlighted by these MCDAs have been reported in groundwater near areas of hydraulic fracturing activity. This approach is intended as a preliminary analysis, and represents one possible method for integrating data to explore potential public health impacts.
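
The paper's MCDA integrates scores for toxicity, frequency of use, and transport properties into a single hazard ranking. For readers curious about the general shape of such an analysis, here is a minimal weighted-sum sketch in Python; the chemical names, criterion scores, and equal weights are all invented for illustration and do not reflect EPA's actual rubric or data.

```python
# Minimal sketch of a weighted-sum multi-criteria decision analysis (MCDA).
# All chemicals, scores, and weights are hypothetical placeholders.

# Each criterion is scored on a common 0-1 scale (higher = greater hazard).
chemicals = {
    #             toxicity, frequency of use, mobility in water
    "chemical_A": (0.9, 0.2, 0.6),
    "chemical_B": (0.4, 0.8, 0.7),
    "chemical_C": (0.7, 0.5, 0.3),
}
weights = (1.0, 1.0, 1.0)  # equal weighting, as a neutral default

def mcda_score(scores, weights):
    """Weighted sum of normalized criterion scores."""
    return sum(s * w for s, w in zip(scores, weights))

ranking = sorted(chemicals, key=lambda c: mcda_score(chemicals[c], weights),
                 reverse=True)
for rank, name in enumerate(ranking, start=1):
    print(rank, name, round(mcda_score(chemicals[name], weights), 2))
```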

A national reconnaissance of trace organic compounds (TOCs) in United States lotic ecosystems

Bernot, M.J., et al., Science of the Total Environment, 572:422-433, December 2016

We collaborated with 26 groups from universities across the United States to sample 42 sites for 33 trace organic compounds (TOCs) in water and sediments of lotic ecosystems. Our goals were 1) to further develop a national database of TOC abundance in United States lotic ecosystems that can be a foundation for future research and management, and 2) to identify factors related to compound abundance. Trace organic compounds were found in 93% of water samples and 56% of sediment samples. Dissolved concentrations were 10–1000× higher than sediment concentrations. The ten most common compounds in water samples, with detection frequency and maximum concentration, were sucralose (87.5%, 12,000 ng/L), caffeine (77.5%, 420 ng/L), sulfamethoxazole (70%, 340 ng/L), cotinine (65%, 130 ng/L), venlafaxine (65%, 1800 ng/L), carbamazepine (62.5%, 320 ng/L), triclosan (55%, 6800 ng/L), azithromycin (15%, 970 ng/L), diphenhydramine (40%, 350 ng/L), and desvenlafaxine (35%, 4600 ng/L). In sediment, the most common compounds were venlafaxine (32.5%, 19 ng/g), diphenhydramine (25%, 41 ng/g), azithromycin (15%, 11 ng/g), fluoxetine (12.5%, 29 ng/g) and sucralose (12.5%, 16 ng/g). Refractory compounds such as sucralose may be good indicators of TOC contamination in lotic ecosystems, as there was a correlation between dissolved sucralose concentrations and the total number of compounds detected in water. Discharge and human demographic (population size) characteristics were not good predictors of compound abundance in water samples. This study further confirms the ubiquity of TOCs in lotic ecosystems. Although the concentrations measured rarely approached acute aquatic-life criteria, the chronic effects, bioaccumulative potential, and potential mixture effects of multiple compounds are relatively unknown.

Atrazine in Kentucky drinking water: intermethod comparison of U.S. Environmental Protection Agency analytical methods 507 and 508.1

Suhl, J., et al., Journal of Environmental Health, 79(5):E1-E6, December 2016

This study examines the analytical methods used to test drinking water for atrazine along with the seasonal variation of atrazine in drinking water. Samples from 117 counties throughout Kentucky from January 2000 to December 2008 were analyzed. Methods 507 and 508.1 were compared using the Mann-Whitney U test. Median values of these methods were similar (p = .7421). To examine seasonal variation, data from each year and from the entire period were analyzed using one-way ANOVA; pairwise multiple comparisons were made for years with significant differences. All the years except 2001, 2005, 2006, and 2007 had significantly different atrazine concentrations between seasons. The Seasonal Kendall Test for Trend was used to identify trends in atrazine over time. Yearly means ranged from 0.000043 mg/L (± 0.000011 mg/L) to 0.000995 mg/L (± 0.000510 mg/L). The highest levels were observed during spring in most years. A significant (p = .000092) decreasing trend of −7.6 × 10⁻⁶ mg/L/year was found. Decreasing trends were also present in all five regions of the state during this period. This study illustrates the need for changes in current sampling methodology so that effective assessments of the public's exposure to atrazine in drinking water can be conducted.
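
For readers who want to see the shape of the two statistical comparisons described above, here is a minimal Python sketch using SciPy; the atrazine values are invented placeholders, not the Kentucky data.

```python
# Sketch of the intermethod and seasonal comparisons described above,
# using SciPy. All concentration values are invented placeholders.
from scipy.stats import mannwhitneyu, f_oneway

method_507  = [0.00010, 0.00022, 0.00015, 0.00041, 0.00009]  # mg/L
method_5081 = [0.00012, 0.00019, 0.00016, 0.00038, 0.00011]  # mg/L

# Intermethod comparison: are the two methods' distributions distinguishable?
u_stat, p_value = mannwhitneyu(method_507, method_5081)
print("Mann-Whitney U p-value:", round(p_value, 4))

# Seasonal variation: one-way ANOVA across seasons within a year.
spring = [0.00090, 0.00075, 0.00102]
summer = [0.00031, 0.00028, 0.00035]
fall   = [0.00012, 0.00010, 0.00015]
winter = [0.00008, 0.00011, 0.00009]
f_stat, p_season = f_oneway(spring, summer, fall, winter)
print("Seasonal ANOVA p-value:", round(p_season, 6))
```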

Widespread copper and lead contamination of household drinking water, New South Wales, Australia

Harvey, P.J., et al., Environmental Research, 151:275-285, November 2016

This study examines arsenic, copper, lead and manganese drinking water contamination at the domestic consumer's kitchen tap in homes of New South Wales, Australia. Analysis of 212 first-draw drinking water samples shows that almost 100% and 56% of samples contain detectable concentrations of copper and lead, respectively. Of these detectable concentrations, copper exceeds the Australian Drinking Water Guidelines (ADWG) in 5% of samples and lead in 8%. By contrast, no samples contained arsenic or manganese concentrations in excess of the ADWG. Analysis of household plumbing fittings (taps and connecting pipework) shows that these are a significant source of drinking water lead contamination. Water lead concentrations derived from plumbing components range from 108 µg/L to 1440 µg/L (n=28, mean 328 µg/L, median 225 µg/L). Analysis of kitchen tap fittings demonstrates that these are a primary source of drinking water lead contamination (n=9, mean 63.4 µg/L, median 59.0 µg/L). The results of this study demonstrate that, along with other potential sources of contamination in households, plumbing products containing up to 2.84% lead are contributing to contamination of household drinking water. Given that both copper and lead are known to cause significant adverse health effects, products for use in contact with drinking water should be manufactured free of copper and lead.

Pb-Sr isotopic and geochemical constraints on sources and processes of lead contamination in well waters and soil from former fruit orchards, Pennsylvania, USA: A legacy of anthropogenic activities

Ayuso, R.A., and Foley, N.K., Journal of Geochemical Exploration, 170:125-147, November 2016

Isotopic discrimination can be an effective tool in establishing a direct link between sources of Pb contamination and the presence of anomalously high concentrations of Pb in waters, soils, and organisms. Residential wells supplying water containing up to 1600 ppb Pb to houses built on the former Mohr orchards commercial site, near Allentown, PA, were evaluated to discern anthropogenic from geogenic sources. Pb (n = 144) and Sr (n = 40) isotopic data and REE (n = 29) data were determined for waters from residential wells, test wells (drilled for this study), and surface waters from pond and creeks. Local soils, sediments, bedrock, Zn-Pb mineralization and coal were also analyzed (n = 94), together with locally used Pb-As pesticide (n = 5). Waters from residential and test wells show overlapping values of 206Pb/207Pb, 208Pb/207Pb and 87Sr/86Sr. Larger negative Ce anomalies (Ce/Ce*) distinguish residential wells from test wells. Results show that residential and test well waters, sediments from residential water filters in water tanks, and surface waters display broad linear trends in Pb isotope plots. Pb isotope data for soils, bedrock, and pesticides have contrasting ranges and overlapping trends. Contributions of Pb from soils to residential well waters are limited and implicated primarily in wells having shallow water-bearing zones and carrying high sediment contents. Pb isotope data for residential wells, test wells, and surface waters show substantial overlap with Pb data reflecting anthropogenic actions (e.g., burning fossil fuels, industrial and urban processing activities). Limited contributions of Pb from bedrock, soils, and pesticides are evident. High Pb concentrations in the residential waters are likely related to sediment buildup in residential water tanks. Redox reactions, triggered by influx of groundwater via wells into the residential water systems and leading to subtle changes in pH, are implicated in precipitation of Fe oxyhydroxides, oxidative scavenging of Ce(IV), and desorption and release of Pb into the residential water systems. The Pb isotope features in the residences and the region are best interpreted as reflecting a legacy of industrial Pb present in underlying aquifers that currently supply the drinking water wells.

Malodorous volatile organic sulfur compounds: Sources, sinks and significance in inland waters

Watson, S.B., and Jüttner, F., Critical Reviews in Microbiology, 43(2):210-237, November 2016

Volatile Organic Sulfur Compounds (VOSCs) are instrumental in global S-cycling and greenhouse gas production. VOSCs occur across a diversity of inland waters, and with widespread eutrophication and climate change, are increasingly linked with malodours in organic-rich waterbodies and drinking-water supplies. Compared with marine systems, the role of VOSCs in biogeochemical processes is far less well characterized for inland waters, and often involves different physicochemical and biological processes. This review provides an updated synthesis of VOSCs in inland waters, focusing on compounds known to cause malodours. We examine the major limnological and biochemical processes involved in the formation and degradation of alkylthiols, dialkylsulfides, dialkylpolysulfides, and other organosulfur compounds under different oxygen, salinity and mixing regimes, and key phototrophic and heterotrophic microbial producers and degraders (bacteria, cyanobacteria, and algae) in these environs. The data show VOSC levels that vary significantly, sometimes far exceeding human odor thresholds, generated by a diversity of biota, biochemical pathways, enzymes and precursors. We also draw attention to major issues with sampling and analytical artifacts that bias results and preclude comparisons among studies, and highlight significant knowledge gaps that need addressing with careful, appropriate methods to provide a more robust understanding of the potential effects of continued global development.

Occurrence of DBPs in Drinking Water of European Regions for Epidemiology Studies

Krasner, S.J., et al., American Water Works Association Journal, 108(10):501-512, October 2016

A three-year study was conducted on the occurrence of disinfection by-products (DBPs) – trihalomethanes (THMs), haloacetic acids (HAAs), and haloacetonitriles – in drinking water of regions of Europe where epidemiology studies were being carried out. Thirteen systems in six countries (Italy, France, Greece, Lithuania, Spain, United Kingdom) were sampled. Typically, chlorinated DBPs dominated; however, in most of Catalonia (Spain) and in Heraklion (Greece), brominated DBPs dominated. The degree of bromine incorporation was generally similar across the DBP classes. This is important, as brominated DBPs are a greater health concern. In parts of Catalonia, the reported levels of tribromoacetic acid were higher than in other parts of the world. In some regions, HAA levels tended to peak in a different time period than THM levels. In most epidemiology studies, THMs are used as a surrogate for other halogenated DBPs. This study provides exposure assessment information for epidemiology studies.

Origin of Hexavalent Chromium in Drinking Water Wells from the Piedmont Aquifers of North Carolina

Vengosh, A., et al., Environmental Science & Technology Letters, 3(12):409-414, October 2016

Hexavalent chromium [Cr(VI)] is a known pulmonary carcinogen. Recent detection of Cr(VI) in drinking water wells in North Carolina has raised public concern about contamination of drinking water wells by nearby coal ash ponds. Here we report, for the first time, the prevalence of Cr and Cr(VI) in drinking water wells from the Piedmont region of central North Carolina, combined with a geochemical analysis to determine the source of the elevated Cr(VI) levels. We show that Cr(VI) is the predominant species of dissolved Cr in groundwater and that elevated levels of Cr and Cr(VI) are found in wells located both near and far (>30 km) from coal ash ponds. The geochemical characteristics, including the overall chemistry, boron to chromium ratios, and strontium isotope (87Sr/86Sr) variations in groundwater with elevated Cr(VI) levels, are different from those of coal ash leachates. Instead, the groundwater chemistry and Sr isotope variations are consistent with water–rock interactions as the major source of Cr(VI) in groundwater. Our results indicate that Cr(VI) is most likely naturally occurring and ubiquitous in groundwater from the Piedmont region in the eastern United States, which could pose health risks to residents in the region who consume well water as a major drinking water source.

The precautionary principle and chemicals management: The example of perfluoroalkyl acids in groundwater

Cousins, I.T., et al., Environment International, 94:331-340, September 2016

As early as the late 1990s, microgram-per-liter levels of perfluorooctane sulfonate (PFOS) were measured in water samples from areas where fire-fighting foams were used or spilled. Despite these early warnings, the problems of groundwater, and thus drinking water, contaminated with perfluoroalkyl and polyfluoroalkyl substances (PFASs) including PFOS are only beginning to be addressed. It is clear that this PFAS contamination is poorly reversible and that the societal costs of clean-up will be high. This inability to reverse exposure in a reasonable timeframe is a major motivation for application of the precautionary principle in chemicals management. We conclude that exposure can be poorly reversible: 1) due to slow elimination kinetics in organisms, or 2) due to poorly reversible environmental contamination that leads to continuous exposure. In the second case, which is relevant for contaminated groundwater, the reversibility of exposure is not related to the magnitude of a chemical's bioaccumulation potential. We argue therefore that all PFASs entering groundwater, irrespective of their perfluoroalkyl chain length and bioaccumulation potential, will result in poorly reversible exposures and risks as well as further clean-up costs for society. To protect groundwater resources for future generations, society should consider a precautionary approach to chemicals management and prevent the use and release of highly persistent and mobile chemicals such as PFASs.

Drinking water lead regulations: impact on the brass value chain

Estelle, A.A., Materials Science and Technology, 32(17):1763-1770, August 2016

A detailed review of regulations restricting the use of lead in potable water systems is provided for several regions, including the United States (U.S.), Canada, the European Union (E.U.) and Japan, to assess the impact on the brass value chain. Covered topics include: chronology of regulations, governing bodies, compliance requirements, enforcement mechanisms and other aspects relevant to metal suppliers, original equipment manufacturers, designers, specifiers, end-users and recyclers of brass. The development and use of lead-free brass alloys, and how these materials have impacted manufacturing and recycling processes, are also addressed.

Temporal variation in groundwater quality in the Permian Basin of Texas, a region of increasing unconventional oil and gas development

Hildenbrand, Z.L., et al., Science of the Total Environment, 562:906-913, August 2016

The recent expansion of natural gas and oil extraction using unconventional oil and gas development (UD) practices such as horizontal drilling and hydraulic fracturing has raised questions about the potential for environmental impacts. Prior research has focused on evaluations of air and water quality in particular regions without explicitly considering temporal variation; thus, little is known about the potential effects of UD activity on the environment over longer periods of time. Here, we present an assessment of private well water quality in an area of increasing UD activity over a period of 13 months. We analyzed samples from 42 private water wells located in three contiguous counties on the Eastern Shelf of the Permian Basin in Texas. This area has experienced a rise in UD activity in the last few years, and we analyzed samples at four separate time points to assess variation in groundwater quality over time as UD activities increased. We monitored general water quality parameters as well as several compounds used in UD activities. We found that some constituents remained stable over time, but others experienced significant variation over the period of study. Notable findings include significant changes in total organic carbon and pH along with ephemeral detections of ethanol, bromide, and dichloromethane after the initial sampling phase. These data provide insight into the potentially transient nature of compounds associated with groundwater contamination in areas experiencing UD activity.

Detection of Poly- and Perfluoroalkyl Substances (PFASs) in U.S. Drinking Water Linked to Industrial Sites, Military Fire Training Areas, and Wastewater Treatment Plants

Andrews, D.Q., et al., Environmental Science & Technology Letters, August 2016

Drinking water contamination with poly- and perfluoroalkyl substances (PFASs) poses risks to the developmental, immune, metabolic, and endocrine health of consumers. We present a spatial analysis of 2013–2015 national drinking water PFAS concentrations from the U.S. Environmental Protection Agency's (US EPA) third Unregulated Contaminant Monitoring Rule (UCMR3) program. The number of industrial sites that manufacture or use these compounds, the number of military fire training areas, and the number of wastewater treatment plants are all significant predictors of PFAS detection frequencies and concentrations in public water supplies. Among samples with detectable PFAS levels, each additional military site within a watershed's eight-digit hydrologic unit is associated with a 20% increase in PFHxS, a 10% increase in both PFHpA and PFOA, and a 35% increase in PFOS. The number of civilian airports with personnel trained in the use of aqueous film-forming foams is significantly associated with the detection of PFASs above the minimal reporting level. We find that drinking water supplies for 6 million U.S. residents exceed the US EPA's lifetime health advisory (70 ng/L) for PFOS and PFOA. Lower analytical reporting limits and additional sampling of smaller utilities serving fewer than 10,000 individuals, as well as private wells, would greatly assist in further identifying PFAS contamination sources.
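
The "20% increase per additional site" phrasing is characteristic of a log-linear association, where a regression coefficient b on log-transformed concentration translates to a (e^b − 1) × 100% change per unit of the predictor. A minimal sketch of that interpretation, with invented data:

```python
# Sketch of a log-linear association between source counts and measured
# concentration. All data values below are invented placeholders.
import numpy as np

sites = np.array([0, 1, 1, 2, 3, 4, 5])  # e.g., military sites in a HUC-8
conc = np.array([4.0, 5.1, 4.7, 6.0, 7.1, 8.4, 10.2])  # ng/L, detects only

# Regress log concentration on site count; exponentiate the slope to get
# the multiplicative (percent) change per additional site.
slope, intercept = np.polyfit(sites, np.log(conc), 1)
pct_per_site = (np.exp(slope) - 1) * 100
print(f"~{pct_per_site:.0f}% change in concentration per additional site")
```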

Cyto- and genotoxic profile of groundwater used as drinking water supply before and after disinfection

Pellacani, C., et al., Journal of Water and Health, 14(6):901-913, July 2016

The assessment of the toxicological properties of raw groundwater may be useful to predict the type and quality of tap water. Contaminants in groundwater are known to be able to affect the disinfection process, resulting in the formation of substances that are cytotoxic and/or genotoxic. Though the European directive 98/83/EC, which establishes maximum levels for contaminants in raw water (RW), provides threshold levels for acute exposure to toxic compounds, the law does not take into account chronic exposure to low doses of pollutants present in complex mixtures. The purpose of this study was to evaluate the cyto- and genotoxic load in groundwater of two water treatment plants in Northern Italy. Water samples induced cytotoxic effects, mainly observed when human cells were treated with RW. Moreover, results indicated that the disinfection process reduced cell toxicity, independently of the biocide used. Genotoxic effects were found, in particular, when the micronucleus assay was carried out on raw groundwater. These results suggest that it is important to include bio-toxicological assays as additional parameters in water quality monitoring programs, as their use would allow the evaluation of the potential risk of groundwater for humans.

Emerging contaminant uncertainties and policy: The chicken or the egg conundrum

Naidu, R., et al., Chemosphere, 154:385-390, July 2016

Best practice in regulating contaminants of emerging concern (CEC) must involve the integration of science and policy, and be defensible and accepted by diverse stakeholders. Key elements of CEC frameworks include identification and prioritisation of emerging contaminants; evaluation of health and environmental impacts from key matrices such as soil, groundwater, surface waters and sediment; assessments of available data, methods and technologies (and their limitations); and mechanisms to take cognisance of diverse interests. This paper discusses one of the few frameworks designed for emerging contaminants, the Minnesota Department of Health (MDH) Drinking Water Contaminants of Emerging Concern (CEC) program. Further review of mechanisms for CECs in other jurisdictions reveals that only a small number of regulatory and guidance regimes exist globally. There is also merit in a formal mechanism for the global exchange of knowledge and outcomes associated with CECs of global interest.

Emerging contaminants in the environment: Risk-based analysis for better management

Naidu, R., et al., Chemosphere, 154:350-357, July 2016

Emerging contaminants (ECs) are chemicals of synthetic origin, or derived from a natural source, that have recently been discovered and for which environmental or public health risks are yet to be established. This is due to limited available information on their interactions with, and toxicological impacts on, receptors. Several types of ECs exist, such as antibiotics, pesticides, pharmaceuticals, personal care products, effluents, certain naturally occurring contaminants and, more recently, nanomaterials. ECs may derive from a known source, for example released directly to the aquatic environment from direct discharges such as those from wastewater treatment plants. Although in most instances the direct source cannot be identified, ECs have been detected in virtually every country's natural environment, and as a consequence they represent a global problem. There is very limited information on the fate and transport of ECs in the environment and their toxicological impact. This lack of information can be attributed to limited financial resources and to the lack of analytical techniques for detecting their effects on ecosystems and human health, alone or in mixtures. We do not know how ECs interact with each other or with other contaminants. This paper presents an overview of existing knowledge on ECs, their fate and transport, and a risk-based analysis for EC management, with complementary strategies.

Potential corrosivity of untreated groundwater in the United States

Belitz, K., et al., U.S. Geological Survey Scientific Investigations Report 2016-5092, July 2016

Corrosive groundwater, if untreated, can dissolve lead and other metals from pipes and other components in water distribution systems. Two indicators of potential corrosivity—the Langelier Saturation Index (LSI) and the Potential to Promote Galvanic Corrosion (PPGC)—were used to identify which areas in the United States might be more susceptible to elevated concentrations of metals in household drinking water and which areas might be less susceptible. On the basis of the LSI, about one-third of the samples collected from about 21,000 groundwater sites are classified as potentially corrosive. On the basis of the PPGC, about two-thirds of the samples collected from about 27,000 groundwater sites are classified as moderate PPGC, and about one-tenth as high PPGC. Potentially corrosive groundwater occurs in all 50 states and the District of Columbia. National maps have been prepared to identify the occurrence of potentially corrosive groundwater in the 50 states and the District of Columbia. Eleven states and the District of Columbia were classified as having a very high prevalence of potentially corrosive groundwater, 14 states as having a high prevalence, 19 states as having a moderate prevalence, and 6 states as having a low prevalence. These findings have the greatest implication for people dependent on untreated groundwater for drinking water, such as the 44 million people who are self-supplied and depend on domestic wells or springs for their water supply.
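
The LSI itself is simple to compute: LSI = pH − pHs, where pHs is the pH at which the water would be saturated with calcium carbonate. Below is a minimal Python sketch using a widely cited empirical approximation for pHs; the USGS report's exact procedure may differ, and the sample values are hypothetical. A negative LSI suggests water that can dissolve CaCO3 scale and is potentially corrosive.

```python
# Minimal sketch of the Langelier Saturation Index (LSI), using a common
# textbook approximation for pHs. Sample inputs are hypothetical.
import math

def langelier_index(ph, temp_c, tds_mg_l, ca_hardness_mg_l, alkalinity_mg_l):
    """LSI = pH - pHs. Calcium hardness and alkalinity are as mg/L CaCO3."""
    a = (math.log10(tds_mg_l) - 1) / 10
    b = -13.12 * math.log10(temp_c + 273) + 34.55
    c = math.log10(ca_hardness_mg_l) - 0.4
    d = math.log10(alkalinity_mg_l)
    phs = (9.3 + a + b) - (c + d)
    return ph - phs

# Hypothetical groundwater sample; a negative result flags corrosive water.
lsi = langelier_index(ph=6.8, temp_c=15, tds_mg_l=250,
                      ca_hardness_mg_l=80, alkalinity_mg_l=60)
print(round(lsi, 2))
```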

Human Health Risk Assessment of Chromium in Drinking Water: A Case Study of Sukinda Chromite Mine, Odisha, India

Naz, A., et al., Exposure and Health, 8(2):253-264, June 2016

The present study aims to evaluate the human health risk of Cr(VI) and Cr(III) via oral and dermal exposure to drinking water, using groundwater samples from near the Sukinda chromite mine. The risk assessment for each location was carried out using mathematical models per IRIS guidelines, with input parameters chosen for the Indian context. The concentrations of TCr and Cr(VI) were found in the range of 48.7–250.2 and 21.4–115.2 μg/l, respectively. At most locations, TCr and Cr(VI) concentrations were 2.3–6 times and 2.1–11.5 times higher, respectively, than the permissible limits set by standard statutory bodies. The total cumulative average cancer risk and non-cancer risk (hazard quotient) were 2.04E−03 and 1.37 in the male population and 1.73E−03 and 1.16 in the female population, respectively, indicating 'very high' cancer risk and 'medium' non-cancer risk per USEPA guidelines. The male population showed 1.2 times higher cancer and non-cancer risk than the female population because of males' higher water ingestion rate. Health risk via the dermal route was found to be 6 times lower than via oral ingestion, owing to the very short dermal exposure time (0.58 h/day). Notably, 'high' cancer risk was also recorded at one location where the TCr concentration was within the permissible limit, because of a higher proportion of bioavailable Cr(VI). Sensitivity analysis of input parameters for cancer and non-cancer risk revealed that Cr(VI) and Cr(III) concentrations were the predominant parameters, followed by exposure duration, body weight, averaging time, and dermal slope factor.
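
The cancer-risk and hazard-quotient figures quoted above follow the standard USEPA/IRIS-style exposure equations. A minimal sketch is below; the toxicity values and exposure inputs are illustrative defaults, not the study's Indian-context parameters.

```python
# Sketch of the standard exposure-and-risk equations behind the abstract's
# numbers. All parameter values are illustrative, not the study's inputs.
def chronic_daily_intake(c_mg_l, ir_l_day, ef_days_yr, ed_yr, bw_kg, at_days):
    """CDI (mg/kg-day) = (C * IR * EF * ED) / (BW * AT)."""
    return (c_mg_l * ir_l_day * ef_days_yr * ed_yr) / (bw_kg * at_days)

# Hypothetical inputs: 0.1 mg/L Cr(VI), 2.5 L/day ingestion, 70-kg adult.
cdi = chronic_daily_intake(c_mg_l=0.1, ir_l_day=2.5, ef_days_yr=350,
                           ed_yr=30, bw_kg=70, at_days=70 * 365)

SF_ORAL = 0.5     # oral slope factor, (mg/kg-day)^-1 (illustrative value)
RFD_ORAL = 0.003  # oral reference dose, mg/kg-day (illustrative value)

cancer_risk = cdi * SF_ORAL        # unitless lifetime excess cancer risk
hazard_quotient = cdi / RFD_ORAL   # HQ > 1 flags potential non-cancer concern

print(f"cancer risk ~ {cancer_risk:.1e}, HQ ~ {hazard_quotient:.2f}")
```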

Vulnerability of drinking water supplies to engineered nanoparticles

Troester, M., et al., Water Research, 96:255-279, June 2016

The production and use of engineered nanoparticles (ENPs) inevitably leads to their release into aquatic environments, with the quantities involved expected to increase significantly in the future. Concerns therefore arise over the possibility that ENPs might pose a threat to drinking water supplies. Investigations into the vulnerability of drinking water supplies to ENPs are hampered by the absence of suitable analytical methods capable of detecting and quantifying ENPs in complex aqueous matrices. Analytical data concerning the presence of ENPs in drinking water supplies are therefore scarce. The eventual fate of ENPs in the natural environment and in processes that are important for drinking water production is currently being investigated through laboratory-based experiments and modelling. Although the information obtained from these studies may not, as yet, be sufficient to allow comprehensive assessment of the complete life-cycle of ENPs, it does provide a valuable starting point for predicting the significance of ENPs to drinking water supplies. This review therefore addresses the vulnerability of drinking water supplies to ENPs. The risk of ENPs entering drinking water is discussed and predicted for drinking water produced from groundwater and from surface water. Our evaluation is based on reviewing published data concerning ENP production amounts and release patterns, the occurrence and behavior of ENPs in aquatic systems relevant to drinking water supply, and ENP removability in drinking water purification processes. Quantitative predictions are made based on realistic high-input case scenarios. The results of our synthesis of current knowledge suggest that the probability of ENPs being present in surface water resources is generally limited, but that particular local conditions may increase the probability of raw water contamination by ENPs. Drinking water extracted from porous-media aquifers is not generally considered to be prone to ENP contamination. In karstic aquifers, however, there is an increased probability that any ENPs entering the groundwater system will reach the extraction point of a drinking water treatment plant (DWTP). The ability to remove ENPs during water treatment depends on the specific design of the treatment process. In conventional DWTPs with no flocculation step, a proportion of ENPs, if present in the raw water, may reach the final drinking water. The use of ultrafiltration techniques improves drinking water safety with respect to ENP contamination.

The Flint Water Crisis Confirms That U.S. Drinking Water Needs Improved Risk Management

Baum, R., et al., Environmental Science & Technology, 50(11):5436-5437, May 2016

This article focuses on existing public health concerns that the current regulatory system has repeatedly failed to address in protecting US residents. Water system failures have been extensively analyzed, leading to the conclusion that most could have been prevented with better risk management. Recent research shows that the most commonly reported reasons for failure are time and money constraints, but these may also reflect a lack of policy priority shared by the utility and regulator. U.S. public drinking water systems are focused on meeting nationally defined regulations that target certain maximum contaminant levels (MCLs) and specific treatment techniques.

Assessing clarity of message communication for mandated USEPA drinking water quality reports

Davy, B.M., et al., Journal of Water and Health, 14(2):223-235, April 2016

The United States Environmental Protection Agency mandates that community water systems (CWSs), or drinking water utilities, provide annual consumer confidence reports (CCRs) reporting on water quality, compliance with regulations, source water, and consumer education. While certain report formats are prescribed, there are no criteria ensuring that consumers understand messages in these reports. To assess clarity of message, trained raters evaluated a national sample of 30 CCRs using the Centers for Disease Control and Prevention Clear Communication Index (Index) indices: (1) Main Message/Call to Action; (2) Language; (3) Information Design; (4) State of the Science; (5) Behavioral Recommendations; (6) Numbers; and (7) Risk. Communication materials are considered qualifying if they achieve a 90% Index score. The overall mean score across CCRs was 50 ± 14%, and none scored 90% or higher. CCRs did not differ significantly by water system size. The State of the Science (3 ± 15%) and Behavioral Recommendations (77 ± 36%) indices were the lowest and highest, respectively. Only 63% of CCRs explicitly stated whether the water was safe to drink according to federal and state standards and regulations. None of the CCRs had passing Index scores, signaling that CWSs are not effectively communicating with their consumers; thus, the Index can serve as an evaluation tool for CCR effectiveness and a guide to improve water quality communications.

Inactivation Kinetics and Replication Cycle Inhibition of Adenovirus by Monochloramine

Gall, A.M., et al., Environmental Science & Technology Letters, 3(4):185-189, April 2016

Monochloramine is commonly used as a secondary disinfectant to maintain a residual in drinking water distribution systems in the United States. The mechanism by which waterborne viruses become inactivated by monochloramine remains largely unknown. A more fundamental understanding of how viruses become inactivated is necessary for better detection and control of viruses in drinking water. Human adenovirus (HAdV) is known to be the waterborne virus most resistant to monochloramine disinfection, and this study presents inactivation kinetics over a range of environmental conditions. Several steps in the HAdV replication cycle were investigated to determine which steps become inhibited by monochloramine disinfection. Interestingly, monochloramine-inactivated HAdV could bind to host cells, but genome replication and early and late mRNA transcription were inhibited. We conclude that monochloramine exposure inhibited a replication cycle event after binding but prior to early viral protein synthesis.
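
As a generic illustration of how disinfection kinetics like these are often summarized, the sketch below fits a Chick-Watson (pseudo-first-order) model to invented survival data; the study's actual kinetic model and measurements may differ.

```python
# Sketch of a Chick-Watson fit: log10 survival is regressed against CT
# (disinfectant concentration x contact time). Data are invented.
import numpy as np

ct = np.array([0, 50, 100, 200, 400])  # mg*min/L of monochloramine
log10_survival = np.array([0.0, -0.4, -0.9, -1.8, -3.5])  # log10(N/N0)

# Fit log10(N/N0) = -k * CT by linear regression.
k_neg, intercept = np.polyfit(ct, log10_survival, 1)
k = -k_neg
print(f"fitted rate constant k = {k:.4f} L/(mg*min)")

# Predicted CT required for 4-log (99.99%) inactivation:
print(f"CT for 4-log inactivation ~ {4 / k:.0f} mg*min/L")
```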

Determination of dimethyl selenide and dimethyl sulphide compounds causing off-flavours in bottled mineral waters

Guadayol, M., et al., Water Research, 92:149-155, April 2016

Sales of bottled drinking water have grown strongly during the last two decades due to the general belief that this kind of water is healthier, tastes better and carries lower consumption risk than tap water. Consumers are accordingly more demanding about bottled mineral water, especially regarding its organoleptic properties, such as taste and odour. This work studies the compounds that can generate obnoxious smells, which consumers have described as swampy, rotten eggs, sulphurous, cooked vegetable or cabbage. Closed loop stripping analysis (CLSA) was used as a pre-concentration method for the analysis of off-flavour compounds in water, followed by identification and quantification by GC-MS. Several bottled waters with the aforementioned smells showed the presence of volatile dimethyl selenides and dimethyl sulphides, whose concentrations ranged, respectively, from 4 to 20 ng/L and from 1 to 63 ng/L. The low odour threshold concentrations (OTCs) of both the organic selenide and sulphide derivatives prove that several objectionable odours in bottled waters arise from them. Microbial loads inherent to water sources, along with some critical conditions in water processing, could contribute to the formation of these compounds. There are few studies of volatile organic compounds in bottled drinking water and, to the best of our knowledge, this is the first study reporting the presence of dimethyl selenides and dimethyl sulphides causing odour problems in bottled waters.

Multimedia exposures to arsenic and lead for children near an inactive mine tailings and smelter site

Loh, M.M., et al., Environmental Research, 146:331-339, April 2016

Children living near contaminated mining waste areas may have high exposures to metals from the environment. This study investigates whether exposure to arsenic and lead is higher in children in a community near a legacy mine and smelter site in Arizona than in children in other parts of the United States, and the relationship of that exposure to the site. Arsenic and lead were measured in residential soil, house dust, tap water, urine, and toenail samples from 70 children in 34 households up to 7 miles from the site. Soil and house dust were sieved, digested, and analyzed via ICP-MS. Tap water and urine were analyzed without digestion, while toenails were washed, digested and analyzed. Blood lead was analyzed by an independent, certified laboratory. Spearman correlation coefficients were calculated between each environmental medium and urine and toenails for arsenic and lead. Geometric mean (standard deviation) arsenic concentrations were 22.1 (2.59) ppm for soil and 12.4 (2.27) ppm for house dust.

Reduction in horizontal transfer of conjugative plasmid by UV irradiation and low-level chlorination

Lin, W., et al., Water Research, 91:331-338, March 2016

The widespread presence of antibiotic resistance genes (ARGs) and antibiotic resistant bacteria (ARB) in the drinking water system facilitates their horizontal gene transfer among microbiota. In this study, the conjugative transfer of the RP4 plasmid after disinfection, including ultraviolet (UV) irradiation and low-level chlorine treatment, was investigated. It was found that both UV irradiation and low-level chlorine treatment reduced the conjugative gene transfer frequency. The transfer frequency gradually decreased from 2.75 × 10⁻³ to 2.44 × 10⁻⁵ after exposure to UV doses ranging from 5 to 20 mJ/cm². With higher UV doses of 50 and 100 mJ/cm², the transfer frequency was reduced to 1.77 × 10⁻⁶ and 2.44 × 10⁻⁸. The RP4 plasmid transfer frequency was not significantly affected by chlorine treatment at dosages ranging from 0.05 to 0.2 mg/l, but treatment with 0.3-0.5 mg/l chlorine reduced conjugative transfer to 4.40 × 10⁻⁵ or below the detection limit. The mechanisms underlying these phenomena were also explored, and the results demonstrated that UV irradiation and chlorine treatment (0.3 and 0.5 mg/l) significantly reduced the viability of bacteria, thereby lowering the conjugative transfer frequency. Although the lower chlorine concentrations tested (0.05-0.2 mg/l) were not sufficient to damage the cells, exposure to these concentrations may still depress the expression of a flagellar gene (FlgC), an outer membrane porin gene (ompF), and a DNA transport-related gene (TraG). Additionally, fewer pili were observed on the bacteria after chlorine treatment. These findings are important in assessing and controlling the risk of ARG transfer and dissemination in the drinking water system.
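
Using the transfer frequencies quoted in the abstract, the corresponding log reductions are easy to compute:

```python
# Log reductions implied by the transfer frequencies quoted above.
import math

baseline = 2.75e-3  # conjugative transfer frequency with no UV exposure
after_uv = {
    20: 2.44e-5,   # frequency reached across the 5-20 mJ/cm2 dose range
    50: 1.77e-6,
    100: 2.44e-8,
}

for dose, freq in after_uv.items():
    log_reduction = math.log10(baseline / freq)
    print(f"{dose} mJ/cm2: {log_reduction:.1f}-log reduction in transfer")
```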

Water Disinfection Byproducts Induce Antibiotic Resistance: Role of Environmental Pollutants in Resistance Phenomena

Li, D., et al., Environmental Science & Technology, 50(6):3193-3201, March 2016

The spread of antibiotic resistance represents a global threat to public health, and has traditionally been attributed to extensive antibiotic use in clinical and agricultural applications. As a result, researchers have mostly focused on clinically relevant high-level resistance enriched by antibiotics above the minimal inhibitory concentrations (MICs). Here, we report that two common water disinfection byproducts (chlorite and iodoacetic acid) had antibiotic-like effects that led to the evolution of resistant E. coli strains under both high (near-MIC) and low (sub-MIC) exposure concentrations. The subinhibitory concentrations of DBPs selected strains with resistance higher than that of strains evolved under above-MIC exposure concentrations. In addition, whole-genome analysis revealed distinct mutations in small sets of genes known to be involved in multiple-drug and drug-specific resistance, as well as in genes not yet identified as playing a role in antibiotic resistance. The number and identities of genetic mutations were distinct between the near-MIC and sub-MIC exposure scenarios. This study provides evidence and mechanistic insight into the sub-MIC selection of antibiotic resistance by antibiotic-like environmental pollutants such as disinfection byproducts in water, which may be important contributors to the spread of global antibiotic resistance. The results raise an intriguing and profound question about the role that the large amounts and variety of environmental contaminants play in selecting for and spreading antibiotic resistance in the environment.

Viral persistence in surface and drinking water: Suitability of PCR pre-treatment with intercalating dyes

Prevost, B., et al., Water Research, March 2016

After many outbreaks of enteric viruses associated with consumption of drinking water, the study of enteric viruses in water has increased significantly in recent years. In order to better understand the dynamics of enteric viruses in environmental water and the associated viral risk, it is necessary to estimate viral persistence under different conditions. In this study, two representative models of human enteric viruses, adenovirus 41 (AdV 41) and coxsackievirus B2 (CV-B2), were used to evaluate the persistence of enteric viruses in environmental water. The persistence of infectious particles, encapsidated genomes and free nucleic acids of AdV 41 and CV-B2 was evaluated in drinking water and surface water at different temperatures (4 °C, 20 °C and 37 °C). The infectivity of AdV 41 and CV-B2 persisted for at least 25 days, whatever the water temperature, and for more than 70 days at 4 °C and 20 °C, in both drinking and surface water. Encapsidated genomes persisted beyond 70 days, whatever the water temperature. Free nucleic acids (i.e. without capsid) were also able to persist for at least 16 days in drinking and surface water. The usefulness of a detection method based on an intercalating dye pre-treatment, which specifically targets preserved particles, was investigated for the discrimination of free and encapsidated genomes, and it was compared to virus infectivity. Further, the resistance of AdV 41 and CV-B2 to two major disinfection treatments applied in drinking water plants (UV and chlorination) was evaluated. Even after the application of UV and chlorine at high doses (400 mJ/cm² and 10 mg·min/L, respectively), viral genomes were still detected with molecular biology methods. Although the intercalating dye pre-treatment was of little use for detecting the effects of UV treatment, it was useful in the case of treatment by chlorination, with less than 1 log10 difference in the results compared to the infectivity measurements. Finally, for the first time, the suitability of intercalating dye pre-treatment for estimating the quality of the water produced by treatment plants was demonstrated using samples from four drinking-water plants and two rivers. Although 55% (27/49) of drinking water samples were positive for enteric viruses using molecular detection, none of the samples were positive when the intercalating dye pre-treatment method was used. This could indicate that the viruses detected were not infectious.

Using flow cytometry and Bacteroidales 16S rRNA markers to study the hygienic quality of source water

Baumgartner, A., et al., Journal für Verbraucherschutz und Lebensmittelsicherheit, 11(1):83-88, March 2016

Six source water fountains in the community of Berne, Switzerland were sampled monthly over a period of 1 year. The samples were tested for total counts by flow cytometry, and for fecal contamination using the Bacteroidales 16S rRNA markers HF183, BacR and AllBac. The total counts varied considerably between the different fountains, with a minimum of 5115 counts/L and a maximum of 198,508 counts/L. The long-term patterns of total counts over 1 year were typical for each fountain. Comparison of rainfall data with data for the non-specific fecal marker AllBac was shown to be a suitable approach for highlighting the vulnerability of sources to environmental influences. HF183, indicating contamination of human origin, occurred only sporadically and in insignificant amounts. Furthermore, as indicated by BacR, the studied fountains showed no evidence of contamination by ruminant feces. Further work is suggested to establish threshold values for molecular Bacteroidales markers, which could in future replace the currently used criteria for fecal indicator bacteria.

Contrasting regional and national mechanisms for predicting elevated arsenic in private wells across the United States using classification and regression trees

Frederick, L., et al., Water Research, March 2016

Arsenic contamination in groundwater is a public health and environmental concern in the United States (U.S.), particularly where monitoring is not required under the Safe Drinking Water Act. Previous studies suggest the influence of regional mechanisms for arsenic mobilization into groundwater; however, no study has examined how influencing parameters change at a continental scale spanning multiple regions. We herein examine covariates for groundwater in the western, central and eastern U.S. regions representing mechanisms associated with arsenic concentrations exceeding the U.S. Environmental Protection Agency maximum contaminant level (MCL) of 10 parts per billion (ppb). Statistically significant covariates were identified via classification and regression tree (CART) analysis, and included hydrometeorological and groundwater chemical parameters. The CART analyses were performed at two scales, national and regional, for which three physiographic regions located in the western (Payette Section and the Snake River Plain), central (Osage Plains of the Central Lowlands), and eastern (Embayed Section of the Coastal Plains) U.S. were examined. Validity of each of the three regional CART models was indicated by values >85% for the area under the receiver-operating characteristic curve. Aridity (precipitation minus potential evapotranspiration) was identified as the primary covariate associated with elevated arsenic at the national scale. At the regional scale, aridity and pH were the major covariates in the arid to semi-arid (western) region, whereas dissolved iron (taken to represent chemically reducing conditions) and pH were major covariates in the temperate (eastern) region, although additional important covariates emerged, including elevated phosphate. Analysis in the central U.S. region indicated that elevated arsenic concentrations were driven by a mixture of the mechanisms observed in the western and eastern regions.
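
For a sense of the classification-tree approach the study applies, here is a minimal scikit-learn sketch. The covariate names mirror the abstract (aridity, pH, dissolved iron), but the data rows, labels, and tree settings are invented placeholders, not the study's model.

```python
# Minimal sketch of a CART classifier for MCL exceedance, with an AUC
# check like the one reported. All data values are invented.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

# Columns: aridity (precip minus PET, mm), pH, dissolved Fe (mg/L)
X = np.array([
    [-500, 8.1, 0.02],
    [-300, 7.9, 0.01],
    [ 200, 7.0, 0.15],
    [ 150, 6.8, 0.90],
    [-450, 8.3, 0.05],
    [ 100, 7.2, 0.10],
])
y = np.array([1, 1, 0, 1, 1, 0])  # 1 = arsenic above the 10 ppb MCL

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
probs = tree.predict_proba(X)[:, 1]
print("training AUC:", round(roc_auc_score(y, probs), 2))
```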

Elevated Blood Lead Levels in Children Associated With the Flint Drinking Water Crisis: A Spatial Analysis of Risk and Public Health Response

Hanna-Attisha, M., et al., American Journal of Public Health, 106(2), February 2016

We analyzed differences in pediatric elevated blood lead level incidence before and after Flint, Michigan, introduced a more corrosive water source into an aging water system without adequate corrosion control. We reviewed blood lead levels for children younger than 5 years before (2013) and after (2015) the water source change in Greater Flint, Michigan. We assessed the percentage of elevated blood lead levels in both time periods, and identified geographical locations through spatial analysis. Incidence of elevated blood lead levels increased from 2.4% to 4.9% (p < .05) after the water source change, and neighborhoods with the highest water lead levels experienced a 6.6% increase. No significant change was seen outside the city. Geospatial analysis identified disadvantaged neighborhoods as having the greatest elevated blood lead level increases and informed response prioritization during the now-declared public health emergency. It was concluded that the percentage of children with elevated blood lead levels increased after the water source change, particularly in socioeconomically disadvantaged neighborhoods. Water is a growing source of childhood lead exposure because of aging infrastructure.
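
The significance test behind "2.4% to 4.9% (p < .05)" is a comparison of two proportions. A minimal sketch is below; the sample sizes are invented for illustration, and the study's actual counts are in the paper.

```python
# Sketch of a two-proportion z-test for a before/after change in incidence.
# Sample sizes are hypothetical placeholders.
import math

def two_proportion_z(p1, n1, p2, n2):
    """Z statistic for H0: p1 == p2, using the pooled proportion."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

z = two_proportion_z(p1=0.024, n1=1500, p2=0.049, n2=1500)  # hypothetical n
print(f"z = {z:.2f}")  # |z| > 1.96 corresponds to p < .05 (two-sided)
```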

DVC-FISH and PMA-qPCR techniques to assess the survival of Helicobacter pylori inside Acanthamoeba castellanii

Moreno-Mesonoro, L., et al., Research in Microbiology, 167(1):29-34, January 2016

Free-living amoebae (FLA) are ubiquitous microorganisms commonly found in water. They can act as Trojan horses for some amoeba-resistant bacteria (ARB). Helicobacter pylori is a pathogenic bacterium, suggested to be transmitted through water, which could belong to the ARB group. In this work, a co-culture assay of H. pylori and Acanthamoeba castellanii, one of the most common FLA, was carried out to identify the presence and survival of viable and potentially infective forms of the bacteria internalized by the amoeba. Molecular techniques including FISH, DVC-FISH, qPCR and PMA-qPCR were used to detect the presence of internalized and viable H. pylori. After 24 h in co-culture and a disinfection treatment to kill extra-amoebic bacteria, viable H. pylori cells were observed inside A. castellanii. When PMA-qPCR was applied to the co-culture samples, only DNA from internalized H. pylori cells was detected, whereas qPCR amplified total DNA from the sample. By the combined DVC-FISH method, the viability of H. pylori cells in A. castellanii was observed. Both techniques provided evidence, for the first time, that the pathogen is able to survive chlorination treatment when associated with A. castellanii, and could be very useful methods for further studies of environmental samples.

Variability in the chemistry of private drinking water supplies and the impact of domestic treatment systems on water quality

Ander, E.L., et al., Environmental Geochemistry and Health, 38(6):1313-1332, January 2016

Tap water from 497 properties using private water supplies, in an area of metalliferous and arsenic mineralisation (Cornwall, UK), was measured to assess the extent of compliance with chemical drinking water quality standards, and how this is influenced by householder water treatment decisions. The proportion of analyses exceeding water quality standards was high, with 65% of tap water samples exceeding one or more chemical standards. The highest exceedances for health-based standards were nitrate (11%) and arsenic (5%). Arsenic had a maximum observed concentration of 440 µg/L. Exceedances were also high for pH (47%), manganese (12%) and aluminium (7%), for which standards are set primarily on aesthetic grounds. However, the highest observed concentrations of manganese and aluminium also exceeded relevant health-based guidelines. Significant reductions in concentrations of aluminium, cadmium, copper, lead and/or nickel were found in tap waters where households were successfully treating low-pH groundwaters, and similar adventitious results were found for arsenic and nickel where treatment was installed for iron and/or manganese removal; successful treatment specifically intended to decrease tap water arsenic concentrations was observed at two properties where it was installed. However, 31% of samples where pH treatment was reported had pH < 6.5 (the minimum value in the drinking water regulations), suggesting widespread problems with system maintenance. Other examples of ineffectual treatment were seen in failed responses post-treatment, including for nitrate. This demonstrates that even where tap waters are considered to be treated, they may still fail one or more drinking water quality standards. We find that the degree of drinking water standard exceedance warrants further work to understand environmental controls and the locations of high concentrations. We also found that residents were more willing to accept drinking water with high metal (iron and manganese) concentrations than international guidelines assume. These findings point to the need for regulators to reinforce guidance on drinking water quality standards to private water supply users, and the benefits to long-term health of complying with these, even in areas where treated mains water is widely available.

Human exposure to thallium through tap water: A study from Valdicastello Carducci and Pietrasanta (northern Tuscany, Italy)

Campanella, B., et al., Science of the Total Environment, January 2016

A geological study revealed the presence of thallium (Tl) at concentrations of concern in groundwaters near Valdicastello Carducci (Tuscany, Italy). The source of contamination was identified as the Tl-bearing pyrite ores occurring in the abandoned mining sites of the area. The strongly acidic waters flowing in the mining tunnels can reach exceptional Tl concentrations, up to 9,000 μg/L. In September 2014, Tl contamination was also found in the tap water distributed in the same area (from 2 to 10 μg/L), and on October 3, 2014 the local authorities imposed a Do Not Drink order on the population. Here we report the results of the exposure study carried out from October 2014 to October 2015, aimed at quantifying Tl levels in 150 urine and 318 hair samples from the population of Valdicastello Carducci and Pietrasanta. Thallium was quantified by inductively coupled plasma mass spectrometry (ICP-MS). Urine and hair were chosen as model matrices indicative of different periods of exposure (short-term and long-term, respectively). Thallium values found in biological samples were correlated with Tl concentrations found in tap water in each citizen's living area and with his or her habits. The Tl concentration ranges found in hair and urine were 1-498 ng/g (values in unexposed subjects: 0.1-6 ng/g) and 0.046-5.44 μg/L (reference value for the European population: 0.006 μg/L), respectively. Results show that Tl levels in biological samples were significantly associated with residency in zones containing elevated water Tl levels. The kinetics of decay of Tl concentration in urine samples was also investigated. To the best of our knowledge, this is the first study on human contamination by Tl through water involving such a large number of samples.

Prevalence and characterization of extended-spectrum beta-lactamase-producing Enterobacteriaceae in spring waters

Li, S., et al., Letters in Applied Microbiology, 61(6):544-548, December 2015

The purpose of this study was to investigate the prevalence and characterization of extended-spectrum beta-lactamase (ESBL)-producing Enterobacteriaceae in spring waters from Mount Tai, China. ESBL-producing Enterobacteriaceae were found in four out of 50 sampled spring waters (4/50, 8.0%), and a total of 16 non-duplicate ESBL-producing isolates were obtained, including 13 Escherichia coli (E. coli) and three Klebsiella pneumoniae (Kl. pneumoniae). All 16 non-duplicate isolates harboured genes encoding CTX-M ESBLs, among which six expressed CTX-M-15, five produced CTX-M-14, three produced CTX-M-55 and two expressed CTX-M-27. Four multilocus sequence types (STs) were found, and ST131 was the dominant type (8/16, 50.0%). Taken together, ESBL-producing Enterobacteriaceae were present in the spring waters of Mount Tai. The results indicate that spring waters can become a reservoir of antibiotic-resistant bacteria and contribute to the spread of antimicrobial-resistant bacteria via drinking water or the food chain. In addition, wastewater discharge from restaurants or hotels may be an important source of antibiotic-resistant bacteria in spring waters.

Qualitative analysis of water quality deterioration and infection by Helicobacter pylori in a community with high risk of stomach cancer (Cauca, Colombia)

Acosta, C.P., et al., Salud Colectiva, 11(4):575-590, December 2015

This study looks at aspects of the environmental health of the rural population in Timbío (Cauca, Colombia) in relation to the deterioration of water quality. The information was obtained through participatory research methods exploring the management and use of water, the sources of pollution and the perception of water quality and its relation to Helicobacter pylori infection. The results are part of the qualitative analysis of a first research phase characterizing water and sanitation problems and their relation to emerging infectious diseases as well as possible solutions, which was carried out between November 2013 and August 2014. The results of this research are discussed from an ecosystemic approach to human health, recognizing the complexity of environmental conflicts related to water resources and their impacts on the health of populations. Through the methodology used, it is possible to detect and visualize the most urgent problems as well as frequent causes of contamination of water resources, so as to propose solutions within a joint agenda of multiple social actors.

Evaluation of alternative DNA extraction processes and real-time PCR for detecting Cryptosporidium parvum in drinking water

Kimble, G.H., Water Science and Technology: Water Supply, 15(6):1295-1303, December 2015

USEPA Method 1623 is the standard method in the United States for the detection of Cryptosporidium in water samples, but quantitative real-time polymerase chain reaction (qPCR) is an alternative technique that has been successfully used to detect Cryptosporidium in aqueous matrices. This study examined various modifications to a commercial nucleic acid extraction procedure in order to enhance PCR detection sensitivity for Cryptosporidium. An alternative DNA extraction buffer allowed qPCR detection at lower seed levels than the commercial extraction kit buffer. In addition, the use of a second spin-column cycle produced significantly better detection (P = 0.031), and the volume of Tris-EDTA buffer significantly affected crossing threshold values (P = 0.001). The improved extraction procedure was evaluated using 10-L tap water samples processed by ultrafiltration, centrifugation and immunomagnetic separation. Mean recovery for the sample processing method was determined to be 41% by microscopy and 49% by real-time PCR (P = 0.013). The results of this study demonstrate that real-time PCR can be an effective alternative for detecting and quantifying Cryptosporidium parvum in drinking water samples.
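
The recovery figures quoted here follow directly from seeded-sample counts: recovery is the number of organisms detected divided by the number seeded. A minimal sketch; the seed levels and counts below are invented for illustration, not the study's data.

```python
# Percent recovery for a seeded-matrix method evaluation:
#   recovery = (organisms detected / organisms seeded) * 100
# Seed levels and counts are hypothetical, for illustration only.

trials = [
    {"seeded": 100, "microscopy": 43, "qpcr": 51},
    {"seeded": 100, "microscopy": 39, "qpcr": 47},
]

for method in ("microscopy", "qpcr"):
    recoveries = [100.0 * t[method] / t["seeded"] for t in trials]
    mean = sum(recoveries) / len(recoveries)
    print(f"{method}: mean recovery {mean:.0f}%")
```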

Potential applications of next generation DNA sequencing of 16S rRNA gene amplicons in microbial water quality monitoring

Vierheilig, J., et al., Water Science and Technology, 72(11):1962-1972, December 2015

The applicability of next generation DNA sequencing (NGS) methods for water quality assessment has so far not been broadly investigated. This study set out to evaluate the potential of an NGS-based approach in a complex catchment with importance for drinking water abstraction. In this multi-compartment investigation, total bacterial communities in water, faeces, soil, and sediment samples were investigated by 454 pyrosequencing of bacterial 16S rRNA gene amplicons to assess the capabilities of this NGS method for (i) the development and evaluation of environmental molecular diagnostics, (ii) direct screening of the bulk bacterial communities, and (iii) the detection of faecal pollution in water. Results indicate that NGS methods can highlight potential target populations for diagnostics and will prove useful for the evaluation of existing, and the development of novel, DNA-based detection methods in the field of water microbiology. The approach unveiled dominant bacterial populations but failed to detect low-abundance populations such as faecal indicators in surface waters. In combination with metadata, NGS data will also allow the identification of drivers of bacterial community composition during water treatment and distribution, highlighting the power of this approach for monitoring bacterial regrowth and contamination in technical systems.

Impacts of hydraulic fracturing on water quality: a review of literature, regulatory frameworks and an analysis of information gaps

Gagnon, G.A., et al., Environmental Reviews, 24(2):122-131, November 2015

A review of available literature and current governance approaches related to the potential impacts of hydraulic fracturing on water quality (including drinking water) was developed. The paper identifies gaps in the literature and/or current governance approaches that should be addressed to guide decision-makers in the development of appropriate regulatory regimes that will enable assessment of the impacts of hydraulic fracturing on water quality. The lack of credible and comprehensive data is shown to have been a major setback to properly investigating and monitoring hydraulic fracturing activities and their potential risks to the environment and water quality. A review of current governance approaches demonstrates that some jurisdictions have implemented baseline and post-operation water quality monitoring requirements; however, there are large variations in site-specific monitoring requirements across Canada and the United States. In light of recent information, a targeted approach based on risk priorities is suggested, which can prioritize sample collection and frequency, target contaminants, and the needed duration of sampling. The steps outlined in this review help to address public concerns associated with water quality and ensure that public health is protected through appropriate water safety planning.

Estimating Potential Increased Bladder Cancer Risk Due to Increased Bromide Concentrations in Sources of Disinfected Drinking Waters

Regli, S., et al., Environmental Science & Technology, 49(22):13094-13102, November 2015

Public water systems are increasingly facing higher bromide levels in their source waters from anthropogenic contamination through coal-fired power plants, conventional oil and gas extraction, textile mills, and hydraulic fracturing. Climate change is likely to exacerbate this in coming years. We estimate bladder cancer risk from potential increased bromide levels in source waters of disinfecting public drinking water systems in the United States. Bladder cancer is the health end point used by the United States Environmental Protection Agency (EPA) in its benefits analysis for regulating disinfection byproducts in drinking water. We use estimated increases in the mass of the four regulated trihalomethane (THM4) concentrations (due to increased bromide incorporation) as the surrogate disinfection byproduct (DBP) occurrence metric for informing potential bladder cancer risk. We estimate potential increased excess lifetime bladder cancer risk as a function of increased source water bromide levels. Results based on data from 201 drinking water treatment plants indicate that a bromide increase of 50 μg/L could result in a potential increase of between 10⁻⁴ and 10⁻³ in excess lifetime bladder cancer risk in populations served by roughly 90% of these plants.
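
To put the reported range in concrete terms: an excess lifetime risk scales linearly with the exposed population. A back-of-envelope sketch; the risk bounds come from the abstract, while the service population is an arbitrary example.

```python
# Expected excess lifetime bladder cancer cases = risk * exposed population.
# The 1e-4 .. 1e-3 range is the abstract's estimate for a 50 ug/L bromide
# increase; the service population below is an arbitrary example.

population_served = 100_000
for risk in (1e-4, 1e-3):
    print(f"risk {risk:.0e}: ~{risk * population_served:.0f} "
          f"excess lifetime cases per {population_served:,} people")
```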

Solar Disinfection of Viruses in Polyethylene Terephthalate Bottles

Carratala, A., et al., Applied and Environmental Microbiology, 82(1):279-288, October 2015

Solar disinfection (SODIS) of drinking water in polyethylene terephthalate (PET) bottles is a simple, efficient point-of-use technique for the inactivation of many bacterial pathogens. In contrast, the efficiency of SODIS against viruses is not well known. In this work, we studied the inactivation of bacteriophages (MS2 and ϕX174) and human viruses (echovirus 11 and adenovirus type 2) by SODIS. We conducted experiments in PET bottles exposed to (simulated) sunlight at different temperatures (15, 22, 26, and 40°C) and in water sources of diverse compositions and origins (India and Switzerland). Good inactivation of MS2 (>6-log inactivation after exposure to a total fluence of 1.34 kJ/cm²) was achieved in Swiss tap water at 22°C, while less-efficient inactivation was observed in Indian waters and for echovirus (1.5-log inactivation at the same fluence). The DNA viruses studied, ϕX174 and adenovirus, were resistant to SODIS, and the inactivation observed was equivalent to that occurring in the dark. High temperatures enhanced MS2 inactivation substantially; at 40°C, 3-log inactivation was achieved in Swiss tap water after exposure to a fluence of only 0.18 kJ/cm². Overall, our findings demonstrate that SODIS may reduce the load of single-stranded RNA (ssRNA) viruses, such as echoviruses, particularly at high temperatures and in photoreactive matrices. In contrast, complementary measures may be needed to ensure efficient inactivation during SODIS of DNA viruses resistant to oxidation.
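
A common way to read such data is a log-linear (Chick-Watson-style) model, log₁₀ reduction = k × fluence, so the fluence needed for a target reduction is target/k. The sketch below fits that one-parameter model to the two MS2 data points quoted in the abstract, treating each temperature independently; this is a simplification, not the study's own kinetic analysis.

```python
# Fit a one-parameter log-linear inactivation model per temperature:
#   log10_reduction = k * fluence   (fluence in kJ/cm^2)
# Data points are the MS2 values quoted in the abstract; ">6-log" is
# treated as exactly 6 here, which understates the true rate.

points = {  # condition -> (fluence kJ/cm^2, log10 reduction)
    "22C Swiss tap water": (1.34, 6.0),
    "40C Swiss tap water": (0.18, 3.0),
}

for label, (fluence, log_red) in points.items():
    k = log_red / fluence                      # log10 per kJ/cm^2
    needed = 4.0 / k                           # fluence for a 4-log target
    print(f"{label}: k ~ {k:.1f} log10/(kJ/cm^2); "
          f"4-log target needs ~{needed:.2f} kJ/cm^2")
```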

Regulation of non-relevant metabolites of plant protection products in drinking and groundwater in the EU: Current status and way forward

Laabs, V., et al., Regulatory Toxicology and Pharmacology, 73(1):276-286, October 2015

Non-relevant metabolites are defined in the EU regulation for plant protection product authorization, and a detailed definition of non-relevant metabolites is given in an EU Commission DG Sanco (now DG SANTE, Health and Food Safety) guidance document. However, in water legislation at EU and member state level, non-relevant metabolites of pesticides are either not specifically regulated or diverse threshold values are applied. Based on their inherent properties, non-relevant metabolites should be regulated based on substance-specific and toxicity-based limit values in drinking and groundwater, like other anthropogenic chemicals. Yet, if a general limit value for non-relevant metabolites in drinking and groundwater is favored, an application of the Threshold of Toxicological Concern (TTC) concept for Cramer class III compounds leads to a threshold value of 4.5 μg/L. This general value is shown to be protective for non-relevant metabolites, based on individual drinking water limit values derived for a set of 56 non-relevant metabolites. A consistent definition of non-relevant metabolites of plant protection products, as well as their uniform regulation in drinking and groundwater in the EU, is important to achieve legal clarity for all stakeholders and to establish planning security for the development of plant protection products for the European market.
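
The 4.5 μg/L figure can be reproduced from TTC conventions. The sketch below assumes the usual Cramer class III TTC of 90 μg per person per day, a default 2 L/day drinking water consumption, and a 10% allocation of the threshold to drinking water; these defaults are stated here as assumptions, since the abstract does not spell out the derivation.

```python
# Threshold of Toxicological Concern (TTC) back-calculation for water:
#   limit (ug/L) = TTC (ug/person/day) * allocation / consumption (L/day)
# Assumed conventional defaults: Cramer class III TTC of 90 ug/person/day,
# 10% allocation to drinking water, 2 L of water consumed per day.

ttc_ug_per_day = 90.0
allocation = 0.10
water_l_per_day = 2.0

limit_ug_per_l = ttc_ug_per_day * allocation / water_l_per_day
print(f"General limit: {limit_ug_per_l:.1f} ug/L")   # 4.5 ug/L
```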

Incidence of waterborne lead in private drinking water systems in Virginia

Pieper, K.J., et al., Journal of Water and Health, 13(3):897-908, September 2015

Although recent studies suggest contamination by bacteria and nitrate in private drinking water systems is of increasing concern, data describing contaminants associated with the corrosion of onsite plumbing are scarce. This study reports on the analysis of 2,146 samples submitted by private system homeowners. Almost 20% of first draw samples submitted contained lead concentrations above the United States Environmental Protection Agency action level of 15 μg/L, suggesting that corrosion may be a significant public health problem. Correlations between lead, copper, and zinc suggested brass components as a likely lead source, and dug/bored wells had significantly higher lead concentrations as compared to drilled wells. A random subset of samples selected to quantify particulate lead indicated that, on average, 47% of lead in the first draws was in the particulate form, although the occurrence was highly variable. While flushing the tap reduced lead below 15 μg/L for most systems, some systems experienced an increase, perhaps attributable to particulate lead or lead-bearing components upstream of the faucet (e.g., valves, pumps). Results suggest that without including a focus on private as well as municipal systems it will be very difficult to meet the existing national public health goal to eliminate elevated blood lead levels in children.

Presence of antibiotic resistant bacteria and antibiotic resistance genes in raw source water and treated drinking water

Bergeron, S., et al., International Biodeterioration & Biodegradation, 102:370-374, August 2015

Antibiotic resistance is becoming a very large problem throughout the world, and the spread of antibiotic-resistant bacteria (ARB) and antibiotic resistance genes (ARGs) in the environment is a major public health issue. Aquatic ecosystems are a significant source of ARB and ARGs. Drinking water treatment systems are designed specifically to eliminate bacteria and pathogens from drinking water, so the presence of ARB and ARGs in source water and drinking water may affect public health and is an emerging issue in the drinking water industry. This study was therefore conducted to examine the presence of ARB and ARGs in the source water, treated drinking water (finished water), and distribution line (tap water) of a rural water treatment plant in Louisiana. The results showed the presence of several ARB in the source water, including Enterobacter cloacae, Klebsiella pneumoniae, Escherichia coli, Pseudomonas, Enterococcus, Staphylococcus and Bacillus spp. However, the water treatment plant effectively removed these bacteria, as none were found in the finished water at the plant or in the tap water. Bacterial DNA, including 16S rRNA genes and ARGs for sulfonamide and tetracycline antibiotics, was observed in raw water. 16S rRNA genes were detected consistently in every month of sampling in raw water, finished water, and tap water, suggesting that the filtration system at the treatment plant was ineffective in filtering out small fragments of bacterial DNA. The possibility also exists that biofilms are present in the water pipeline, which may develop antibiotic resistance due to the selective pressure of chlorination in drinking water.

Surveillance of perchlorate in ground water, surface water and bottled water in Kerala, India

Nadaraja, A.V., et al., Journal of Environmental Health Science and Engineering, July 2015

Perchlorate is an emerging water contaminant that disrupts the normal functioning of the human thyroid gland and poses a serious threat to health, especially for pregnant women, fetuses and children. This study reports high levels of perchlorate contamination in freshwater sources near places where ammonium perchlorate (a rocket fuel) is handled in bulk. Of 160 groundwater samples analyzed from 27 locations in the state of Kerala, 58% had perchlorate above the detection limit (2 μg/L), and the highest concentration observed was 7,270 μg/L in Ernakulam district, a value ~480 times higher than the USEPA drinking water equivalent level (15 μg/L). Perchlorate was detected in all surface water samples analyzed (n = 10), with the highest value, 355 μg/L, observed in the Periyar river (a major river in the state). The bottled drinking waters tested (n = 5) were free of perchlorate. The present study underlines the need for frequent screening of water sources for perchlorate contamination around places where the chemical is handled in bulk, which will help to avoid human exposure to high levels of perchlorate.
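
The "~480 times" figure is simple arithmetic against the drinking water equivalent level, using only the two values quoted above; a one-line check:

```python
# Ratio of the highest observed perchlorate concentration to the USEPA
# drinking water equivalent level (both values taken from the abstract).
highest_ug_l = 7270.0
dwel_ug_l = 15.0
print(f"{highest_ug_l / dwel_ug_l:.0f}x the DWEL")  # ~485, i.e. ~480 times
```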

Waterborne outbreaks in the Nordic countries, 1998 to 2012

Guzman-Herrador, B., et al., Eurosurveillance, 20(24), June 2015

A total of 175 waterborne outbreaks affecting 85,995 individuals were notified to the national outbreak surveillance systems in Denmark, Finland and Norway from 1998 to 2012, and in Sweden from 1998 to 2011. Between 4 and 18 outbreaks were reported each year during this period. Outbreaks occurred throughout the countries in all seasons, but were most common (n = 75/169, 44%) between June and August. Viruses belonging to the Caliciviridae family and Campylobacter were the pathogens most frequently involved, comprising n = 51 (41%) and n = 36 (29%), respectively, of the 123 outbreaks with known aetiology. Although only a few outbreaks were caused by parasites (Giardia and/or Cryptosporidium), they accounted for the largest outbreaks reported during the study period, affecting up to 53,000 persons. Most outbreaks with a known water source (124/163, 76%) were linked to groundwater. A large proportion of the outbreaks (n = 130/170, 76%) affected a small number of people (fewer than 100 per outbreak) and were linked to single-household water supplies. However, in 11 (6%) of the outbreaks, more than 1,000 people became ill. Although outbreaks of this size are rare, they highlight the need for increased awareness, particularly of parasites, correct water treatment regimens, and vigilant management and maintenance of water supply and distribution systems.

Microbial Health Risks of Regulated Drinking Waters in the United States — A Comparative Microbial Safety Assessment of Public Water Supplies and Bottled Water

Edberg, S.C., Topics in Public Health, June 2015

The quality of drinking water in the United States (U.S.) is extensively monitored and regulated by federal, state and local agencies, yet there is increasing public concern and confusion about the safety and quality of drinking water, both from public water systems and from bottled water products. In the U.S., tap water and bottled water are regulated by two different agencies: the Environmental Protection Agency (EPA) regulates public water system water (tap water) and the Food and Drug Administration (FDA) regulates bottled water. Federal law requires that the FDA's regulations for bottled water must be at least as protective of public health as EPA standards for tap water.

Performance Evaluation of an Italian Reference Method, the ISO Reference Method and a Chromogenic Rapid Method for the Detection of E. coli and Coliforms in Bottled Water

Di Pasquale, S. and De Medici, D., Food Analytical Methods, 8(10):2417-2426, April 2015

Bottled water can be contaminated by coliforms and/or Escherichia coli (E. coli). These bacteria are considered indicators of faecal pollution, and their detection in bottled water indicates potential contamination by pathogenic enteric microorganisms. In recent decades, different methods have been developed for the detection of coliforms and E. coli in drinking water and in bottled water, including mineral water. Since 1976, Italian regulation has defined microbiological methods to evaluate the microbiological characteristics of mineral waters. Three different methods for the detection of coliforms and E. coli in bottled water were compared in this study: the Italian reference method, according to the "Italian Ministerial Rule"; the ISO 9308-1:2002 method; and a new rapid chromogenic method. The results demonstrated that the ISO 9308-1:2002 method and the new rapid method are as sensitive and specific as the Italian reference method, and that both could be used to evaluate the contamination level of coliforms and E. coli in drinking water and in bottled water, including mineral water.

Microbial diversity and dynamics of a groundwater and a still bottled natural mineral water

Franca, L., et al., Environmental Microbiology, 17(3):577-593, March 2015

The microbial abundance and diversity at source, after bottling, and through 6 months of storage of a commercial still natural mineral water were assessed by culture-dependent and culture-independent methods. The results revealed clear shifts in the dominant communities present at the three different stages. The borehole waters displayed low cell densities that increased 1.5-fold upon bottling and storage, reaching a maximum (6.2 × 10⁸ cells L⁻¹) within 15 days after bottling, but experienced a significant decrease in diversity. In all cases, communities were largely dominated by Bacteria. The culturable heterotrophic community was characterized by recovering 3,626 isolates, which were primarily affiliated with the Alphaproteobacteria, Betaproteobacteria and Gammaproteobacteria. This study indicates that bottling and storage induce quantitative and qualitative changes in the microbial assemblages, and these changes appear similar in the two sample batches collected in 2 consecutive years. To our knowledge, this is the first study combining culture-independent and culture-dependent methods with repeated tests to reveal the microbial dynamics occurring from source to stored bottled water.

Molecular detection of Helicobacter pylori in a large Mediterranean river by direct viable count fluorescent in situ hybridization (DVC-FISH)

Tirodimos, L., et al., Journal of Water and Health, 12(4):868-873, December 2014

Although the precise route and mode of transmission of Helicobacter pylori are still unclear, molecular methods have been applied for the detection of H. pylori in environmental samples. In this study, we used the direct viable count fluorescent in situ hybridization (DVC-FISH) method to detect viable cells of H. pylori in the River Aliakmon, Greece, the longest river in the country and a source of potable water for metropolitan areas. H. pylori was detected in 23 of 48 water samples (47.9%); no seasonal variation was found, and no correlation was observed between the presence of H. pylori and indicators of fecal contamination. Our findings strengthen the evidence that H. pylori is waterborne, and its presence adds to the potential health hazards of the River Aliakmon.

Chromium in drinking water: association with biomarkers of exposure and effect

Sazakli, E., et al., International Journal of Environmental Research and Public Health, 11(10):10125-10145, October 2014

An epidemiological cross-sectional study was conducted in Greece to investigate health outcomes associated with long-term exposure to chromium via drinking water. The study population consisted of 304 participants. Socio-demographics, lifestyle, drinking water intake, dietary habits, and occupational and medical history data were recorded through a personal interview. A physical examination and a motor test were carried out on the individuals. Total chromium concentrations were measured in blood and hair of the study subjects, and hematological, biochemical and inflammatory parameters were determined in blood. Chromium in drinking water ranged from <0.5 to 90 μg/L in all samples but one (220 μg/L), with a median concentration of 21.2 μg/L. Chromium levels in blood (median 0.32 μg/L, range <0.18-0.92 μg/L) and hair (median 0.22 μg/g, range 0.03-1.26 μg/g) were within the "normal range" according to the literature. Personal lifetime chromium exposure dose via drinking water, calculated from the results of the water analyses and the questionnaire data, showed associations with blood and hair chromium levels and with certain hematological and biochemical parameters. Having hematological or biochemical parameters outside the normal range was not correlated with chromium exposure dose, except in the groups of subjects with high triglycerides or low sodium. Motor impairment score was not associated with exposure to chromium.

Naegleria fowleri: An emerging drinking water pathogen

Bartrand, T., et al., American Water Works Association Journal, 106(10):418-432, October 2014

Naegleria fowleri (N. fowleri) is a free-living, trophic amoeba that is nearly ubiquitous in the environment and can be present in high numbers in warm waters. It is the causative agent of primary amoebic meningoencephalitis (PAM), a rare but particularly lethal disease with a very low survival rate. Although N. fowleri was isolated from drinking water supplies in Australia in the 1980s, it was not considered a drinking water threat in the United States until recent cases were associated with a groundwater system in Arizona and surface water systems in Louisiana. N. fowleri in drinking water treatment and distribution systems can be managed using disinfectant concentrations typically encountered in well-run plants, although nitrification and attendant low disinfectant residuals may pose a challenge for some systems. The greatest challenge for N. fowleri control is in premise plumbing systems, where conditions are largely outside the control of utilities, residuals might be low or nonexistent, and water temperatures could be high enough to support rapid growth of the amoebae. This article reviews published studies describing the environmental occurrence, survival, pathogenicity, and disinfection of N. fowleri. In addition, it provides information about this little-known and poorly understood organism with respect to its occurrence in the environment; how the amoeba amplifies in water systems such that it can cause infection; how N. fowleri has been successfully controlled for decades in Australian water systems through treatment and distribution system management; and the knowledge gaps and information needed to address N. fowleri as an emerging pathogen in US water supplies.

Emerging Trends in Groundwater Pollution and Quality

Kurwadkar, S., Water Environment Research, 86(10):1677-1691, October 2014

Groundwater pollution due to anthropogenic activities may impact overall groundwater quality. Organic and inorganic pollutants have routinely been detected at unsafe levels in groundwater, rendering this important drinking water resource practically unusable. The vulnerability of groundwater to pollution and the subsequent impacts have been documented in various studies across the globe. Field studies as well as mathematical models have demonstrated increasing levels of pollutants in both shallow and deep aquifer systems. New emerging pollutants, such as organic micro-pollutants, have also been detected in industrialized as well as developing countries. Increased vulnerability, coupled with ever-growing demand for groundwater, may pose a greater threat of pollution due to induced recharge and a lack of environmental safeguards to protect groundwater sources. This review paper documents a comprehensive assessment of groundwater quality impacts due to human activities, such as improper management of organic and inorganic waste, and due to natural sources. A detailed review of published reports and peer-reviewed journal papers from across the world clearly demonstrates that groundwater quality is declining over time. A proactive approach is needed to prevent human health and ecological consequences due to ingestion of contaminated groundwater.

Evaluation of long-term (1960-2010) groundwater fluoride contamination in Texas

Chaudhuri, S. and Ale, S., Journal of Environmental Quality, 43(4):1404-1416, August 2014

Groundwater quality degradation is a major threat to sustainable development in Texas. The aim of this study was to elucidate spatiotemporal patterns of groundwater fluoride (F) contamination in different water use classes in 16 groundwater management areas in Texas between 1960 and 2010. Groundwater F concentration data were obtained from the Texas Water Development Board and aggregated over a decadal scale. Our results indicate that observations exceeding the World Health Organization drinking water quality threshold (1.5 mg F L⁻¹) and the USEPA secondary maximum contaminant level (SMCL) (2 mg F L⁻¹) increased from 26 and 19% in the 1960s to 37 and 23%, respectively, in the 2000s. In the 2000s, F observations > SMCL among different water use classes followed the order: irrigation (39%) > domestic (20%) > public supply (17%). The extent and mode of interaction between F and other water quality parameters varied regionally. In western Texas, high F concentrations were prevalent at shallower depths (<50 m) and were positively correlated with bicarbonate (HCO₃⁻) and sulfate anions. In contrast, in southern and southeastern Texas, higher F concentrations occurred at greater depths (>50 m) and were correlated with HCO₃⁻ and chloride anions. A spatial pattern has become apparent, marked by "excess" F in western Texas groundwaters as compared with "inadequate" F contents in the rest of the state. Groundwater F contamination in western Texas was largely influenced by groundwater mixing and evaporative enrichment, as compared with water-rock interaction and mineral dissolution in the rest of the state.

Ground water contamination with ²³⁸U, ²³⁴U, ²³⁵U, ²²⁶Ra and ²¹⁰Pb from past uranium mining: Cove Wash, Arizona

da Cunha, K.M.D., et al., Environmental Geochemistry and Health, 36(3):477-487, June 2014

The objectives of the study are to present a critical review of the ²³⁸U, ²³⁴U, ²³⁵U, ²²⁶Ra and ²¹⁰Pb levels in water samples from the EPA studies (U.S. EPA Region 9 Superfund Program screening assessment reports on abandoned uranium mines and the Navajo Nation, 2004 and 2006, and the 5-year plan on health and environmental impacts of uranium contamination, 2008) and a dose assessment for the population due to ingestion of water containing ²³⁸U and ²³⁴U. The water quality data were taken from the "Data analysis" section of the published report titled Abandoned Uranium Mines Project Arizona, New Mexico, Utah-Navajo Lands 1994-2000, Project Atlas. Total uranium concentration was above the maximum contaminant level for drinking water (7.4 × 10⁻¹ Bq/L) in 19% of the water samples, while ²³⁸U and ²³⁴U concentrations were above it in 14 and 17% of the water samples, respectively. ²²⁶Ra and ²¹⁰Pb concentrations in water samples were in the ranges of 3.7 × 10⁻¹ to 5.55 × 10² Bq/L and 1.11 to 4.33 × 10² Bq/L, respectively. For only two samples did the ²²⁶Ra concentrations exceed the MCL for total Ra in drinking water (0.185 Bq/L). However, the ²¹⁰Pb/²²⁶Ra ratios varied from 0.11 to 47.00, and ratios above 1.00 were observed in 71% of the samples. Secular equilibrium of the natural uranium series was not observed in the data record for most of the water samples. Moreover, the ²³⁵U/total U mass ratios ranged from 0.06 to 5.9%, and the natural mass ratio of ²³⁵U to total U (0.72%) was observed in only 16% of the water samples; ratios above or below the natural ratio could not be explained based on the data reported by U.S. EPA. In addition, statistical evaluations showed no correlations among the distributions of the radionuclide concentrations in the majority of the water samples, indicating that more than one source of contamination could contribute to the sampled sources. The effective doses due to ingestion of even the minimum uranium concentrations in the water samples exceed the average dose from inhalation and ingestion of a regular diet for other populations around the world (1 μSv/year). The maximum doses due to ingestion of ²³⁸U or ²³⁴U were above the international limit on effective dose for members of the public (1 mSv/year), except for inhabitants of two chapters. The highest effective dose was estimated for inhabitants of Cove, at almost 20 times the international limit for members of the public. These results indicate that ingestion of water from some of the sampled sources poses health risks.
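
The committed effective dose from drinking contaminated water is typically estimated as concentration × annual intake × an ingestion dose coefficient. The sketch below uses ICRP-style adult ingestion dose coefficients for ²³⁸U and ²³⁴U (about 4.5 × 10⁻⁸ and 4.9 × 10⁻⁸ Sv/Bq); the water concentrations and the 2 L/day intake are illustrative assumptions, not values from the study.

```python
# Committed effective dose from ingesting uranium in drinking water:
#   dose (Sv/yr) = C (Bq/L) * intake (L/day) * 365 * dose coefficient (Sv/Bq)
# Dose coefficients are ICRP-style adult ingestion values; the water
# concentrations and the 2 L/day intake are illustrative assumptions.

DOSE_COEFF_SV_PER_BQ = {"U-238": 4.5e-8, "U-234": 4.9e-8}

conc_bq_per_l = {"U-238": 0.74, "U-234": 0.80}   # hypothetical sample
intake_l_per_day = 2.0

total_sv = sum(conc_bq_per_l[n] * intake_l_per_day * 365 * DOSE_COEFF_SV_PER_BQ[n]
               for n in conc_bq_per_l)
print(f"Committed effective dose: {total_sv * 1e3:.2f} mSv/year "
      f"(public dose limit: 1 mSv/year)")
```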

Contamination of Groundwater Systems in the US and Canada by Enteric Pathogens, 1990–2013: A Review and Pooled-Analysis

Hynds, P.D., Thomas, M.K., Pintar, K.D.M., PLOS ONE, 9(5):e93301, May 2014

A combined review and pooled-analysis approach was used to investigate groundwater contamination in Canada and the US from 1990 to 2013; fifty-five studies met eligibility criteria. Four study types were identified. It was found that study location affects study design, sample rate and studied pathogen category. Approximately 15% (316/2210) of samples from Canadian and US groundwater sources were positive for enteric pathogens, with no difference observed based on system type. Knowledge gaps exist, particularly in exposure assessment for attributing disease to groundwater supplies. Furthermore, there is a lack of consistency in risk factor reporting (local hydrogeology, well type, well use, etc.). The widespread use of fecal indicator organisms in reported studies does not inform the assessment of human health risks associated with groundwater supplies. This review illustrates how groundwater study design and location are critical for subsequent data interpretation and use. Knowledge gaps exist related to data on bacterial, viral and protozoan pathogen prevalence in Canadian and US groundwater systems, as well as a need for standardized approaches for reporting study design and results. Fecal indicators are examined as a surrogate for health risk assessments; caution is advised in their widespread use. Study findings may be useful during suspected waterborne outbreaks linked with a groundwater supply to identify the likely etiological agent and potential transport pathway.

Large Outbreak of Cryptosporidium hominis Infection Transmitted through the Public Water Supply, Sweden

Widerström, M., et al., Emerging Infectious Diseases, 20(4), April 2014

In November 2010, ≈27,000 (≈45%) inhabitants of Östersund, Sweden, were affected by a waterborne outbreak of cryptosporidiosis. The outbreak was characterized by a rapid onset and high attack rate, especially among young and middle-aged persons. Young age, number of infected family members, amount of water consumed daily, and gluten intolerance were identified as risk factors for acquiring cryptosporidiosis. Also, chronic intestinal disease and young age were significantly associated with prolonged diarrhea. Identification of Cryptosporidium hominis subtype IbA10G2 in human and environmental samples and consistently low numbers of oocysts in drinking water confirmed insufficient reduction of parasites by the municipal water treatment plant. The current outbreak shows that use of inadequate microbial barriers at water treatment plants can have serious consequences for public health. This risk can be minimized by optimizing control of raw water quality and employing multiple barriers that remove or inactivate all groups of pathogens.

Microbial Contamination Detection in Water Resources: Interest of Current Optical Methods, Trends and Needs in the Context of Climate Change

Jung, A.V., et al., International Journal of Environmental Research and Public Health, 11(4):4292-4310, April 2014

Microbial pollution in aquatic environments is one of the crucial issues with regard to the sanitary state of water bodies used for drinking water supply, recreational activities and harvesting seafood, due to potential contamination by pathogenic bacteria, protozoa or viruses. To address this risk, microbial contamination monitoring is usually assessed by turbidity measurements performed at drinking water plants. Some recent studies have shown significant correlations between microbial contamination and the risk of endemic gastroenteritis. However, the relevance of turbidimetry may be limited, since the presence of colloids in water creates interference with the nephelometric response. Thus there is a need for a more relevant, simple and fast indicator for detecting microbial contamination in water, especially in the perspective of climate change and the expected increase in heavy rainfall events. This review focuses, on the one hand, on the sources, fate and behavior of microorganisms in water and the factors influencing pathogens' presence, transport and mobilization, and, on the other hand, on the existing optical methods used for monitoring microbiological risks. Finally, this paper proposes new avenues of research.

Assessing Exposure and Health Consequences of Chemicals in Drinking Water: Current State of Knowledge and Research Needs

Villanueva, C.M., et al., Environmental Health Perspectives, 122(3):213-221, March 2014

Safe drinking water is essential for well-being. Although microbiological contamination remains the largest cause of water-related morbidity and mortality globally, chemicals in water supplies may also cause disease, and evidence of the human health consequences is limited or lacking for many of them. We aimed to summarize the state of knowledge, identify gaps in understanding, and provide recommendations for epidemiological research relating to chemicals occurring in drinking water. Assessing exposure and the health consequences of chemicals in drinking water is challenging. Exposures are typically at low concentrations, measurements in water are frequently insufficient, chemicals are present in mixtures, exposure periods are usually long, multiple exposure routes may be involved, and valid biomarkers reflecting the relevant exposure period are scarce. In addition, the magnitude of the relative risks tends to be small. Research should include well-designed epidemiological studies covering regions with contrasting contaminant levels and sufficient sample size; comprehensive evaluation of contaminant occurrence in combination with bioassays integrating the effect of complex mixtures; sufficient numbers of measurements in water to evaluate geographical and temporal variability; detailed information on personal habits resulting in exposure (e.g., ingestion, showering, swimming, diet); collection of biological samples to measure relevant biomarkers; and advanced statistical models to estimate exposure and relative risks, considering methods to address measurement error. Last, the incorporation of molecular markers of early biological effects and genetic susceptibility is essential to understand the mechanisms of action. There is a particular knowledge gap and need to evaluate human exposure and the risks of a wide range of emerging contaminants.

Spatial analysis of boil water advisories issued during an extreme weather event in the Hudson River Watershed, USA

Vedachalam, S., et al., Applied Geography, 48:112-121, March 2014

Water infrastructure in the United States is aging and vulnerable to extreme weather. In August 2011, Tropical Storm Irene hit eastern New York and surrounding states, causing great damage to public drinking water systems. Several water supply districts issued boil water advisories (BWAs) to their customers as a result of the storm. This study seeks to identify the major factors that lead water supply systems to issue BWAs by assessing watershed characteristics, water supply system characteristics and treatment plant parameters of water districts in the Mohawk-Hudson River watershed in New York. A logistic regression model suggests that the probability of a BWA being issued by a water supply district is increased by higher precipitation during the storm, a high density of septic systems, a lack of recent maintenance, and low population density. Interviews with water treatment plant operators suggested that physical damage to water distribution systems was the main cause of boil water advisories during storms. BWAs result in additional costs to residents and communities, and public compliance with advisory instructions is low, so efforts must be made to minimize their occurrence. Prior investment in infrastructure management can proactively address municipal water supply and quality issues.

Epidemiology and estimated costs of a large waterborne outbreak of norovirus infection in Sweden

Larsson, C., et al., Epidemiology and Infection, 142(3):592-600, March 2014

A large outbreak of norovirus (NoV) gastroenteritis caused by contaminated municipal drinking water occurred in Lilla Edet, Sweden, in 2008. Epidemiological investigations performed using a questionnaire survey showed an association between consumption of municipal drinking water and illness (odds ratio 4.73, 95% confidence interval 3.53-6.32), and a strong correlation between the risk of being sick and the number of glasses of municipal water consumed. Diverse NoV strains were detected in stool samples from patients, with NoV genotype I strains predominating. Although NoVs were not detected in water samples, coliphages were identified as a marker of viral contamination. About 2,400 (18.5%) of the 13,000 inhabitants of Lilla Edet became ill. Costs associated with the outbreak were collected via a questionnaire survey given to organizations and municipalities involved in or affected by the outbreak. Total costs, including sick leave, were estimated at ≈8,700,000 Swedish kronor (≈€0.87 million).
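
An odds ratio with its confidence interval of this kind comes from a standard 2×2 analysis: OR = (a·d)/(b·c), with a Wald 95% CI of exp(ln OR ± 1.96·SE), where SE = √(1/a + 1/b + 1/c + 1/d). A sketch with invented cell counts, not the outbreak's questionnaire data:

```python
import math

# Odds ratio with a Wald 95% confidence interval from a 2x2 table:
#              ill   not ill
#   exposed     a       b
#   unexposed   c       d
# Counts below are invented for illustration.

a, b, c, d = 220, 380, 45, 355

odds_ratio = (a * d) / (b * c)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```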

Methyl Tertiary Butyl Ether (MTBE) and Other Volatile Organic Compounds (VOCs) in Public Water Systems, Private Wells, and Ambient Groundwater Wells in New Jersey Compared to Regulatory and Human-Health Benchmarks

Williams, P.R.D., Environmental Forensics, 15(1), February 2014

Potential threats to drinking water and water quality continue to be a major concern in many regions of the United States. New Jersey, in particular, has been at the forefront of assessing and managing potential contamination of its drinking water supplies from hazardous substances. The purpose of the current analysis is to provide an up-to-date evaluation of the occurrence and detected concentrations of methyl tertiary butyl ether (MTBE) and several other volatile organic compounds (VOCs) in public water systems, private wells, and ambient groundwater wells in New Jersey based on the best available data, and to put these results into context with federal and state regulatory and human-health benchmarks. Analyses are based on the following three databases that contain water quality monitoring data for New Jersey: Safe Drinking Water Information System (SDWIS), Private Well Testing Act (PWTA), and National Water Information System (NWIS). For public water systems served by groundwater in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 30 (2%), 21 (1.4%), and 5 (0.3%) of sampled systems from 1997 to 2011, respectively. For private wells in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 385 (0.5%), 183 (0.2%), and 46 (0.05%) of sampled wells from 2001 to 2011, respectively. For ambient groundwater wells in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 14 (2.1%), 9 (1.3%), and 4 (0.6%) of sampled wells from 1993 to 2012, respectively. Average detected concentrations of MTBE, as well as detected concentrations at upper-end percentiles, were less than corresponding benchmarks for all three datasets. The available data show that MTBE is rarely detected in various source waters in New Jersey at a concentration that exceeds the State's health-based drinking water standard or other published benchmarks, and there is no evidence of an increasing trend in the detection frequency of MTBE. Other VOCs, such as tetrachloroethylene (PCE), trichloroethylene (TCE), and benzene, are detected more often above corresponding regulatory or human-health benchmarks due to their higher detected concentrations in water and/or greater toxicity values. The current analysis provides useful data for evaluating the nature and extent of historical and current contamination of water supplies in New Jersey and potential opportunities for public exposures and health risks due to MTBE and other VOCs on a statewide basis. Additional forensic or forecasting analyses are required to identify the sources or timing of releases of individual contaminants at specific locations or to predict potential future water contamination in New Jersey.

Widespread Molecular Detection of Legionella pneumophila Serogroup 1 in Cold Water Taps across the United States

Donohue, M.J., Environmental Science and Technology, 48(6):3145-3152, February 2014

In the United States, 6,868 cases of legionellosis were reported to the Centers for Disease Control and Prevention in 2009-2010. Of these reports, it is estimated that 84% were caused by the microorganism Legionella pneumophila serogroup (Sg) 1. Legionella spp. have been isolated and recovered from a variety of natural freshwater environments. Human exposure to L. pneumophila Sg1 may occur through aerosolization and subsequent inhalation of household and facility water. In this study, two primer/probe sets (one able to detect L. pneumophila and the other L. pneumophila Sg1) were determined to be highly sensitive and selective for their respective targets. In total, 272 water samples, collected in 2009 and 2010 from 68 public and private water taps across the United States, were analyzed using the two qPCR assays to evaluate the incidence of L. pneumophila Sg1. Nearly half of the taps showed the presence of L. pneumophila Sg1 in one sampling event, and 16% of taps were positive in more than one sampling event. This study is the first United States survey to document the occurrence and colonization of L. pneumophila Sg1 in cold water delivered from point-of-use taps.
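
The two headline statistics, positive in at least one sampling event versus positive in more than one, reduce to counting detections per tap across repeat visits. A minimal sketch; the per-tap results dictionary is invented for illustration.

```python
# Summarize qPCR detections per tap across repeat sampling events:
# "ever positive" = detected in >= 1 event; "repeat positive" = in > 1 event.
# The results dictionary is invented for illustration.

results = {           # tap id -> detection outcome per sampling event
    "tap01": [True, False, True, False],
    "tap02": [False, False, False, False],
    "tap03": [True, False, False, False],
}

ever = sum(any(events) for events in results.values())
repeat = sum(sum(events) > 1 for events in results.values())
n = len(results)
print(f"ever positive: {ever}/{n} ({100*ever/n:.0f}%); "
      f"repeat positive: {repeat}/{n} ({100*repeat/n:.0f}%)")
```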

Perspectives on drinking water monitoring for small scale water systems

Roig, B., Baures, E., Thomas, O., Water Science & Technology: Water Supply, 14(1):1, January 2014

Drinking water (DW) is increasingly subject to environmental and human threats that alter the quality of the resource and, potentially, of the distributed water. These threats can be both biological and chemical in nature, and are often combined. The expansion of the technical framework for water quality monitoring, following the evolution of water quality standards, generally guarantees regulatory compliance, but it is not sufficient for assessing the performance of small scale water systems. Existing monitoring is not well suited to ensuring good quality of distributed water, especially in the event of a sudden change in quality. This article examines current monitoring practices and proposes alternative solutions, in a bid to limit the risk of deterioration of DW quality.

Drinking Water Microbial Myths

Martin, J.A., et al., Critical Reviews in Microbiology, November 2013

Accounts of drinking water-borne disease outbreaks have always captured the interest of the public, elected and health officials, and the media. During the twentieth century, the drinking water community and public health organizations endeavored to craft regulations and guidelines on treatment and management practices that reduce risks from drinking water, specifically human pathogens. During this period there also evolved misunderstandings as to the potential health risks associated with microorganisms that may be present in drinking waters. These misunderstandings, or "myths", have led to confusion among the many stakeholders. The purpose of this article is to provide a scientifically and clinically based discussion of these "myths" and recommendations for better ensuring the microbial safety of drinking water and valid public health decisions.

Assessing the impact of chlorinated-solvent sites on metropolitan groundwater resources

Brusseau, M.L. and Narter, M., Ground Water, November 2013

Chlorinated-solvent compounds are among the most common groundwater contaminants in the United States. A majority of the many sites contaminated by chlorinated-solvent compounds are located in metropolitan areas, and most such areas have one or more chlorinated-solvent contaminated sites. Thus, contamination of groundwater by chlorinated-solvent compounds may pose a potential risk to the sustainability of potable water supplies for many metropolitan areas. The impact of chlorinated-solvent sites on metropolitan water resources was assessed for Tucson, Arizona, by comparing the aggregate volume of extracted groundwater for all pump-and-treat systems associated with contaminated sites in the region to the total regional groundwater withdrawal. The analysis revealed that the aggregate volume of groundwater withdrawn for the pump-and-treat systems operating in Tucson, all of which are located at chlorinated-solvent contaminated sites, was 20% of the total groundwater withdrawal in the city for the study period. The treated groundwater was used primarily for direct delivery to local water supply systems or for reinjection as part of the pump-and-treat system. The volume of the treated groundwater used for potable water represented approximately 13% of the total potable water supply sourced from groundwater, and approximately 6% of the total potable water supply. This case study illustrates the significant impact chlorinated-solvent contaminated sites can have on groundwater resources and regional potable water supplies.

Radon-contaminated drinking water from private wells: an environmental health assessment examining a rural Colorado mountain community’s exposure

Cappello, M.A., et al., Journal of Environmental Health, November 2013

In the study discussed in this article, 27 private drinking water wells located in a rural Colorado mountain community were sampled for radon contamination and compared against (a) the U.S. Environmental Protection Agency's (U.S. EPA's) proposed maximum contaminant level (MCL), (b) the U.S. EPA's proposed alternative maximum contaminant level (AMCL), and (c) the average radon level measured in the local municipal drinking water system. The data from the authors' study showed that 100% of the wells within the study population had radon levels in excess of the U.S. EPA MCL, 37% were in excess of the U.S. EPA AMCL, and 100% of wells had radon levels greater than that found in the local municipal drinking water system. Radon contamination in one well was found to be 715 times greater than the U.S. EPA MCL, 54 times greater than the U.S. EPA AMCL, and 36,983 times greater than the level found in the local municipal drinking water system. According to the research data and the reviewed literature, the results indicate that this population has a unique and elevated contamination profile, and suggest that radon-contaminated drinking water from private wells can present a significant public health concern.
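
The multiples quoted above are simple ratios against the benchmarks. The sketch below assumes U.S. EPA's proposed radon MCL of 300 pCi/L and AMCL of 4,000 pCi/L (the benchmark values are stated here as assumptions, since the abstract does not give them), with a well concentration back-calculated for illustration.

```python
# Express a well's radon concentration as multiples of the benchmarks.
# Assumed benchmarks: proposed MCL 300 pCi/L, proposed AMCL 4,000 pCi/L.
# The well value is illustrative, roughly consistent with "715x the MCL".

well_pci_l = 214_500.0
mcl, amcl = 300.0, 4000.0
print(f"{well_pci_l / mcl:.0f}x the proposed MCL, "
      f"{well_pci_l / amcl:.0f}x the proposed AMCL")
```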

Microbial Health Risks of Regulated Drinking Water in the United States

Edberg, S.C., DWRF, September 2013

Drinking water regulations are designed to protect the public health. In the United States, the Environmental Protection Agency (EPA) is tasked with developing and maintaining drinking water regulations for the 276,607,387 people served by the country’s 54,293 community water systems. The Food and Drug Administration (FDA) regulates bottled water as a food product. By federal law, the FDA’s regulations for bottled water must be at least as protective of public health as the EPA’s regulations for public water system drinking water. Despite many similarities in EPA and FDA regulations, consumer perception regarding the safety of drinking waters varies widely. This paper examines and compares the microbial health risks of tap water and bottled water, specifically examining differences in quality monitoring, regulatory standards violations, advisories, and distribution system conditions. It also includes comparison data on the number of waterborne illness outbreaks caused by both tap and bottled water. Based on a review of existing research, it is clear that as a consequence of the differences in regulations, distribution systems, operating (manufacturing) practices, and microbial standards of quality, public drinking water supplies present a substantially higher human risk than do bottled waters for illness due to waterborne organisms.

The mineral content of tap water in United States households

Patterson, K.Y., et al., Journal of Food Composition and Analysis, August 2013

The composition of tap water contributes to dietary intake of minerals. The Nutrient Data Laboratory (NDL) of the United States Department of Agriculture (USDA) conducted a study of the mineral content of residential tap water to generate current data for the USDA National Nutrient Database. Sodium, potassium, calcium, magnesium, iron, copper, manganese, phosphorus, and zinc content of drinking water were determined in a nationally representative sampling. The statistically designed sampling method identified 144 locations for water collection in winter and spring from home taps. Assuming a daily consumption of 1 L of tap water, only four minerals (Cu, Ca, Mg, and Na), on average, provided more than 1% of the US dietary reference intake. Significant decreases in calcium were observed with chemical water softeners, and significant differences in Mg and Ca were observed between the seasonal collections. The variance of sodium was significantly different among regions (p < 0.05), but no differences were observed as a result of collection time, water source, or treatment. Based on the weighted mixed model results, there were no significant differences in overall mineral content between municipal and well water. These results, which constitute a nationally representative dataset of mineral values for drinking water available from home taps, provide valuable additional information for assessment of dietary mineral intake.
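The percent-of-DRI arithmetic behind the “1% of the dietary reference intake” statement is straightforward to reproduce. A minimal sketch, with illustrative concentrations and adult DRI values rather than the study’s measurements:

```python
# Illustrative tap water concentrations (mg/L) and adult dietary reference
# intakes (mg/day); placeholders, not the study's reported values.
tap_mg_per_L = {"Ca": 30.0, "Mg": 8.0, "Na": 25.0, "Cu": 0.1}
dri_mg_per_day = {"Ca": 1000.0, "Mg": 400.0, "Na": 1500.0, "Cu": 0.9}

LITERS_PER_DAY = 1.0  # the study's assumed daily tap water consumption

for mineral, conc in tap_mg_per_L.items():
    pct_dri = 100.0 * conc * LITERS_PER_DAY / dri_mg_per_day[mineral]
    print(f"{mineral}: {pct_dri:.1f}% of DRI from 1 L/day")
```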

Quantitative analysis of microbial contamination in private drinking water supply systems

Allevi, R.P., et al., Journal of Water and Health, June 2013

Over one million households rely on private water supplies (e.g., well, spring, cistern) in the Commonwealth of Virginia, USA. The present study tested 538 private wells and springs in 20 Virginia counties for total coliforms (TCs) and Escherichia coli along with a suite of chemical contaminants. A logistic regression analysis was used to investigate potential correlations between TC contamination and chemical parameters (e.g., NO3−, turbidity), as well as homeowner-provided survey data describing system characteristics and perceived water quality. Of the 538 samples collected, 41% (n = 221) were positive for TCs and 10% (n = 53) for E. coli. Chemical parameters were not statistically predictive of microbial contamination. Well depth, water treatment, and farm location proximate to the water supply were factors in a regression model that predicted presence/absence of TCs with 74% accuracy. Microbial and chemical source tracking techniques (Bacteroides gene Bac32F and HF183 detection via polymerase chain reaction and optical brightener detection via fluorometry) identified four samples as likely contaminated with human wastewater.
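The presence/absence model described above is a standard logistic regression. A minimal sketch of that approach on synthetic data (the feature names follow the abstract, but the data and coefficients are invented for illustration, so the printed accuracy is not the study’s):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 538  # sample size matching the study; the data themselves are synthetic

# Features named after the predictors the study reports: well depth (m),
# any on-site water treatment (0/1), farm near the water supply (0/1).
depth = rng.uniform(5, 150, n)
treatment = rng.integers(0, 2, n)
farm_nearby = rng.integers(0, 2, n)

# Synthetic outcome: shallower, untreated, farm-adjacent supplies more often positive.
logit = -0.5 - 0.02 * depth - 0.8 * treatment + 1.0 * farm_nearby
tc_positive = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([depth, treatment, farm_nearby])
X_tr, X_te, y_tr, y_te = train_test_split(X, tc_positive, random_state=0)

model = LogisticRegression().fit(X_tr, y_tr)
print(f"Hold-out accuracy: {model.score(X_te, y_te):.0%}")  # the study reports 74%
```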

Strontium Concentrations in Corrosion Products from Residential Drinking Water Distribution Systems

Gerke, et al., Environmental Science and Technology, April 2013

The United States Environmental Protection Agency (US EPA) will require some U.S. drinking water distribution systems (DWDS) to monitor nonradioactive strontium (Sr2+) in drinking water in 2013. Iron corrosion products from four DWDS were examined to assess the potential for Sr2+ binding and release. Average Sr2+ concentrations in the outermost layer of the corrosion products ranged from 3 to 54 mg kg–1 and the Sr2+ drinking water concentrations were all ≤0.3 mg L–1. Micro-X-ray absorption near edge structure spectroscopy and linear combination fitting determined that Sr2+ was principally associated with CaCO3. Sr2+ was also detected as a surface complex associated with α-FeOOH. Iron particulates deposited on a filter inside a home had an average Sr2+ concentration of 40.3 mg kg–1 and the associated drinking water at a tap was 210 μg L–1. The data suggest that elevated Sr2+ concentrations may be associated with iron corrosion products that, if disturbed, could increase Sr2+ concentrations above the 0.3 μg L–1 US EPA reporting threshold. Disassociation of very small particulates could result in drinking water Sr2+ concentrations that exceed the US EPA health reference limit (4.20 mg kg–1 body weight).

Evaluating violations of drinking water regulations

Rubin, S.J., Journal, American Water Works Association, March 2013

US Environmental Protection Agency data were analyzed for violations by community water systems (CWSs). Several characteristics were evaluated, including size, source water, and violation type. The data show that: (1) 55% of CWSs, collectively serving more than 95 million people, violated at least one regulation under the Safe Drinking Water Act; (2) the presence of violations was no different for groundwater and surface water systems; (3) fewer than 20% of CWSs with violations exceeded an allowable level of a contaminant in drinking water; (4) smaller water systems are no more likely than larger systems, except very large systems, to violate health-related requirements; and (5) smaller CWSs appear more likely than larger systems to violate monitoring, reporting, and notification requirements. An evaluation was also conducted of four contaminants that had health-related violations by more than 1% of CWSs: total coliform, stage 1 disinfection by-products, arsenic, and lead and copper.

Lead (Pb) quantification in potable water samples: implications for regulatory compliance and assessment of human exposure

Triantafyllidou, S., et al., Environmental Monitoring and Assessment, February 2013

Assessing the health risk from lead (Pb) in potable water requires accurate quantification of the Pb concentration. Under worst-case scenarios of highly contaminated water samples, representative of public health concerns, up to 71–98% of the total Pb was not quantified if water samples were not mixed thoroughly after standard preservation (i.e., addition of 0.15% (v/v) HNO3). Thorough mixing after standard preservation improved recovery in all samples, but 35–81% of the total Pb was still unquantified in some samples. Transfer of samples from one bottle to another also created high errors (40–100% of the total Pb was unquantified in transferred samples). Although the United States Environmental Protection Agency’s standard protocol avoids most of these errors, certain methods considered EPA-equivalent allow these errors for regulatory compliance sampling. Moreover, routine monitoring for assessment of human Pb exposure in the USA has no standardized protocols for water sample handling and pre-treatment. Overall, while there is no reason to believe that sample handling and pre-treatment dramatically skew regulatory compliance with the US Pb action level, slight variations from one approved protocol to another may cause Pb-in-water health risks to be significantly underestimated, especially for unusual situations of “worst case” individual exposure to highly contaminated water.

The need for congressional action to finance arsenic reductions in drinking water

Levine, R.L., Journal of Environmental Health, November 2012

Many public water systems in the U.S. are unsafe because the communities cannot afford to comply with the current 10 parts per billion (ppb) federal arsenic standard for drinking water. Communities unable to afford improvements remain vulnerable to adverse health effects associated with higher levels of arsenic exposure. Scientific and bipartisan political consensus exists that the arsenic standard should not be less stringent than 10 ppb, and new data suggest additional adverse health effects related to arsenic exposure through drinking water. Congress has failed to reauthorize the Drinking Water State Revolving Fund program to provide reliable funding to promote compliance and reduce the risk of adverse health effects. Congress’s recent ad hoc appropriations do not allow long-term planning and ongoing monitoring and maintenance. Investing in water infrastructure will lower health care costs and create American jobs. Delaying necessary upgrades will only increase the costs of improvements over time.

Direct healthcare costs of selected diseases primarily or partially transmitted by water

Collier, S.A., et al., Epidemiology and Infection, November 2012

Despite US sanitation advancements, millions of waterborne disease cases occur annually, although the precise burden of disease is not well quantified. Estimating the direct healthcare cost of specific infections would be useful in prioritizing waterborne disease prevention activities. Hospitalization and outpatient visit costs per case and total US hospitalization costs for ten waterborne diseases were calculated using large healthcare claims and hospital discharge databases. The five primarily waterborne diseases in this analysis (giardiasis, cryptosporidiosis, Legionnaires’ disease, otitis externa, and non-tuberculous mycobacterial infection) were responsible for over 40,000 hospitalizations at a cost of $970 million per year, including at least $430 million in hospitalization costs for Medicaid and Medicare patients. An additional 50,000 hospitalizations for campylobacteriosis, salmonellosis, shigellosis, haemolytic uraemic syndrome, and toxoplasmosis cost $860 million annually ($390 million in payments for Medicaid and Medicare patients), a portion of which can be assumed to be due to waterborne transmission.

Arcobacter in Lake Erie Beach Waters: an Emerging Gastrointestinal Pathogen Linked with Human-Associated Fecal Contamination

Lee, C., et al., Applied and Environmental Microbiology, September 2012

The genus Arcobacter has been associated with human illness and fecal contamination by humans and animals. To better characterize the health risk posed by this emerging waterborne pathogen, we investigated the occurrence of Arcobacter spp. in Lake Erie beach waters. During the summer of 2010, water samples were collected 35 times from the Euclid, Villa Angela, and Headlands (East and West) beaches, located along Ohio’s Lake Erie coast. After sample concentration, Arcobacter was quantified by real-time PCR targeting the Arcobacter 23S rRNA gene. Other fecal genetic markers (Bacteroides 16S rRNA gene [HuBac], Escherichia coli uidA gene, Enterococcus 23S rRNA gene, and tetracycline resistance genes) were also assessed. Arcobacter was detected frequently at all beaches, and both the occurrence and densities of Arcobacter spp. were higher at the Euclid and Villa Angela beaches (with higher levels of fecal contamination) than at the East and West Headlands beaches. The Arcobacter density in Lake Erie beach water was significantly correlated with the human-specific fecal marker HuBac according to Spearman’s correlation analysis (r = 0.592; P < 0.001). Phylogenetic analysis demonstrated that most of the identified Arcobacter sequences were closely related to Arcobacter cryaerophilus, which is known to cause gastrointestinal diseases in humans. Since human-pathogenic Arcobacter spp. are linked to human-associated fecal sources, it is important to identify and manage the human-associated contamination sources for the prevention of Arcobacter-associated public health risks at Lake Erie beaches.
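The reported association (r = 0.592; P < 0.001) is a Spearman rank correlation, which is simple to compute; a minimal sketch on synthetic marker data (the densities below are invented, so the printed statistics only approximate the study’s):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 35  # number of sampling events in the study; the data below are synthetic

# Synthetic log10 densities for Arcobacter and the human-specific HuBac marker,
# constructed to co-vary the way the study reports.
hubac = rng.normal(3.0, 1.0, n)
arcobacter = 0.6 * hubac + rng.normal(0.0, 0.8, n)

rho, p = spearmanr(arcobacter, hubac)
print(f"Spearman r = {rho:.3f}, P = {p:.5f}")  # study: r = 0.592, P < 0.001
```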

The Quality of Drinking Water in North Carolina Farmworker Camps

Bischoff, W.E., MD, PhD, et al., American Journal of Public Health, August 2012

The purpose of this study was to assess water quality in migrant farmworker camps in North Carolina and determine associations of water quality with migrant farmworker housing characteristics. Researchers collected data from 181 farmworker camps in eastern North Carolina during the 2010 agricultural season. Water samples were tested using the Total Coliform Rule (TCR) and housing characteristics were assessed using North Carolina Department of Labor standards. A total of 61 (34%) of 181 camps failed the TCR. Total coliform bacteria were found in all 61 camps, with Escherichia coli also being detected in 2. Water quality was not associated with farmworker housing characteristics or with access to registered public water supplies. Multiple official violations of water quality standards had been reported for the registered public water supplies. The authors concluded that water supplied to farmworker camps often does not comply with current standards and poses a serious risk to the health of farmworkers and surrounding communities. Expansion of water monitoring to more camps and regulatory changes, such as testing during occupancy and stronger enforcement, are needed to secure water safety.

Chemical mixtures in untreated water from public-supply wells in the U.S. — Occurrence, composition, and potential toxicity

Toccalino, P.L., Norman, J.E., Scott, J.C., Science of The Total Environment, August 2012

Chemical mixtures are prevalent in groundwater used for public water supply, but little is known about their potential health effects. As part of a large-scale ambient groundwater study, we evaluated chemical mixtures across multiple chemical classes, and included more chemical contaminants than in previous studies of mixtures in public-supply wells. We (1) assessed the occurrence of chemical mixtures in untreated source-water samples from public-supply wells, (2) determined the composition of the most frequently occurring mixtures, and (3) characterized the potential toxicity of mixtures using a new screening approach. The U.S. Geological Survey collected one untreated water sample from each of 383 public wells distributed across 35 states, and analyzed the samples for as many as 91 chemical contaminants. Concentrations of mixture components were compared to individual human-health benchmarks; the potential toxicity of mixtures was characterized by addition of benchmark-normalized component concentrations. Most samples (84%) contained mixtures of two or more contaminants, each at concentrations greater than one-tenth of individual benchmarks. The chemical mixtures that most frequently occurred and had the greatest potential toxicity primarily were composed of trace elements (including arsenic, strontium, or uranium), radon, or nitrate. Herbicides, disinfection by-products, and solvents were the most common organic contaminants in mixtures. The sum of benchmark-normalized concentrations was greater than 1 for 58% of samples, suggesting that there could be potential for mixtures toxicity in more than half of the public-well samples. Our findings can be used to help set priorities for groundwater monitoring and suggest future research directions for drinking-water treatment studies and for toxicity assessments of chemical mixtures in water resources.
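The screening metric described here is a hazard-index-style sum: each component’s concentration is divided by its human-health benchmark, components above one-tenth of their benchmark define the mixture, and the normalized values are added. A minimal sketch under that reading of the abstract, with invented concentrations and benchmarks:

```python
# Benchmark-normalized screening of one sample, following the additive approach
# described in the abstract. Concentrations and benchmarks are illustrative only.
sample = {            # contaminant: (concentration, human-health benchmark), same units
    "arsenic": (4.0, 10.0),    # ug/L
    "uranium": (12.0, 30.0),   # ug/L
    "nitrate": (6.0, 10.0),    # mg/L as N
    "radon":   (150.0, 300.0), # pCi/L
}

# A contaminant "counts" toward a mixture if it exceeds 1/10 of its benchmark.
components = {k: c / b for k, (c, b) in sample.items() if c / b > 0.1}
hazard_sum = sum(components.values())

print(f"Mixture components: {sorted(components)}")
print(f"Sum of benchmark-normalized concentrations: {hazard_sum:.2f}")
print("Potential mixtures toxicity" if hazard_sum > 1 else "Below screening level")
```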

Risk of Viral Acute Gastrointestinal Illness from Nondisinfected Drinking Water Distribution Systems

Lambertini, E., et al., Environmental Science and Technology, July 2012

Acute gastrointestinal illness (AGI) resulting from pathogens directly entering the piping of drinking water distribution systems is insufficiently understood. Here, we estimate AGI incidence from virus intrusions into the distribution systems of 14 nondisinfecting, groundwater-source, community water systems. Water samples for virus quantification were collected monthly at wells and households during four 12-week periods in 2006–2007. Ultraviolet (UV) disinfection was installed on the communities’ wellheads during one study year; UV was absent the other year. UV was intended to eliminate virus contributions from the wells and without residual disinfectant present in these systems, any increase in virus concentration downstream at household taps represented virus contributions from the distribution system (Approach 1). During no-UV periods, distribution system viruses were estimated by the difference between well water and household tap virus concentrations (Approach 2). For both approaches, a Monte Carlo risk assessment framework was used to estimate AGI risk from distribution systems using study-specific exposure–response relationships. Depending on the exposure–response relationship selected, AGI risk from the distribution systems was 0.0180–0.0661 and 0.001–0.1047 episodes/person-year estimated by Approaches 1 and 2, respectively. These values represented 0.1–4.9% of AGI risk from all exposure routes, and 1.6–67.8% of risk related to drinking water exposure. Virus intrusions into nondisinfected drinking water distribution systems can contribute to sporadic AGI.
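The Monte Carlo framework can be illustrated with a simplified single-pathway version: sample a virus concentration at the tap, sample daily consumption, convert dose to infection probability with an exponential dose-response, and annualize. All distributions and parameters below are assumptions for illustration, not the study’s fitted inputs:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo iterations

# Assumed input distributions -- placeholders, not the study's calibrated model.
conc = rng.lognormal(mean=np.log(0.01), sigma=1.5, size=N)   # virus genomes/L at the tap
volume = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=N)  # L/day consumed
r = 0.02                                                     # assumed exponential dose-response parameter

daily_dose = conc * volume
p_inf_daily = 1.0 - np.exp(-r * daily_dose)     # exponential dose-response
p_inf_year = 1.0 - (1.0 - p_inf_daily) ** 365   # annualized per-person risk

print(f"Mean annual infection risk: {p_inf_year.mean():.4f} per person-year")
```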

Methodological Aspects of Fluid Intake Records and Surveys

Vergne, S., PhD, Nutrition Today, July/August 2012

Assessing the fluid intake level of different populations has, to date, attracted very little interest. The comparison of existing data based on food surveys reveals notable differences between countries and within different surveys in a single country. Methodological issues seem to account to a large extent for these differences. Recent studies conducted using specifically designed diaries to record fluid and water intake over a 7-day period tend to give more accurate results. These recent studies could potentially lead to the revision of the values of adequate intakes of water in numerous countries.

Screening-Level Risk Assessment of Coxiella burnetii (Q Fever) Transmission via Aeration of Drinking Water

Sales-Ortells, H., Medema, G., Environmental Science and Technology, April 2012

A screening-level risk assessment of Q fever transmission through drinking water produced from groundwater in the vicinity of infected goat barnyards that employed aeration of the water was performed. Quantitative data from scientific literature were collected and a Quantitative Microbial Risk Assessment approach was followed. An exposure model was developed to calculate the dose to which consumers of aerated groundwater are exposed through aerosol inhalation during showering. The exposure assessment and hazard characterization were integrated in a screening-level risk characterization using a dose-response model for inhalation to determine the risk of Q fever through tap water. A nominal range sensitivity analysis was performed. The estimated risk of disease was lower than 10⁻⁴ per person per year (pppy); hence, the risk of transmission of C. burnetii through inhalation of drinking water aerosols is very low. The sensitivity analysis shows that the most uncertain parameters are the aeration process, the transport of C. burnetii in bioaerosols via the air, the aerosolization of C. burnetii in the shower, and the air filtration efficiency. The risk was compared to direct airborne exposure of persons in the vicinity of infected goat farms; the relative risk of exposure through inhalation of drinking water aerosols was 0.002%.
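The 10⁻⁴ pppy comparison rests on the same QMRA chain: inhaled dose per event, per-event infection probability, annualized risk. A minimal sketch of that chain with placeholder parameters (the exposure values and the dose-response parameter are assumptions, not the study’s):

```python
import math

# Placeholder inputs for a single daily shower event (assumptions for illustration).
c_air = 1e-9      # C. burnetii per litre of shower air after aerosolization
breathing = 10.0  # litres of air inhaled per minute
minutes = 8.0     # shower duration

dose = c_air * breathing * minutes      # organisms inhaled per event
r = 0.9                                 # assumed exponential dose-response parameter
p_event = 1.0 - math.exp(-r * dose)     # per-event infection probability
p_year = 1.0 - (1.0 - p_event) ** 365   # one shower per day, annualized

print(f"Annual risk: {p_year:.1e} pppy (screening benchmark: 1e-4)")
```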

Waterborne Pathogens: Emerging Issues in Monitoring, Treatment and Control

Reynolds, K.A., MSPH, Ph.D., Water Conditioning & Purification, March 2012

Microbial threats to water quality continue to emerge; however, technologies for monitoring, treating and controlling emerging waterborne pathogens are also evolving. Understanding the range of factors that lead to the contamination of water is important for developing appropriate tools to manage human health risks.

Health Risks of Limited-Contact Water Recreation

Dorevitch, S., et al., Environmental Health Perspectives, February 2012

Wastewater-impacted waters that do not support swimming are often used for boating, canoeing, fishing, kayaking, and rowing. Little is known about the health risks of these limited-contact water recreation activities. We evaluated the incidence of illness, severity of illness, associations between water exposure and illness, and risk of illness attributable to limited-contact water recreation on waters dominated by wastewater effluent and on waters approved for general use recreation (such as swimming). The Chicago Health, Environmental Exposure, and Recreation Study was a prospective cohort study that evaluated five health outcomes among three groups of people: those who engaged in limited-contact water recreation on effluent-dominated waters, those who engaged in limited-contact recreation on general-use waters, and those who engaged in non–water recreation. Data analysis included survival analysis, logistic regression, and estimates of risk for counterfactual exposure scenarios using G-computation. Telephone follow-up data were available for 11,297 participants. With non–water recreation as the reference group, we found that limited-contact water recreation was associated with the development of acute gastrointestinal illness in the first 3 days after water recreation at both effluent-dominated waters [adjusted odds ratio (AOR) 1.46; 95% confidence interval (CI): 1.08, 1.96] and general-use waters (1.50; 95% CI: 1.09, 2.07). For every 1,000 recreators, 13.7 (95% CI: 3.1, 24.9) and 15.1 (95% CI: 2.6, 25.7) cases of gastrointestinal illness were attributable to limited-contact recreation at effluent-dominated waters and general-use waters, respectively. Eye symptoms were associated with use of effluent-dominated waters only (AOR 1.50; 95% CI: 1.10, 2.06). Among water recreators, our results indicate that illness was associated with the amount of water exposure. Limited-contact recreation, both on effluent-dominated waters and on waters designated for general use, was associated with an elevated risk of gastrointestinal illness.

Planning for Sustainability: A Handbook for Water and Wastewater Utilities

U.S. Environmental Protection Agency, February 2012

This handbook is intended to provide information about how to enhance current planning processes by building in sustainability considerations. It is designed to be useful for various types and scales of planning efforts, such as: long-range integrated water resource planning; strategic planning; capital planning; system-wide planning to meet regulatory requirements (e.g., combined sewer overflow upgrades and new stormwater permitting requirements); and specific infrastructure project planning (e.g., for repair, rehabilitation, or replacement of specific infrastructure).

Atrazine Exposure in Public Drinking Water and Preterm Birth

Rinsky, J.L., et al., Public Health Reports, January/February 2012

Approximately 13% of all births occur prior to 37 weeks gestation in the U.S. Some established risk factors exist for preterm birth, but the etiology remains largely unknown. Recent studies have suggested an association with environmental exposures. We examined the relationship between preterm birth and exposure to a commonly used herbicide, atrazine, in drinking water. We reviewed Kentucky birth certificate data for 2004-2006 to collect duration of pregnancy and other individual-level covariates. We assessed existing data sources for atrazine levels in public drinking water for the years 2000-2008, classifying maternal county of residence into three atrazine exposure groups. We used logistic regression to analyze the relationship between atrazine exposure and preterm birth, controlling for maternal age, race/ethnicity, education, smoking, and prenatal care. An increase in the odds of preterm birth was found for women residing in the counties included in the highest atrazine exposure group compared with women residing in counties in the lowest exposure group, while controlling for covariates. Analyses using the three exposure assessment approaches produced odds ratios ranging from 1.20 (95% confidence interval [CI] 1.14, 1.27) to 1.26 (95% CI 1.19, 1.32), for the highest compared with the lowest exposure group. Suboptimal characterization of environmental exposure and variables of interest limited the analytical options of this study. Still, our findings suggest a positive association between atrazine and preterm birth, and illustrate the need for an improved assessment of environmental exposures to accurately address this important public health issue.

Source Water Protection Vision and Roadmap

Water Research Foundation, January 2012

In 2007, a group of source water protection experts met, under the auspices of the Water Research Foundation and the Water Environment Research Foundation, to develop a research agenda that would ultimately provide information to help drinking water suppliers design and implement effective source water protection programs. A key result of that effort identified the need for a national vision and roadmap that would guide U.S. water utilities and supporting groups with a unified strategy for coherent, consistent, cost-effective, and socially acceptable source water protection programs. This brief document presents the vision and roadmap and focuses on how to move forward on source water protection. The roadmap is intended to serve as a feasible, focused path toward promoting source water protection for U.S. drinking water utilities. It is not intended to serve as an official directive, but rather is a collection of observations and recommendations organized to form a path to achieving the vision. The companion document Developing a Vision and Roadmap for Drinking Water Source Protection comprehensively covers the project team’s findings regarding the various building blocks to make source water protection a reality. That document includes an annotated bibliography of source water protection resources, a summation of a literature review, and helpful water utility case studies. Both documents are meant to be used in concert to help water utilities move forward with their source water protection efforts and proactively improve and/or maintain the quality of their drinking water sources.

Source water protection has been discussed and promoted in an ad hoc fashion by different organizations at the national, regional, state, and local levels. It is essential to increase the awareness of source water protection at the national level. Education of decision makers, utility managers, stakeholders, and the general public should be the first step in moving source water protection up a path to success. Leadership is needed to make this a national priority. In order to ensure the various actions recommended in the roadmap can be carried out, it is recommended that both a top-down and a bottom-up approach be taken. A top-down approach would establish a flexible framework to guide local entities (e.g., water systems, watershed organizations, and regional planning agencies) to work together to protect source water. Due to the variability of source waters and the areas from which they are derived, along with technical, social, political, financial, and regulatory differences across jurisdictions, it is unlikely that two source water protection programs would be the same. A bottom-up approach is therefore also needed, which would use local information and broad stakeholder involvement to produce a “tailored” source water protection program that addresses unique issues at the local level.

Migration of Bisphenol-A into the Natural Spring Water Packaged in Polycarbonate Carboys

Erdem, Y.K., Furkan, A., International Journal of Applied Science and Technology, January 2012

Bisphenol-A is a chemical widely used in epoxy resins, polycarbonate packaging, and the lacquer coatings of metal food packages worldwide. Its weak estrogenic character and possible health effects are well known. For this reason, the use of Bisphenol-A in food packaging is limited and daily human intake is strictly controlled. The declared specific migration limit is 0.6 ppm, and the tolerable daily intake set by EFSA and other authorities is 0.05 mg/kg body weight per day. EFSA and other authorities banned the manufacture and use of Bisphenol-A in baby bottles in 2010. In Turkey, 70% of the population lives in five metropolitan cities, and drinking water consumption is mostly supplied by the packaged drinking water industry; household and bulk usage is covered by natural spring and natural mineral water packaged in 19-liter polycarbonate carboys. The authors therefore investigated the possible migration of Bisphenol-A into drinking water packaged in polycarbonate carboys. A screening test was first carried out on samples supplied from two major cities; five different brands of packaged water were then stored at 4, 25, and 35 °C for 60 days, and Bisphenol-A content was determined at set intervals. BPA migration remained at least 450 times below the EFSA specific migration limit during 60 days of storage under these conditions.

Bottled Water & Tap Water: Just the Facts

Drinking Water Research Foundation, October 2011

The information presented in this report supports the conclusion that drinking water, whether from the tap or a bottle, is generally safe, and that regulatory requirements for both tap water and bottled water provide Americans with clean, safe drinking water. There are some differences in regulations for each, but those differences highlight the differences between drinking water delivered by a public water system and drinking water delivered to the consumer in a sealed container. Perhaps the most notable difference between tap water and bottled water is the method of delivery. Community water systems deliver water to consumers (businesses and private residences) through miles of underground iron (unlined and poly-lined), PVC, and lead service lines that can be subject to leakage as the system ages, as well as to accidental failures, resulting in the risk of post-treatment contamination of the water delivered to consumers. Bottled water is delivered to consumers in sanitary, sealed containers that were filled in a bottling facility under controlled conditions in a fill room.

Bromate reduction in simulated gastric juice

Cotruvo, J.A., et al., e-Journal AWWA, November 2010

This article advocates for a revised risk assessment for bromate to reflect presystemic chemistry not usually considered when low-dose risks are calculated from high-dose toxicology data. Because of high acidity and the presence of reducing agents, presystemic decomposition of bromate can begin in the stomach, which should contribute to lower-than-expected doses to target organs. In this research, bromate decomposition kinetics with simulated stomach/gastric juice were studied to determine the risk of environmentally relevant exposure to bromate. The current work is the first step in a series of studies that the authors are conducting to better estimate the hypothetical low-dose risks to humans from drinking water ingestion and thus arrive at more appropriate maximum contaminant levels (MCLs). It is the authors’ belief that additional kinetics and metabolism research will demonstrate that the human risk from ingestion of compounds in drinking water is less than originally believed and will lead to MCLs and MCL goals that are more scientifically based.

Drinking Water and Risk of Stroke

Saposnik, G., MD, MSc, FAHA, Stroke, October 2010

In the present issue of Stroke, the authors investigate the association between low-level arsenic exposure in drinking water and ischemic stroke admissions in Michigan. They found that even low exposure to arsenic is associated with an increased incident risk of stroke (relative risk, 1.03; 95% CI, 1.01 to 1.05 per µg/L increase in arsenic concentration). The authors also examined whether that exposure was associated with other nonvascular conditions (hernia, duodenal ulcer) not expected to increase in risk. Comparing zip codes in Genesee County at the 90th percentile of arsenic levels (21.6 µg/L) with those at the 10th percentile (0.30 µg/L), there was a 91% increase in risk of stroke admission (relative risk, 1.91; 95% CI, 1.27 to 2.88). The results were consistent in showing an increased risk for stroke, but not for the control medical conditions (hernia and duodenal ulcer). Moreover, they found a graded effect: a higher incident risk among those individuals exposed to higher water concentrations of arsenic.
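A quick consistency check: compounding the per-µg/L relative risk of 1.03 across the difference between the 90th- and 10th-percentile arsenic levels roughly reproduces the reported 91% increase. A minimal sketch:

```python
rr_per_ug = 1.03         # relative risk per ug/L increase in arsenic concentration
high, low = 21.6, 0.30   # 90th- and 10th-percentile zip code arsenic levels, ug/L

rr = rr_per_ug ** (high - low)
print(f"Implied relative risk: {rr:.2f}")  # ~1.88, close to the reported 1.91
```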

Association between children’s blood lead levels, lead service lines, and water disinfection

Brown, M.J., Raymond, J., Homa, D., Kennedy, C., Sinks, T., Environmental Research, October 2010

Evaluate the effect of changes in the water disinfection process, and presence of lead service lines (LSLs), on children’s blood lead levels (BLLs) in Washington, DC. Three cross-sectional analyses examined the relationship of LSLs and changes in water disinfectant with BLLs in children <6 years of age. The study population was derived from the DC Childhood Lead Poisoning Prevention Program blood lead surveillance system of children who were tested and whose blood lead test results were reported to the DC Health Department. The Washington, DC Water and Sewer Authority (WASA) provided information on LSLs. The final study population consisted of 63,854 children with validated addresses. Controlling for age of housing, LSL was an independent risk factor for BLLs ≥10 μg/dL and ≥5 μg/dL, even during time periods when water levels met the US Environmental Protection Agency (EPA) action level of 15 parts per billion (ppb). When chloramine alone was used to disinfect water, the risk for BLL in the highest quartile among children in homes with LSL was greater than when either chlorine or chloramine with orthophosphate was used. For children tested after LSLs in their houses were replaced, those with partially replaced LSLs were >3 times as likely to have BLLs ≥10 μg/dL versus children who never had LSLs. LSLs were a risk factor for elevated BLLs even when WASA met the EPA water action level. Changes in water disinfection can enhance the effect of LSLs and increase lead exposure. Partially replacing LSLs may not decrease the risk of elevated BLLs associated with LSL exposure.

When is the Next Boil Water Alert?

Water Technology, August 2010

A common theme we see on a daily basis relates to drinking water infrastructure. We track news throughout the world that impacts the drinking water industry, and one of the most frequent things we see is notices from agencies and organizations about the need for communities to boil water in order to combat possible contamination. In some parts of the world, boiling water is the norm due to water supply issues. Often, these areas may be limited in their ability to develop economically, as clean water is such an integral part of daily life. It is in the developed world, however, where we have been seeing a large increase in the number of such notices.

Climate Change, Water, and Risk: Current Water Demands Are Not Sustainable

www.nrdc.org, July 2010

Climate change will have a significant impact on the sustainability of water supplies in the coming decades. A new analysis, performed by consulting firm Tetra Tech for the Natural Resources Defense Council (NRDC), examined the effects of global warming on water supply and demand in the contiguous United States. The study found that more than 1,100 counties (one-third of all counties in the lower 48) will face higher risks of water shortages by mid-century as the result of global warming. More than 400 of these counties will face extremely high risks of water shortages.

Water Disinfection By-Products and the Risk of Specific Birth Defects: A Population-Based Cross-Sectional Study in Taiwan

Hwang, B.-F., Jaakkola, J., Guo, H.-R., Environmental Health, June 2008

Recent findings suggest that exposure to disinfection by-products may increase the risk of birth defects. Previous studies have focused mainly on birth defects in general or groups of defects. The objective of the present study was to assess the effect of water disinfection by-products on the risk of the most common specific birth defects. We conducted a population-based cross-sectional study of 396,049 Taiwanese births in 2001-2003 using information from the Birth Registry and Waterworks Registry. We compared the risk of the eleven most common specific defects in four disinfection by-product exposure categories based on the levels of total trihalomethanes (TTHMs), representing high (TTHMs ≥20 µg/L), medium (TTHMs 10-19 µg/L), and low (TTHMs 5-9 µg/L) exposure, with 0-4 µg/L as the reference category. In addition, we conducted a meta-analysis of the results from the present and previous studies focusing on the same birth defects.

Maternal Exposure to Water Disinfection By-products During Gestation and Risk of Hypospadias

Luben, T.J., Nuckols, J.R., Mosley, B.S., Hobbs, C., Reif, J.S., Occupational and Environmental Medicine, June 2008

The use of chlorine for water disinfection results in the formation of numerous contaminants called disinfection by-products (DBPs), which may be associated with birth defects, including urinary tract defects. We used Arkansas birth records (1998-2002) to conduct a population-based case-control study investigating the relationship between hypospadias and two classes of DBPs, trihalomethanes (THM) and haloacetic acids (HAA). We utilised monitoring data, spline regression and geographical information systems (GIS) to link daily concentrations of these DBPs from 263 water utilities to 320 cases and 614 controls. We calculated ORs for hypospadias and exposure to DBPs between 6 and 16 weeks’ gestation, and conducted subset analyses for exposure from ingestion, and metrics incorporating consumption, showering and bathing. We found no increase in risk when women in the highest tertiles of exposure were compared to those in the lowest for any DBP. When ingestion alone was used to assess exposure among a subset of 40 cases and 243 controls, the intermediate tertiles of exposure to total THM and the five most common HAA had ORs of 2.11 (95% CI 0.89 to 5.00) and 2.45 (95% CI 1.06 to 5.67), respectively, compared to women with no exposure. When exposure to total THM from consumption, showering and bathing exposures was evaluated, we found an OR of 1.96 (95% CI 0.65 to 6.42) for the highest tertile of exposure and weak evidence of a dose-response relationship. Our results provide little evidence for a positive relationship between DBP exposure during gestation and an increased risk of hypospadias but emphasize the necessity of including individual-level data when assessing exposure to DBPs.

Formation of N-Nitrosamines from Eleven Disinfection Treatments of Seven Different Surface Waters

Zhao, Y.-Y., et al., Environmental Science & Technology, May 2008

Formation of nine N-nitrosamines has been investigated when seven different source waters representing various qualities were each treated with eleven bench-scale disinfection processes, without addition of nitrosamine precursors. These disinfection treatments included chlorine (OCl−), chloramine (NH2Cl), chlorine dioxide (ClO2), ozone (O3), ultraviolet (UV), advanced oxidation processes (AOP), and combinations. The total organic carbon (TOC) of the seven source waters ranged from 2 to 24 mg L−1. The disinfected water samples and the untreated source waters were analyzed for nine nitrosamines using a solid phase extraction and liquid chromatography-tandem mass spectrometry method. Prior to any treatment, N-nitrosodimethylamine (NDMA) was detected ranging from 0 to 53 ng L−1 in six of the seven source waters, and its concentrations increased in the disinfected water samples (0–118 ng L−1). N-nitrosodiethylamine (NDEA), N-nitrosomorpholine (NMor), and N-nitrosodiphenylamine (NDPhA) were also identified in some of the disinfected water samples. NDPhA (0.2–0.6 ng L−1) was formed after disinfection with OCl−, NH2Cl, O3, and MPUV/OCl−. NMEA was produced with OCl− and MPUV/OCl−, and NMor formation was associated with O3. In addition, UV treatment alone degraded NDMA; however, UV/OCl− and AOP/OCl− treatments produced higher amounts of NDMA compared to UV and AOP alone, respectively. These results suggest that UV degradation or AOP oxidation treatment may provide a source of NDMA precursors. This study demonstrates that environmental concentrations and mixtures of unknown nitrosamine precursors in source waters can form NDMA and other nitrosamines.

N,N-Dimethylsulfamide as Precursor for N-Nitrosodimethylamine (NDMA) Formation upon Ozonation and its Fate During Drinking Water Treatment

Schmidt, C.K., Brauch, H.-J., Environmental Science & Technology, April 2008

Application and microbial degradation of the fungicide tolylfluanid gives rise to a new decomposition product named N,N-dimethylsulfamide (DMS). In Germany, DMS was found in groundwaters and surface waters with typical concentrations in the range of 100-1000 ng/L and 50-90 ng/L, respectively. Laboratory-scale and field investigations concerning its fate during drinking water treatment showed that DMS cannot be removed via riverbank filtration, activated carbon filtration, flocculation, or oxidation and disinfection procedures based on hydrogen peroxide, potassium permanganate, chlorine dioxide, or UV irradiation. Even nanofiltration does not provide sufficient removal efficiency. During ozonation, about 30-50% of DMS is converted to the carcinogenic N-nitrosodimethylamine (NDMA). The NDMA formed is biodegradable and can be at least partially removed by subsequent biologically active drinking water treatment steps, including sand or activated carbon filtration. Disinfection with hypochlorous acid converts DMS to as-yet-unknown degradation products, but not to NDMA or 1,1-dimethylhydrazine (UDMH).

 

Risk of Birth Defects in Australian Communities with High Brominated Disinfection By-product Levels

Chisholm, K., et al., Environmental Health Perspectives, April 2008

By international standards, water supplies in Perth, Western Australia, contain high trihalomethane (THM) levels, particularly the brominated forms. Geographic variability in these levels provided an opportunity to examine cross-city spatial relationships between THM exposure and rates of birth defects (BDs). Our goal was to examine BD rates by exposure to THMs with a highly brominated fraction in metropolitan locations in Perth, Western Australia. We collected water samples from 47 separate locations and analyzed them for total and individual THM concentrations (micrograms per liter), including separation into brominated forms. We classified collection areas by total THM (TTHM) concentration: low (<60 µg/L), medium (>60 to <130 µg/L), and high (≥130 µg/L). We also obtained deidentified registry-based data on total births and BDs (2000-2004 inclusive) from post codes corresponding to water sample collection sites and used binomial logistic regression to compare the frequency of BDs aggregately and separately for the TTHM exposure groups, adjusting for maternal age and socioeconomic status. Total THMs ranged from 36 to 190 µg/L. A high proportion of the THMs were brominated (on average, 92%). Women living in high-TTHM areas showed an increased risk of any BD [odds ratio (OR) = 1.22; 95% confidence interval (CI), 1.01-1.48] and for the major category of any cardiovascular BD (OR = 1.62; 95% CI, 1.04-2.51), compared with women living in low-TTHM areas. Brominated forms constituted the significant fraction of THMs in all areas. Small but statistically significant increases in risks of BDs were associated with residence in areas with high THMs.

EPA – FACTOIDS: Drinking Water and Ground Water Statistics for 2007

U.S. Environmental Protection Agency, March 2008

There are approximately 156,000 public drinking water systems in the United States. Each of these systems regularly supplies drinking water to at least 25 people or 15 service connections. Beyond their common purpose, the 156,000 systems vary widely. The following tables group water systems into categories that show their similarities and differences. For example, the first table shows that most people in the US (286 million) get their water from a community water system. There are approximately 52,000 community water systems, but just eight percent of those systems (4,048) serve 82 percent of the people. The second table shows that more water systems have groundwater than surface water as a source, but more people drink from a surface water system. Other tables break down these national numbers by state, territory, and EPA region.

This package also contains figures on the types and locations of underground injection control wells. EPA and states regulate the placement and operation of these wells to ensure that they do not threaten underground sources of drinking water. The underground injection control program statistics are based on separate reporting from the states to EPA. The drinking water system statistics on the following pages are taken from the Safe Drinking Water Information System/Federal version (SDWIS/Fed). SDWIS/Fed is the U.S. Environmental Protection Agency’s official record of public drinking water systems, their violations of state and EPA regulations, and enforcement actions taken by EPA or states as a result of those violations. EPA maintains the database using information collected and submitted by the states. Notice: Compliance statistics are based on violations reported by states to the EPA Safe Drinking Water Information System. EPA is aware of inaccuracies and underreporting of some data in this system. We are working with the states to improve the quality of the data.

Human Health Risk Assessment of Chlorinated Disinfection By-products in Drinking Water Using a Probabilistic Approach

Hamidin, N., Yu, Q.J., Connell, D.W., Water Research, March 2008

The presence of chlorinated disinfection by-products (DBPs) in drinking water is a public health issue, due to their possible adverse health effects on humans. To gauge the risk of chlorinated DBPs on human health, a risk assessment of chloroform (trichloromethane (TCM)), bromodichloromethane (BDCM), dibromochloromethane (DBCM), bromoform (tribromomethane (TBM)), dichloroacetic acid (DCAA) and trichloroacetic acid (TCAA) in drinking water was carried out using probabilistic techniques. Literature data on exposure concentrations from more than 15 different countries and adverse health effects on test animals as well as human epidemiological studies were used. The risk assessment showed no overlap between the highest human exposure dose (EXPD) and the lowest human equivalent dose (HED) from animal test data, for TCM, BDCM, DBCM, TBM, DCAA and TCAA. All the HED values were approximately 10⁴–10⁵ times higher than the 95th percentiles of EXPD. However, from the human epidemiology data, there was a positive overlap between the highest EXPD and the lifetime average daily doses (LADDH) for TCM, BDCM, DCAA and TCAA. This suggests that there are possible adverse health risks such as a small increased incidence of cancers in males and developmental effects on infants. However, the epidemiological data comprised several risk factors and exposure classification levels, which may affect the overall results.

Drinking Water Disinfection By-Products and Time to Pregnancy

Maclehose, R.F., Savitz, D.A., Herring, A.H., Hartmann, K.E., Singer, P.C., Weinberg, H.S., Epidemiology, March 2008

Laboratory evidence suggests tap water disinfection by-products (DBPs) could have an effect very early in pregnancy, typically before clinical detectability. Undetected early losses would be expected to increase the reported number of cycles to clinical pregnancy. We investigated the association between specific DBPs (trihalomethanes, haloacetic acids, brominated-trihalomethanes, brominated-haloacetic acids, total organic halides, and bromodichloromethane) and time to pregnancy among women who enrolled in a study of drinking water and reproductive outcomes. We quantified exposure to DBPs through concentrations in tap water, quantity ingested through drinking, quantity inhaled or absorbed while showering or bathing, and total integrated exposure. The effect of DBPs on time to pregnancy was estimated using a discrete time hazard model. Overall, we found no evidence of an increased time to pregnancy among women who were exposed to higher levels of DBPs. A modestly decreased time to pregnancy (i.e., increased fecundability) was seen among those exposed to the highest level of ingested DBPs, but not for tap water concentration, the amount absorbed while showering or bathing, or the integrated exposure. Our findings extend those of a recently published study suggesting a lack of association between DBPs and pregnancy loss.

Risk of waterborne illness via drinking water in the United States

Reynolds, K.A., Mena, K.D., Gerba, C.P., Reviews of Environmental Contamination & Toxicology, January 2008

Outbreaks of disease attributable to drinking water are not common in the U.S., but they do still occur and can lead to serious acute, chronic, or sometimes fatal health consequences, particularly in sensitive and immunocompromised populations. From 1971 to 2002, there were 764 documented waterborne outbreaks associated with drinking water, resulting in 575,457 cases of illness and 79 deaths (Blackburn et al. 2004; Calderon 2004); however, the true impact of disease is estimated to be much higher. If properly applied, current protocols in municipal water treatment are effective at eliminating pathogens from water. However, inadequate, interrupted, or intermittent treatment has repeatedly been associated with waterborne disease outbreaks. Contamination is not evenly distributed but rather affected by the number of pathogens in the source water, the age of the distribution system, the quality of the delivered water, and climatic events that can tax treatment plant operations. Private water supplies are not regulated by the USEPA and are generally not treated or monitored, although very few of the municipal systems involved in documented outbreaks exceeded the USEPA’s total coliform standard in the preceding 12 months (Craun et al. 2002). We provide here estimates of waterborne infection and illness risks in the U.S. based on the total number of water systems, source water type, and total populations exposed. Furthermore, we evaluated all possible illnesses associated with the microbial infection and not just gastroenteritis. Our results indicate that 10.7 M infections/yr and 5.4 M illnesses/yr occur in populations served by community groundwater systems; 2.2 M infections/yr and 1.1 M illnesses/yr occur in noncommunity groundwater systems; and 26.0 M infections/yr and 13.0 M illnesses/yr occur in municipal surface water systems. The total number of waterborne illnesses in the U.S. is therefore estimated to be 19.5 M/yr. Others have recently estimated waterborne illness rates of 12 M cases/yr (Colford et al. 2006) and 16 M cases/yr (Messner et al. 2006), yet our estimate considers all health outcomes associated with exposure to pathogens in drinking water rather than only gastrointestinal illness. Drinking water outbreaks exemplify known breaches in municipal water treatment and distribution processes and the failure of regulatory requirements to ensure water that is free of human pathogens. Water purification technologies applied at the point-of-use (POU) can be effective for limiting the effects of source water contamination, treatment plant inadequacies, minor intrusions in the distribution system, or deliberate posttreatment acts (i.e., bioterrorism). Epidemiological studies are conflicting on the benefits of POU water treatment. One prospective intervention study found that consumers of reverse-osmosis (POU) filtered water had 20-35% fewer gastrointestinal illnesses than those consuming regular tap water, with an excess of 14% of illness due to contaminants introduced in the distribution system (Payment 1991, 1997). Two other studies using randomized, blinded, controlled trials determined that the risks were equal among groups supplied with POU-treated water compared to untreated tap water (Hellard et al. 2001; Colford et al. 2003). For immunocompromised populations, POU water treatment devices are recommended by the CDC and USEPA as one treatment option for reducing risks of Cryptosporidium and other types of infectious agents transmitted by drinking water.
Other populations, including those experiencing “normal” life stages such as pregnancy, or those very young or very old, might also benefit from the utilization of additional water treatment options beyond the current multibarrier approach of municipal water treatment.
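The component estimates quoted above sum as reported; a quick check (figures taken directly from the abstract):

```python
# Estimated waterborne infections and illnesses (millions/yr), from the abstract.
infections = {"community groundwater": 10.7,
              "noncommunity groundwater": 2.2,
              "municipal surface water": 26.0}
illnesses = {"community groundwater": 5.4,
             "noncommunity groundwater": 1.1,
             "municipal surface water": 13.0}

print(f"Total infections: {sum(infections.values()):.1f} M/yr")  # 38.9 M/yr
print(f"Total illnesses:  {sum(illnesses.values()):.1f} M/yr")   # 19.5 M/yr, as stated
```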

Massive Microbiological Groundwater Contamination Associated with a Waterborne Outbreak in Lake Erie, South Bass Island, Ohio

Fong, T.-T., et al., Environmental Health Perspectives, June 2007

A groundwater-associated outbreak affected approximately 1,450 residents and visitors of South Bass Island, Ohio, between July and September 2004. To examine the microbiological quality of groundwater wells located on South Bass Island, we sampled 16 wells that provide potable water to public water systems during 15–21 September 2004. We tested groundwater wells for fecal indicators, enteric viruses and bacteria, and protozoa (Cryptosporidium and Giardia). The hydrodynamics of Lake Erie were examined to explore the possible surface water–groundwater interactions. All wells were positive for both total coliform and Escherichia coli. Seven wells tested positive for enterococci and Arcobacter (an emerging bacterial pathogen), and F+-specific coliphage was present in four wells. Three wells were positive for all three bacterial indicators, coliphages, and Arcobacter; adenovirus DNA was recovered from two of these wells. We found a cluster of the most contaminated wells at the southeast side of the island. Massive groundwater contamination on the island was likely caused by transport of microbiological contaminants from wastewater treatment facilities and septic tanks to the lake and the subsurface, after extreme precipitation events in May–July 2004. This likely raised the water table, saturated the subsurface, and along with very strong Lake Erie currents on 24 July, forced a surge in water levels and rapid surface water–groundwater interchange throughout the island. Landsat images showed massive influx of organic material and turbidity surrounding the island before the peak of the outbreak. These combinations of factors and information can be used to examine vulnerabilities in other coastal systems. Both wastewater and drinking water issues are now being addressed by the Ohio Environmental Protection Agency and the Ohio Department of Health.

Analysis of Compliance and Characterization of Violations of the Total Coliform Rule

U.S. Environmental Protection Agency, April 2007

Total coliforms have long been used in drinking water regulations as an indicator of the adequacy of water treatment and the integrity of the distribution system. Total coliforms are a group of closely related bacteria that are generally harmless. In drinking water systems, total coliforms react to treatment in a manner similar to most bacterial pathogens and many viral pathogens. Thus, the presence of total coliforms in the distribution system can indicate that the system is also vulnerable to the presence of pathogens in the system. (EPA, June 2001, page 7) Total coliforms are the indicators used in the existing Total Coliform Rule (TCR). EPA is undertaking “a rulemaking process to initiate possible revisions to the TCR. As part of this process, EPA believes it may be appropriate to include this rulemaking in a wider effort to review and address broader issues associated with drinking water distribution systems.” (see Federal Register 68 FR 19030 and 68 FR 42907). Since the promulgation of the TCR, EPA has received stakeholder feedback suggesting modifications to the TCR to reduce the implementation burden. The purpose of this paper is to provide information on the number and frequency of violations of the TCR and to further characterize the frequency with which different types and sizes of systems incur violations. Although EPA explores some statistical testing in this paper, the paper concentrates on presenting the data as they appear in SDWIS/FED. Information on these frequencies will be useful in supporting several EPA initiatives, particularly the effort to review and possibly revise the TCR. This paper has been undertaken as part of the review of the TCR.

Drowning in Disinfection Byproducts? Assessing Swimming Pool Water

DeMarini, D.M., et al., Environmental Science & Technology, January 2007

Disinfection is mandatory for swimming pools: public pools are usually disinfected by gaseous chlorine or sodium hypochlorite and cartridge filters; home pools typically use stabilized chlorine. These methods produce a variety of disinfection byproducts (DBPs), such as trihalomethanes (THMs), which are regulated carcinogenic DBPs in drinking water that have been detected in the blood and breath of swimmers and of nonswimmers at indoor pools. Also produced are halogenated acetic acids (HAAs) and haloketones, which irritate the eyes, skin, and mucous membranes; trichloramine, which is linked with swimming-pool-associated asthma; and halogenated derivatives of UV sunscreens, some of which show endocrine effects. Precursors of DBPs include human body substances, chemicals used in cosmetics and sunscreens, and natural organic matter. Analytical research has also focused on the identification of an additional portion of unknown DBPs using gas chromatography (GC)/mass spectrometry (MS) and liquid chromatography (LC)/MS/MS with derivatization. Child swimmers have an increased risk of developing asthma and infections of the respiratory tract and ear. A 1.6-2.0-fold increased risk for bladder cancer has been associated with swimming or showering/bathing with chlorinated water. Bladder cancer risk from THM exposure (all routes combined) was greatest among those with the GSTT1-1 gene. This suggests a mechanism involving distribution of THMs to the bladder by dermal/inhalation exposure and activation there by GSTT1-1 to mutagens. DBPs may be reduced by engineering and behavioral means, such as applying new oxidation and filtration methods, reducing bromide and iodide in the source water, increasing air circulation in indoor pools, and assuring the cleanliness of swimmers. The positive health effects gained by swimming can be increased by reducing the potential adverse health risks.

An approach for developing a national estimate of waterborne disease due to drinking water and a national estimate model application

Messner, M., et al., Journal of Water and Health, 04.suppl2:201–240, July 2006

In this paper, the US Environmental Protection Agency (EPA) presents an approach and a national estimate of drinking water related endemic acute gastrointestinal illness (AGI) that uses information from epidemiologic studies. There have been a limited number of epidemiologic studies that have measured waterborne disease occurrence in the United States. For this analysis, we assume that a certain unknown incidence of AGI in each public drinking water system is due to drinking water and that a statistical distribution of the different incidence rates for the population served by each system can be estimated to inform a mean national estimate of AGI illness due to drinking water. Data from public water systems suggest that the incidence rate of AGI due to drinking water may vary by several orders of magnitude. In addition, data from epidemiologic studies show AGI incidence due to drinking water ranging from essentially none (or less than the study detection level) to a rate of 0.26 cases per person-year. Considering these two perspectives collectively, and associated uncertainties, EPA has developed an analytical approach and model for generating a national estimate of annual AGI illness due to drinking water. EPA developed a national estimate of waterborne disease to address, in part, the 1996 Safe Drinking Water Act Amendments. The national estimate uses best available science, but also recognizes gaps in the data to support some of the model assumptions and uncertainties in the estimate. Based on the model presented, EPA estimates a mean incidence of AGI attributable to drinking water of 0.06 cases per person-year (with a 95% credible interval of 0.02–0.12). The mean estimate represents approximately 8.5% of cases of AGI illness due to all causes among the population served by community water systems. The estimated incidence translates to 16.4 million cases/year among the same population. The estimate illustrates the potential usefulness and challenges of the approach, and provides a focus for discussions of data needs and future study designs. Areas of major uncertainty that currently limit the usefulness of the approach are discussed in the context of the estimate analysis.
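
The abstract's three headline figures are internally consistent, as a quick back-of-the-envelope check shows. This is arithmetic on the reported numbers, not EPA's model.

```python
# Consistency check on the reported figures: 0.06 cases/person-year,
# 16.4 million cases/year, and an 8.5% share of all-cause AGI.
mean_incidence = 0.06     # AGI cases per person-year attributable to drinking water
annual_cases = 16.4e6     # estimated cases/year among the CWS-served population
share_of_all_agi = 0.085  # fraction of all-cause AGI attributed to drinking water

implied_population = annual_cases / mean_incidence   # ~273 million served
total_agi_burden = annual_cases / share_of_all_agi   # implied all-cause AGI

print(f"Implied population served by CWS: {implied_population/1e6:.0f} million")
print(f"Implied all-cause AGI burden: {total_agi_burden/1e6:.0f} million cases/year")
```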

Tap Water Linked to Increase in Bladder Cancer

Reynolds, K.A., Water Conditioning & Purification, July 2006

As water treatment professionals, maybe you’ve been alerted to news stories suggesting a connection between tap water consumption and bladder cancer, but are these headlines true or just media hype? Although the most recently reported association of tap water consumption with bladder cancer is indeed based on numerous epidemiological studies with an international scope, all scientific research must be carefully evaluated, not just in terms of the data found but also for the information possibly missed. The study that has everyone talking again about tap water consumption and its relationship to bladder cancer was published in the International Journal of Cancer (April 2006). Looking at data from six epidemiological studies conducted in five countries (Canada, Finland, France, and Italy, plus two studies in the United States), a significant association was found between tap water consumption and bladder cancer among men. The risk increased with consumption of greater volumes, suggesting that carcinogenic chemicals in tap water were responsible for the increased risk. While the information presented appears to be sound, it is important to understand the limitations of the study approach so that the data can be appropriately analyzed with respect to public health significance.

Despite a gender bias and inconsistent reports in the historical literature, this study appears to stand on sturdy legs, or at least to justify continued research. As mentioned earlier, epidemiology is not a very sensitive science and is complicated by unknown confounders. In addition, this study provides no evidence as to what specific factors related to tap water are causing an increase in cancer, whereas other drinking water sources (i.e., bottled water) show no association. Water is clearly a heterogeneous mix of contaminants, with vast geographical and temporal fluctuations. Little is known about the combined effects of multiple contaminants found in drinking water, so a study of single contaminants and their association with cancer risks would not provide a complete picture of overall exposures.
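
The pooled analysis at the heart of such a study combines per-study effect estimates. A minimal inverse-variance (fixed-effect) pooling sketch shows the mechanics; the odds ratios and confidence intervals below are invented placeholders, not values from the International Journal of Cancer paper.

```python
# Fixed-effect (inverse-variance) pooling of odds ratios across studies.
# All numeric inputs are hypothetical, for illustration only.
import math

studies = [  # (odds_ratio, lower 95% CI, upper 95% CI)
    (1.3, 1.0, 1.7),
    (1.1, 0.8, 1.5),
    (1.5, 1.1, 2.0),
]

weights, weighted_logs = [], []
for or_, lo, hi in studies:
    log_or = math.log(or_)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE back-calculated from CI width
    w = 1 / se**2                                    # inverse-variance weight
    weights.append(w)
    weighted_logs.append(w * log_or)

pooled_log = sum(weighted_logs) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
print(f"Pooled OR: {math.exp(pooled_log):.2f} "
      f"(95% CI {math.exp(pooled_log - 1.96*pooled_se):.2f}-"
      f"{math.exp(pooled_log + 1.96*pooled_se):.2f})")
```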

Volatile Organic Compounds in the Nation’s Drinking-Water Supply Wells – What Findings May Mean to Human Health

U.S. Geological Survey, June 2006

When volatile organic compounds (VOCs) are detected in samples from drinking-water supply wells, it is important to understand what these results may mean to human health. As a first step toward understanding VOC occurrence in the context of human health, a screening-level assessment was conducted by comparing VOC concentrations to human-health benchmarks. One sample from each of 3,497 domestic and public wells was analyzed for 55 VOCs; samples were collected prior to treatment or blending. At least one VOC was detected in 623 well samples (about 18 percent of all well samples) at a threshold of 0.2 part per billion. Eight of the 55 VOCs had concentrations greater than human-health benchmarks in 45 well samples (about 1 percent of all well samples); these concentrations may be of potential human-health concern if the water were to be ingested without treatment for many years. VOC concentrations were less than human-health benchmarks in most well samples with VOC detections, indicating that adverse effects are unlikely to occur, even if water with such concentrations were to be ingested over a lifetime. Seventeen VOCs may warrant further investigation because their concentrations were greater than, or approached, human-health benchmarks.
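
The screening step described above is essentially a benchmark-quotient comparison. The sketch below illustrates it with hypothetical compounds, concentrations, and benchmark values; none of these numbers are USGS data.

```python
# Illustrative benchmark-quotient screening in the spirit of the USGS
# assessment: divide each detected concentration by a human-health
# benchmark and flag results of potential concern. All values hypothetical.
samples = {  # VOC -> (detected concentration, human-health benchmark), ug/L
    "VOC-A": (6.0, 5.0),
    "VOC-B": (12.0, 70.0),
    "VOC-C": (1.5, 20.0),
}

for voc, (conc, benchmark) in samples.items():
    quotient = conc / benchmark
    if quotient >= 1:
        note = "greater than benchmark -> potential human-health concern"
    elif quotient >= 0.1:
        note = "approaches benchmark -> may warrant further investigation"
    else:
        note = "well below benchmark -> adverse effects unlikely"
    print(f"{voc}: quotient {quotient:.2f} ({note})")
```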

Analysis of Bromate and Bromide in Blood

Quinones, O., Snyder, S.A., Cotruvo, J.A., Fisher, J.W., Toxicology, April 2006

Bromate is a regulated disinfection byproduct primarily associated with the ozonation of water containing bromide, but also is a byproduct of hypochlorite used to disinfect water. To study the pharmacokinetics of bromate, it is necessary to develop a robust and sensitive analytical method for the identification and quantitation of bromate in blood. A critical issue is the extent to which bromate is degraded presystemically and in blood at low (environmentally relevant) doses of ingested bromate as it is delivered to target tissue. A simple isolation procedure was developed using blood plasma spiked with various levels of bromate and bromide. Blood proteins and lipids were precipitated from plasma using acetonitrile. The resulting extracts were analyzed by ion-chromatography with inductively-coupled plasma mass spectrometry (IC-ICP/MS), with a method reporting limit of 5 ng/mL plasma for both bromate and bromide. Plasma samples purchased commercially were spiked with bromate and stored up to 7 days. Over the 7 day storage period, bromate decay remained under 20% for two spike doses. Decay studies in plasma samples from spiked blood drawn from live rats showed significant bromate decay within short periods of time preceding sample freezing, although samples which were spiked, centrifuged and frozen immediately after drawing yielded excellent analytical recoveries.

Research Strategy for Developing Key Information on Bromate’s Mode of Action

Bull, R.J. and Cotruvo, J.A., Toxicology, April 2006

Bromate is produced when ozone is used to treat waters that contain trace amounts of bromide ion. It is also a contaminant of hypochlorite solutions produced by electrolysis of salt that contains bromide. Both ozone and hypochlorite are extensively used to disinfect drinking water, a process that is credited with reducing the incidence of waterborne infectious diseases around the world. In studies on experimental animals, bromate has been consistently demonstrated to induce cancer, although there is evidence of substantial species differences in sensitivity (rat > mouse > hamster). There are no data to indicate bromate is carcinogenic in humans. A critical issue for the continued use of ozone as a disinfectant in bromide-containing waters is whether current predictions of carcinogenic risk, which are based on carcinogenic responses in male rats treated with bromate, remain accurate at the much lower exposure levels experienced by humans. Thiol-dependent oxidative damage to guanine in DNA is a plausible mode of action for bromate-induced cancer. However, other mechanisms may contribute to the response, including the accumulation of α2u-globulin in the kidney of the male rat. To provide direction to institutions that have an interest in clarifying the toxicological risks that bromate in drinking water might pose, a workshop funded by the Awwa Research Foundation was convened to lay out a research strategy that, if implemented, could clarify this important public health issue. The technical issues that underlie the deliberations of the workshop are provided in a series of technical papers. The present manuscript summarizes the conclusions of the workgroup with respect to the type and timing of research that should be conducted. The research approach is outlined in four distinct phases that lay out alternative directions as the research plan is implemented. Phase I is designed to quantify pre-systemic degradation, absorption, distribution, and metabolism of bromate; to associate these with key events for the induction of cancer; and to develop an initial pharmacokinetic (PK) model based on preliminary studies. Phase II will be implemented if it appears that there is a linear relationship between external dose and key event responses; it is designed to gather carcinogenesis data in female rats in the absence of α2u-globulin-induced nephropathy, which the workgroup concluded was a probable contributor to the responses observed in the male rats for which detailed dose–response data were collected. If the key events and external dosimetry are found not to be linear in Phase I, Phase III is initiated with a screening study of the auditory toxicity of bromate to determine whether it is likely to be exacerbated by chronic exposure. If it is, auditory toxicity will be further evaluated in Phase IV. If auditory toxicity is determined unlikely to occur, an alternative chronic study in female rats to the one identified in Phase II will be implemented to include exposure in utero; this was recommended to address the possibility that the fetus may be more susceptible. One of three options is to be implemented in Phase IV, depending upon whether preliminary data indicate that chronic auditory toxicity, reproductive and/or developmental toxicities, or a combination of these outcomes is necessary to characterize the toxicology of low-dose exposures to bromate.
Each phase of the research will be accompanied by further development of pharmacokinetic models to guide collection of appropriate data to meet the needs of the more sophisticated studies. It is suggested that a Bayesian approach be utilized to develop a final risk model based upon measurement of prior observations from the Phase I studies and the set of posterior observations that would be obtained from whichever chronic study is conducted.

Keywords: Bromate; Research to improve risk assessment; Drinking water
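
The phased branching described above can be read as a simple decision flow. The sketch below encodes it purely as a reading aid; the boolean arguments stand in for Phase I/III findings, and nothing here is data or code from the paper.

```python
# A reading aid only: the workgroup's phased decision logic as a tiny
# decision function. Inputs represent hypothetical Phase I/III findings.
def next_step(dose_response_linear: bool, auditory_toxicity_likely: bool) -> str:
    """Return the study the strategy points to after the Phase I PK work."""
    if dose_response_linear:
        # Linear key-event/dose relationship: chronic carcinogenesis study
        # in female rats, avoiding alpha-2u-globulin nephropathy.
        return "Phase II: female-rat carcinogenesis study"
    if auditory_toxicity_likely:
        # Phase III screening suggests chronic exposure may worsen hearing loss.
        return "Phase IV: further auditory toxicity evaluation"
    # Otherwise: alternative chronic female-rat study with in utero exposure.
    return "Phase IV: chronic study including in utero exposure"

print(next_step(dose_response_linear=False, auditory_toxicity_likely=False))
```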

Experimental Results from the Reaction of Bromate Ion with Synthetic and Real Gastric Juices

Keith, J.D., Pacey, G.E., Cotruvo, J.A., Gordon, G., Toxicology, February 2006

This study was designed to identify and quantify the effects of reducing agents on the rate of bromate ion reduction in real and synthetic gastric juice. This could be the first element in the sequence of a pharmacokinetic description of the fate of bromate ion entering the organism, being metabolized, and subsequently being tracked through the system to the target cell or eliminated. Synthetic gastric juice containing H+ and Cl− did exhibit reduced bromate ion levels, but at a rate that was too slow for a significant amount of bromate to be reduced under typical stomach retention time conditions. The reaction orders for Cl− and H+ were 1.50 and 2.0, respectively. Addition of the reducing agents hydrogen sulfide (which was shown to be present and quantified in real gastric juice), glutathione, and/or cysteine increased the rate of bromate ion loss. All of the reactions showed significant pH effects. Half-lives as short as 2 min were measured for bromate ion reduction in 0.17 M H+ and Cl− and 10−4 M H2S. Therefore, the lifetime of bromate ion in solutions containing typical gastric juice concentrations of H+, Cl−, and H2S is 20–30 min. This rate should result in as much as a 99% reduction of bromate ion during its residence in the stomach. Bromate ion reduction in real gastric juice occurred at a rapid rate. A comparison of real and synthetic gastric juice containing H+, Cl−, cysteine, glutathione, and hydrogen sulfide showed that the component most responsible for the considerable decrease of the concentration of bromate ion in the stomach is hydrogen sulfide.

Keywords: Bromate; Gastric juice; Ion chromatography; Hydrogen sulfide
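
First-order decay arithmetic connects the reported half-lives to the roughly 99% stomach reduction. In the sketch below, the residence time and half-lives are taken from the abstract's ranges; the calculation itself is illustrative, not from the paper.

```python
# Fraction of bromate ion remaining after first-order decay:
# remaining = 0.5 ** (t / t_half)
def fraction_remaining(t_min: float, half_life_min: float) -> float:
    """Fraction of bromate ion left after t_min of first-order decay."""
    return 0.5 ** (t_min / half_life_min)

residence = 30.0  # min, within the typical stomach retention cited above
for t_half in (2.0, 4.0):  # min, within the short half-lives measured
    remaining = fraction_remaining(residence, t_half)
    print(f"t1/2 = {t_half} min: {100 * (1 - remaining):.1f}% of bromate reduced")
```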

Bottled Water Production in the United States: How Much Ground Water is Actually Being Used?

Keith N. Eshelman, PhD, Associate Professor, University of Maryland Center for Environmental Studies, May 2005

A comprehensive, quantitative survey of bottled water producers in the U.S., presenting data collected on bottled water production, specifically production from ground water, the primary source of bottled water. Relative to other uses of ground water, bottled water production was found to be a de minimis user.

Analysis of the February 1999 Natural Resources Defense Council Report on Bottled Water

Drinking Water Research Foundation, 1999

In February 1999, the Natural Resources Defense Council (NRDC) issued a report entitled “Bottled Water: Pure Drink or Pure Hype?” that raised numerous allegations against bottled water. This document provides an extensive analysis and rebuttal of the NRDC report, detailing its errors and unsupported allegations.