Energy and sports drinks in children and adolescents

Principal author(s)

Catherine M Pound, Becky Blair; Canadian Paediatric Society, Nutrition and Gastroenterology Committee

Abstract

Sports drinks and caffeinated energy drinks (CEDs) are commonly consumed by youth. Both sports drinks and CEDs pose potential risks for the health of children and adolescents and may contribute to obesity. Sports drinks are generally unnecessary for children engaged in routine or play-based physical activity. CEDs may affect children and adolescents more than adults because they weigh less and thus experience greater exposure to stimulant ingredients per kilogram of body weight. Paediatricians need to recognize and educate patients and families on the differences between sports drinks and CEDs. Screening for the consumption of CEDs, especially when mixed with alcohol, should be done routinely. The combination of CEDs and alcohol may be a marker for higher risk of substance use or abuse and for other health-compromising behaviours.

Keywords: Alcohol; Caffeine; CEDs; Energy drinks; Sports drinks

Hydration status of community-dwelling seniors

Abstract: Dehydration is the most common fluid or electrolyte disorder among older persons. This study was designed to examine the hydration status of community-dwelling seniors.

Hydration in the Aging

Conclusion: With aging, body water stores decrease, thirst sensation is disturbed, and the kidneys are less able to concentrate urine, putting the elderly at increased risk of dehydration.

Evidence-based nutrition education: Elderly hydration issues

Dehydration is a serious, yet modifiable clinical problem for older adults. The purpose of this study was to evaluate and compare the impact of two hydration education methods, brochure and individual counseling, on clinical, cognitive, behavioral,…

Oral Hydration in Older Adults: Greater awareness is needed in preventing, recognizing, and treating dehydration.

OVERVIEW: Maintaining adequate fluid balance is an essential component of health at every stage of life. Age-related changes make older adults more vulnerable to shifts in water balance that can result in over-hydration or, more frequently, dehydration. This article reviews age-related changes, risk factors, assessment measures, and nursing interventions for dehydration.

Significant racial, ethnic, income disparities in hydration found among U.S. adults

Nearly a third of U.S. adults are not hydrated enough, and poorer adults as well as Black and Hispanic adults are at higher risk for poor hydration than wealthier and white adults, according to a new study from Harvard T.H. Chan School of Public Health.

Lack of access to clean, safe drinking water—as highlighted by recent water crises in communities such as Flint, Michigan—may be one of the main reasons for the disparities, the authors suggested.

The study appeared online July 20, 2017 in the American Journal of Public Health.

Effect of increased water intake on plasma copeptin in healthy adults

Published online June 3, 2017, in the European Journal of Nutrition (IF 3.239). Guillaume Lemetais · Olle Melander · Mariacristina Vecchio · Jeanne H. Bottin · Sofia Enhörning · Erica T. Perrier

Abstract

PURPOSE:

Inter-individual variation in median plasma copeptin is associated with incident type 2 diabetes mellitus, progression of chronic kidney disease, and cardiovascular events. In this study, we examined whether 24-h urine osmolality was associated with plasma copeptin and whether increasing daily water intake could impact circulating plasma copeptin.

METHODS:

This trial was a prospective study conducted at a single investigating center. Eighty-two healthy adults (age 23.6 ± 2.9 years, BMI 22.2 ± 1.5 kg/m², 50% female) were stratified based upon habitual daily fluid intake volumes: arm A (50-80% of EFSA dietary reference values), arm B (81-120%), and arm C (121-200%). Following a baseline visit, arms A and B increased their drinking water intake to match arm C for a period of 6 consecutive weeks.

RESULTS:

At baseline, plasma copeptin was positively and significantly associated with 24-h urine osmolality (p = 0.002) and 24-h urine specific gravity (p = 0.003) but not with plasma osmolality (p = 0.18), 24-h urine creatinine (p = 0.09), and total fluid intake (p = 0.52). Over the 6-week follow-up, copeptin decreased significantly from 5.18 (3.3;7.4) to 3.90 (2.7;5.7) pmol/L (p = 0.012), while urine osmolality and urine specific gravity decreased from 591 ± 206 to 364 ± 117 mOsm/kg (p < 0.001) and from 1.016 ± 0.005 to 1.010 ± 0.004 (p < 0.001), respectively.

CONCLUSIONS:

At baseline, circulating levels of copeptin were positively associated with 24-h urine concentration in healthy young subjects with various fluid intakes. Moreover, this study shows, for the first time, that increased water intake over 6 weeks results in an attenuation of circulating copeptin.

CLINICAL TRIAL REGISTRATION NUMBER:

NCT02044679.

KEYWORDS:

Copeptin; Fluid intake; Hydration; Urine osmolality; Water intake; Drinking water

Effective Immediately: Healthcare Facilities Required to Reduce Legionellosis Risks from Tap Water

Published July 2017

By Kelly A. Reynolds, MSPH, PhD

If you follow On Tap frequently, you know that the bacterium Legionella has been a repeated topic in recent years. Once again, Legionella is at the forefront of discussions due to continuing waterborne outbreaks and new directives for prevention in healthcare facilities. On June 2, the Department of Health and Human Services, Centers for Medicare and Medicaid Services (CMS) issued a memo that will undoubtedly expand awareness of Legionella risks and further drive the implementation of preventative approaches.

The Unintended Consequences of Changes in Beverage Options and the Removal of Bottled Water on a University Campus

Elizabeth R. Berman, BS, and Rachel K. Johnson, PhD, MPH, RD

Accepted: January 16, 2015
Published Online: June 05, 2015

Objectives. We investigated how the removal of bottled water along with a minimum healthy beverage requirement affected the purchasing behavior, healthiness of beverage choices, and consumption of calories and added sugars of university campus consumers.

Methods. With shipment data as a proxy, we estimated bottled beverage consumption over 3 consecutive semesters: baseline (spring 2012), when a 30% healthy beverage ratio was enacted (fall 2012), and when bottled water was removed (spring 2013) at the University of Vermont. We assessed changes in number and type of beverages and per capita calories, total sugars, and added sugars shipped.

Results. Per capita shipments of bottles, calories, sugars, and added sugars increased significantly when bottled water was removed. Shipments of healthy beverages declined significantly, whereas shipments of less healthy beverages increased significantly. As bottled water sales dropped to zero, sales of sugar-free beverages and sugar-sweetened beverages increased.

Conclusions. The bottled water ban did not reduce the number of bottles entering the waste stream from the university campus, the ultimate goal of the ban. With the removal of bottled water, consumers increased their consumption of less healthy bottled beverages.

Nationwide reconnaissance of contaminants of emerging concern in source and treated drinking waters of the United States

Glassmeyer, S.T., et al., Science of The Total Environment, 581-582:909-922, March 2017

When chemical or microbial contaminants are assessed for potential effect or possible regulation in ambient and drinking waters, a critical first step is determining if the contaminants occur and if they are at concentrations that may cause human or ecological health concerns. To this end, source and treated drinking water samples from 29 drinking water treatment plants (DWTPs) were analyzed as part of a two-phase study to determine whether chemical and microbial constituents, many of which are considered contaminants of emerging concern, were detectable in the waters. Of the 84 chemicals monitored in the 9 Phase I DWTPs, 27 were detected at least once in the source water, and 21 were detected at least once in treated drinking water. In Phase II, which was a broader and more comprehensive assessment, 247 chemical and microbial analytes were measured in 25 DWTPs, with 148 detected at least once in the source water, and 121 detected at least once in the treated drinking water. The frequency of detection was often related to the analyte’s contaminant class, as pharmaceuticals and anthropogenic waste indicators tended to be infrequently detected and more easily removed during treatment, while per- and polyfluoroalkyl substances and inorganic constituents were both more frequently detected and, overall, more resistant to treatment. The data collected as part of this project will be used to help inform evaluation of unregulated contaminants in surface water, groundwater, and drinking water.

Characterizing pharmaceutical, personal care product, and hormone contamination in a karst aquifer of southwestern Illinois, USA, using water quality and stream flow parameters

Dodgen, L.K., et al., Science of the Total Environment, 578:281-289, February 2017

Karst aquifers are drinking water sources for 25% of the global population. However, the unique geology of karst areas facilitates rapid transfer of surficial chemicals to groundwater, potentially contaminating drinking water. Contamination of karst aquifers by nitrate, chloride, and bacteria has been previously observed, but little knowledge is available on the presence of contaminants of emerging concern (CECs), such as pharmaceuticals. Over a 17-month period, 58 water samples were collected from 13 sites in the Salem Plateau, a karst region in southwestern Illinois, United States. Water was analyzed for 12 pharmaceutical and personal care products (PPCPs), 7 natural and synthetic hormones, and 49 typical water quality parameters (e.g., nutrients and bacteria). Hormones were detected in only 23% of samples, with concentrations of 2.2–9.1 ng/L. In contrast, PPCPs were quantified in 89% of groundwater samples. The two most commonly detected PPCPs were the antimicrobial triclocarban, in 81% of samples, and the cardiovascular drug gemfibrozil, in 57%. Analytical results were combined with data on local stream flow, weather, and land use to 1) characterize the extent of aquifer contamination by CECs, 2) cluster sites with similar PPCP contamination profiles, and 3) develop models to describe PPCP contamination. Median detection in karst groundwater was 3 PPCPs at a summed concentration of 4.6 ng/L. Sites clustered into 3 subsets with unique contamination models. PPCP contamination in Cluster I sites was related to stream height, manganese, boron, and heterotrophic bacteria. Cluster II sites were characterized by groundwater temperature, specific conductivity, sodium, and calcium. Cluster III sites were characterized by dissolved oxygen and barium. Across all sites, no single or small set of water quality factors was significantly predictive of PPCP contamination, although gemfibrozil concentrations were strongly related to the sum of PPCPs in karst groundwater.
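
The site-clustering step described above lends itself to a short illustration. The sketch below is not the authors' pipeline; the k-means algorithm, the feature choices, and the data are assumptions for demonstration only.

```python
# Hypothetical sketch: clustering monitoring sites by water-quality
# profiles, loosely mirroring the site-clustering step described above.
# Features and data are invented for illustration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# 13 sites x 4 illustrative predictors
# (e.g., stream height, manganese, boron, heterotrophic bacteria)
X = rng.normal(size=(13, 4))

# Standardize so no single parameter dominates the distance metric
X_std = StandardScaler().fit_transform(X)

# Three clusters, matching the three site subsets reported in the study
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_std)
print(labels)
```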

A decision analysis framework for estimating the potential hazards for drinking water resources of chemicals used in hydraulic fracturing fluids

Yost, E.E., Science of the Total Environment, 574:1544-1558, January 2017

Despite growing concerns over the potential for hydraulic fracturing to impact drinking water resources, there are limited data available to identify chemicals used in hydraulic fracturing fluids that may pose public health concerns. In an effort to explore these potential hazards, a multi-criteria decision analysis (MCDA) framework was employed to analyze and rank selected subsets of these chemicals by integrating data on toxicity, frequency of use, and physicochemical properties that describe transport in water. Data used in this analysis were obtained from publicly available databases compiled by the United States Environmental Protection Agency (EPA) as part of a larger study on the potential impacts of hydraulic fracturing on drinking water. Starting with nationwide hydraulic fracturing chemical usage data from EPA’s analysis of the FracFocus Chemical Disclosure Registry 1.0, MCDAs were performed on chemicals that had either noncancer toxicity values (n = 37) or cancer-specific toxicity values (n = 10). The noncancer MCDA was then repeated for subsets of chemicals reported in three representative states (Texas, n = 31; Pennsylvania, n = 18; and North Dakota, n = 20). Within each MCDA, chemicals received scores based on relative toxicity, relative frequency of use, and physicochemical properties (mobility in water, volatility, persistence). Results show a relative ranking of these chemicals based on hazard potential, and provide preliminary insight into chemicals that may be more likely than others to impact drinking water resources. Comparison of nationwide versus state-specific analyses indicates regional differences in the chemicals that may be of more concern to drinking water resources, although many chemicals were commonly used and received similar overall hazard rankings. Several chemicals highlighted by these MCDAs have been reported in groundwater near areas of hydraulic fracturing activity. This approach is intended as a preliminary analysis, and represents one possible method for integrating data to explore potential public health impacts.
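
The scoring-and-ranking logic of an MCDA can be sketched in a few lines. The weights, criteria, and values below are invented; this is a generic weighted-sum sketch, not a reproduction of the EPA study's actual scoring scheme.

```python
# Illustrative weighted-sum MCDA ranking (not EPA's exact scheme):
# chemicals are scored on normalized criteria, then ranked by weighted sum.
import numpy as np

chemicals = ["chem_A", "chem_B", "chem_C"]  # hypothetical names
# Columns: toxicity score, frequency of use, mobility in water (raw scales)
raw = np.array([
    [8.0, 120.0, 0.7],
    [3.0, 400.0, 0.2],
    [6.0,  90.0, 0.9],
])

# Min-max normalize each criterion to [0, 1] so the scales are comparable
norm = (raw - raw.min(axis=0)) / (raw.max(axis=0) - raw.min(axis=0))

weights = np.array([0.5, 0.3, 0.2])  # assumed criterion weights
scores = norm @ weights

for name, s in sorted(zip(chemicals, scores), key=lambda t: -t[1]):
    print(f"{name}: hazard-potential score {s:.2f}")
```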

Do estrogenic compounds in drinking water migrating from plastic pipe distribution system pose adverse effects to human? An analysis of scientific literature

Liu, Z., et al., Environmental Science and Pollution Research, 24(2):2126-2134, January 2017

With the widespread application of plastic pipes in drinking water distribution systems, the effects of various leachable organic chemicals have been investigated and their occurrence in drinking water supplies is monitored. Most studies focus on the odor problems these substances may cause. This study investigates the potential endocrine disrupting effects of the migrating compound 2,4-di-tert-butylphenol (2,4-d-t-BP). The summarized results show that the migration of 2,4-d-t-BP from plastic pipes could result in chronic exposure, and that migration levels varied greatly among different plastic pipe materials and manufacturing brands. Based on estrogen equivalent (EEQ), the migrating levels of the leachable compound 2,4-d-t-BP in most plastic pipes were relatively low. However, the EEQ levels in drinking water migrating from four out of 15 pipes may pose significant adverse effects. With the increasingly strict requirements on regulation of drinking water quality, these results indicate that some drinking water transported in plastic pipes may not be safe for human consumption due to the occurrence of 2,4-d-t-BP. Moreover, 2,4-d-t-BP is not the only estrogenic compound that can migrate from plastic pipes; other compounds, such as 2-tert-butylphenol (2-t-BP) and 4-tert-butylphenol (4-t-BP), may also be leachable.

A national reconnaissance of trace organic compounds (TOCs) in United States lotic ecosystems

Bernot, M.J., et al., Science of the Total Environment, 572:422-433, December 2016

We collaborated with 26 groups from universities across the United States to sample 42 sites for 33 trace organic compounds (TOCs) in water and sediments of lotic ecosystems. Our goals were 1) to further develop a national database of TOC abundance in United States lotic ecosystems that can be a foundation for future research and management, and 2) to identify factors related to compound abundance. Trace organic compounds were found in 93% of water samples and 56% of sediment samples. Dissolved concentrations were 10–1000× higher than sediment concentrations. The ten most common compounds in water samples, with detection frequency and maximum concentration, were sucralose (87.5%, 12,000 ng/L), caffeine (77.5%, 420 ng/L), sulfamethoxazole (70%, 340 ng/L), cotinine (65%, 130 ng/L), venlafaxine (65%, 1800 ng/L), carbamazepine (62.5%, 320 ng/L), triclosan (55%, 6800 ng/L), azithromycin (15%, 970 ng/L), diphenhydramine (40%, 350 ng/L), and desvenlafaxine (35%, 4600 ng/L). In sediment, the most common compounds were venlafaxine (32.5%, 19 ng/g), diphenhydramine (25%, 41 ng/g), azithromycin (15%, 11 ng/g), fluoxetine (12.5%, 29 ng/g) and sucralose (12.5%, 16 ng/g). Refractory compounds such as sucralose may be good indicators of TOC contamination in lotic ecosystems, as there was a correlation between dissolved sucralose concentrations and the total number of compounds detected in water. Discharge and human demographic (population size) characteristics were not good predictors of compound abundance in water samples. This study further confirms the ubiquity of TOCs in lotic ecosystems. Although concentrations measured rarely approached acute aquatic-life criteria, the chronic effects, bioaccumulative potential, and potential mixture effects of multiple compounds are relatively unknown.

Atrazine in Kentucky drinking water: intermethod comparison of U.S. environmental protection agency analytical methods 507 and 508.1

Suhl, J., et al., Journal of Environmental Health, 79(5):E1-E6, December 2016

This study examines the analytical methods used to test drinking water for atrazine along with the seasonal variation of atrazine in drinking water. Samples from 117 counties throughout Kentucky from January 2000 to December 2008 were analyzed. Methods 507 and 508.1 were compared using the Mann-Whitney U test. Median values of these methods were similar (p = .7421). To examine seasonal variation, data from each year and from the entire period were analyzed using one-way ANOVA; pairwise multiple comparisons were made for years with significant differences. All the years except 2001, 2005, 2006, and 2007 had significantly different atrazine concentrations between seasons. The Seasonal Kendall Test for Trend was used to identify trends in atrazine over time. Yearly means ranged from 0.000043 mg/L (± 0.000011 mg/L) to 0.000995 mg/L (± 0.000510 mg/L). The highest levels were observed during spring in most years. A significant (p = .000092) decreasing trend of -7.6 × 10⁻⁶ mg/L/year was found. Decreasing trends were also present in all five regions of the state during this period. This study illustrates the need for changes in the sampling methodology used today, so that effective exposure assessments can be conducted to study the public’s exposure to atrazine in drinking water.
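
The intermethod comparison above rests on a Mann-Whitney U test, which can be reproduced in outline with SciPy. The concentration values below are invented for demonstration.

```python
# Illustrative Mann-Whitney U comparison of two analytical methods,
# as in the atrazine study above (data invented).
from scipy.stats import mannwhitneyu

method_507   = [0.00021, 0.00035, 0.00012, 0.00044, 0.00019]  # mg/L
method_508_1 = [0.00024, 0.00031, 0.00015, 0.00040, 0.00022]  # mg/L

u_stat, p_value = mannwhitneyu(method_507, method_508_1)
print(f"U = {u_stat}, p = {p_value:.3f}")  # large p -> similar medians
```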

Widespread copper and lead contamination of household drinking water, New South Wales, Australia

Harvey, P.J., et al., Environmental Research, 151:275-285, November 2016

This study examines arsenic, copper, lead and manganese drinking water contamination at the domestic consumer’s kitchen tap in homes of New South Wales, Australia. Analysis of 212 first draw drinking water samples shows that almost 100% and 56% of samples contain detectable concentrations of copper and lead, respectively. Of these detectable concentrations, copper exceeds the Australian Drinking Water Guidelines (ADWG) in 5% of samples and lead in 8%. By contrast, no samples contained arsenic or manganese concentrations in excess of the ADWG. Analysis of household plumbing fittings (taps and connecting pipework) shows that these are a significant source of drinking water lead contamination. Water lead concentrations derived for plumbing components range from 108 µg/L to 1440 µg/L (n=28, mean 328 µg/L, median 225 µg/L). Analysis of kitchen tap fittings demonstrates these are a primary source of drinking water lead contamination (n=9, mean 63.4 µg/L, median 59.0 µg/L). The results of this study demonstrate that, along with other potential sources of contamination in households, plumbing products that contain detectable lead up to 2.84% are contributing to contamination of household drinking water. Given that both copper and lead are known to cause significant health detriments, products for use in contact with drinking water should be manufactured free from copper and lead.

Pb-Sr isotopic and geochemical constraints on sources and processes of lead contamination in well waters and soil from former fruit orchards, Pennsylvania, USA: A legacy of anthropogenic activities

Ayuso, R.A., and Foley, N.K., Journal of Geochemical Exploration, 170:125-147, November 2016

Isotopic discrimination can be an effective tool in establishing a direct link between sources of Pb contamination and the presence of anomalously high concentrations of Pb in waters, soils, and organisms. Residential wells supplying water containing up to 1600 ppb Pb to houses built on the former Mohr orchards commercial site, near Allentown, PA, were evaluated to discern anthropogenic from geogenic sources. Pb (n = 144) and Sr (n = 40) isotopic data and REE (n = 29) data were determined for waters from residential wells, test wells (drilled for this study), and surface waters from ponds and creeks. Local soils, sediments, bedrock, Zn-Pb mineralization and coal were also analyzed (n = 94), together with locally used Pb-As pesticide (n = 5). Waters from residential and test wells show overlapping values of 206Pb/207Pb, 208Pb/207Pb and 87Sr/86Sr. Larger negative Ce anomalies (Ce/Ce*) distinguish residential wells from test wells. Results show that residential and test well waters, sediments from residential water filters in water tanks, and surface waters display broad linear trends in Pb isotope plots. Pb isotope data for soils, bedrock, and pesticides have contrasting ranges and overlapping trends. Contributions of Pb from soils to residential well waters are limited and implicated primarily in wells having shallow water-bearing zones and carrying high sediment contents. Pb isotope data for residential wells, test wells, and surface waters show substantial overlap with Pb data reflecting anthropogenic actions (e.g., burning fossil fuels, industrial and urban processing activities). Limited contributions of Pb from bedrock, soils, and pesticides are evident. High Pb concentrations in the residential waters are likely related to sediment buildup in residential water tanks. Redox reactions, triggered by influx of groundwater via wells into the residential water systems and leading to subtle changes in pH, are implicated in precipitation of Fe oxyhydroxides, oxidative scavenging of Ce(IV), and desorption and release of Pb into the residential water systems. The Pb isotope features in the residences and the region are best interpreted as reflecting a legacy of industrial Pb present in underlying aquifers that currently supply the drinking water wells.

Malodorous volatile organic sulfur compounds: Sources, sinks and significance in inland waters

Watson, S.B., and Jüttner, F., Critical Reviews in Microbiology, 43(2):210-237, November 2016

Volatile Organic Sulfur Compounds (VOSCs) are instrumental in global S-cycling and greenhouse gas production. VOSCs occur across a diversity of inland waters, and with widespread eutrophication and climate change, are increasingly linked with malodours in organic-rich waterbodies and drinking-water supplies. Compared with marine systems, the role of VOSCs in biogeochemical processes is far less well characterized for inland waters, and often involves different physicochemical and biological processes. This review provides an updated synthesis of VOSCs in inland waters, focusing on compounds known to cause malodours. We examine the major limnological and biochemical processes involved in the formation and degradation of alkylthiols, dialkylsulfides, dialkylpolysulfides, and other organosulfur compounds under different oxygen, salinity and mixing regimes, and key phototrophic and heterotrophic microbial producers and degraders (bacteria, cyanobacteria, and algae) in these environs. The data show VOSC levels that vary significantly, sometimes far exceeding human odor thresholds, generated by a diversity of biota, biochemical pathways, enzymes and precursors. We also draw attention to major issues in sampling and analytical artifacts which bias and preclude comparisons among studies, and highlight significant knowledge gaps that need addressing with careful, appropriate methods to provide a more robust understanding of the potential effects of continued global development.

To Buy or not to Buy? Perceptions of Bottled Drinking Water in Australia and New Zealand

Ragusa, A.T., and Crampton, A., Human Ecology, 44(5):565-576, October 2016

In the midst of popular and scientific debates about its desirability, safety and environmental sustainability, bottled water is forecast to become the most consumed packaged beverage globally (Feliciano 2014) and the fastest-growing sector in Australia (Johnson 2007). Manufacturers attribute increasing sales to convenience and health benefits rather than intensive advertising/marketing campaigns. Our sociological investigation of drinking water perceptions generally, and bottled water specifically, using data from 192 face-to-face interviews with Australians and New Zealanders, revealed 77% thought about the quality of their drinking water; 64% noted specific adverse issues, and 82% reported concerns with their tap water. However, although 64% drink bottled water, just 28% believe it is better than tap water and 63% consider it a waste of money. Only 21% drink it for ‘convenience’ and consumption patterns vary significantly by gender, with men and younger generations purchasing the most bottled water. Qualitative analysis refutes stereotypes associating bottled water with a status symbol or lifestyle choice; participants largely mistrust water companies; just 13% describe bottled water as a ‘trusted’ product, even when consumed for its taste or convenience, and 13% label it a ‘bad’ plastic product detrimental to the environment or public health, thus lending support for institutional and policy trends banning bottled water.

Occurrence of DBPs in Drinking Water of European Regions for Epidemiology Studies

Krasner, S.J., et al., American Water Works Association Journal, 108(10):501-512, October 2016

A three-year study was conducted on the occurrence of disinfection by-products (DBPs) – trihalomethanes (THMs), haloacetic acids (HAAs), and haloacetonitriles – in drinking water of regions of Europe where epidemiology studies were being carried out. Thirteen systems in six countries (i.e., Italy, France, Greece, Lithuania, Spain, United Kingdom) were sampled. Typically, chlorinated DBPs dominated. However, in most of Catalonia (Spain) and in Heraklion (Greece), brominated DBPs dominated. The degree of bromine incorporation was in general similar among the DBP classes. This is important, as brominated DBPs are a greater health concern. In parts of Catalonia, the reported levels of tribromoacetic acid were higher than in other parts of the world. In some regions, HAA levels tended to peak in a different time period than THM levels. In most epidemiology studies, THMs are used as a surrogate for other halogenated DBPs. This study provides exposure assessment information for epidemiology studies.

Origin of Hexavalent Chromium in Drinking Water Wells from the Piedmont Aquifers of North Carolina

Vengosh, A., et al., Environmental Science & Technology Letters, 3(12):409-414, October 2016

Hexavalent chromium [Cr(VI)] is a known pulmonary carcinogen. Recent detection of Cr(VI) in drinking water wells in North Carolina has raised public concern about contamination of drinking water wells by nearby coal ash ponds. Here we report, for the first time, the prevalence of Cr and Cr(VI) in drinking water wells from the Piedmont region of central North Carolina, combined with a geochemical analysis to determine the source of the elevated Cr(VI) levels. We show that Cr(VI) is the predominant species of dissolved Cr in groundwater and that elevated levels of Cr and Cr(VI) are found in wells located both near and far (>30 km) from coal ash ponds. The geochemical characteristics, including the overall chemistry, boron to chromium ratios, and strontium isotope (87Sr/86Sr) variations in groundwater with elevated Cr(VI) levels, are different from those of coal ash leachates. Alternatively, the groundwater chemistry and Sr isotope variations are consistent with water–rock interactions as the major source for Cr(VI) in groundwater. Our results indicate that Cr(VI) is most likely naturally occurring and ubiquitous in groundwater from the Piedmont region in the eastern United States, which could pose health risks to residents in the region who consume well water as a major drinking water source.

The precautionary principle and chemicals management: The example of perfluoroalkyl acids in groundwater

Cousins, I.T., et al., Environment International, 94:331-340, September 2016

Already in the late 1990s, microgram-per-liter levels of perfluorooctane sulfonate (PFOS) were measured in water samples from areas where fire-fighting foams were used or spilled. Despite these early warnings, the problems of groundwater, and thus drinking water, contaminated with perfluoroalkyl and polyfluoroalkyl substances (PFASs) including PFOS are only beginning to be addressed. It is clear that this PFAS contamination is poorly reversible and that the societal costs of clean-up will be high. This inability to reverse exposure in a reasonable timeframe is a major motivation for application of the precautionary principle in chemicals management. We conclude that exposure can be poorly reversible: 1) due to slow elimination kinetics in organisms, or 2) due to poorly reversible environmental contamination that leads to continuous exposure. In the second case, which is relevant for contaminated groundwater, the reversibility of exposure is not related to the magnitude of a chemical’s bioaccumulation potential. We argue therefore that all PFASs entering groundwater, irrespective of their perfluoroalkyl chain length and bioaccumulation potential, will result in poorly reversible exposures and risks as well as further clean-up costs for society. To protect groundwater resources for future generations, society should consider a precautionary approach to chemicals management and prevent the use and release of highly persistent and mobile chemicals such as PFASs.

Drinking water lead regulations: impact on the brass value chain

Estelle, A.A., Materials Science and Technology, 32(17):1763-1770, August 2016

A detailed review of regulations restricting the use of lead in potable water systems is provided in several regions including the United States (U.S.), Canada, the European Union (E.U.) and Japan to assess the impact on the brass value chain. Covered topics include: chronology of regulations, governing bodies, compliance requirements, enforcement mechanisms and other aspects relevant to metal suppliers, original equipment manufacturers, designers, specifiers, end-users and recyclers of brass. The development and use of lead-free brass alloys and how these materials have impacted manufacturing and recycling processes is also addressed.

Temporal variation in groundwater quality in the Permian Basin of Texas, a region of increasing unconventional oil and gas development

Hildenbrand, Z.L., et al., Science of the Total Environment, 562:906-913, August 2016

The recent expansion of natural gas and oil extraction using unconventional oil and gas development (UD) practices such as horizontal drilling and hydraulic fracturing has raised questions about the potential for environmental impacts. Prior research has focused on evaluations of air and water quality in particular regions without explicitly considering temporal variation; thus, little is known about the potential effects of UD activity on the environment over longer periods of time. Here, we present an assessment of private well water quality in an area of increasing UD activity over a period of 13 months. We analyzed samples from 42 private water wells located in three contiguous counties on the Eastern Shelf of the Permian Basin in Texas. This area has experienced a rise in UD activity in the last few years, and we analyzed samples at four separate time points to assess variation in groundwater quality over time as UD activities increased. We monitored general water quality parameters as well as several compounds used in UD activities. We found that some constituents remained stable over time, but others experienced significant variation over the period of study. Notable findings include significant changes in total organic carbon and pH along with ephemeral detections of ethanol, bromide, and dichloromethane after the initial sampling phase. These data provide insight into the potentially transient nature of compounds associated with groundwater contamination in areas experiencing UD activity.

Detection of Poly- and Perfluoroalkyl Substances (PFASs) in U.S. Drinking Water Linked to Industrial Sites, Military Fire Training Areas, and Wastewater Treatment Plants

Andrews, D.Q., et al., Environmental Science & Technology Letters, August 2016

Drinking water contamination with poly- and perfluoroalkyl substances (PFASs) poses risks to the developmental, immune, metabolic, and endocrine health of consumers. We present a spatial analysis of 2013–2015 national drinking water PFAS concentrations from the U.S. Environmental Protection Agency’s (US EPA) third Unregulated Contaminant Monitoring Rule (UCMR3) program. The number of industrial sites that manufacture or use these compounds, the number of military fire training areas, and the number of wastewater treatment plants are all significant predictors of PFAS detection frequencies and concentrations in public water supplies. Among samples with detectable PFAS levels, each additional military site within a watershed’s eight-digit hydrologic unit is associated with a 20% increase in PFHxS, a 10% increase in both PFHpA and PFOA, and a 35% increase in PFOS. The number of civilian airports with personnel trained in the use of aqueous film-forming foams is significantly associated with the detection of PFASs above the minimal reporting level. We find drinking water supplies for 6 million U.S. residents exceed US EPA’s lifetime health advisory (70 ng/L) for PFOS and PFOA. Lower analytical reporting limits and additional sampling of smaller utilities serving <10,000 individuals and private wells would greatly assist in further identifying PFAS contamination sources.
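
A quick way to read the per-site percentages reported above: if each additional site multiplies the expected concentration by a fixed factor, the effect compounds. The baseline value below is assumed, and this arithmetic is illustrative rather than the study's actual regression model.

```python
# Back-of-envelope reading of "each additional military site is associated
# with a 20% increase in PFHxS": k sites multiply the expected level by
# 1.20**k. The baseline concentration is assumed for illustration.
baseline_ng_per_l = 10.0
for k in range(4):
    print(f"{k} additional site(s): ~{baseline_ng_per_l * 1.20 ** k:.1f} ng/L")
```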

Cyto- and genotoxic profile of groundwater used as drinking water supply before and after disinfection

Pellacani, C., et al., Journal of Water and Health, 14(6):901-913, July 2016

The assessment of the toxicological properties of raw groundwater may be useful to predict the type and quality of tap water. Contaminants in groundwater are known to be able to affect the disinfection process, resulting in the formation of substances that are cytotoxic and/or genotoxic. Though the European directive (98/83/EC), which establishes maximum levels for contaminants in raw water (RW), provides threshold levels for acute exposure to toxic compounds, the law does not take into account chronic exposure to low doses of pollutants present in complex mixtures. The purpose of this study was to evaluate the cyto- and genotoxic load in groundwater of two water treatment plants in Northern Italy. Water samples induced cytotoxic effects, mainly observed when human cells were treated with RW. Moreover, results indicated that the disinfection process reduced cell toxicity, independent of the biocide used. The induction of genotoxic effects was found, in particular, when the micronucleus assay was carried out on raw groundwater. These results suggest that it is important to include bio-toxicological assays as additional parameters in water quality monitoring programs, as their use would allow the evaluation of the potential risk of groundwater for humans.

Emerging contaminant uncertainties and policy: The chicken or the egg conundrum

Naidu, R., et al., Chemosphere, 154:385-390, July 2016

Best practice in regulating contaminants of emerging concern (CEC) must involve the integration of science and policy, be defensible and accepted by diverse stakeholders. Key elements of CEC frameworks include identification and prioritisation of emerging contaminants, evaluation of health and environmental impacts from key matrices such as soil, groundwater, surface waters and sediment, assessments of available data, methods and technologies (and limitations), and mechanisms to take cognisance of diverse interests. This paper discusses one of the few frameworks designed for emerging contaminants, the Minnesota Department of Health (MDH) Drinking Water Contaminants of Emerging Concern (CEC) program. Further review of mechanisms for CECs in other jurisdictions reveals that there is only a small number of regulatory and guidance regimes globally. There is also merit in a formal mechanism for the global exchange of knowledge and outcomes associated with CECs of global interest.

Emerging contaminants in the environment: Risk-based analysis for better management.

Naidu, R., et al., Chemosphere, 154:350-357, July 2016

Emerging contaminants (ECs) are chemicals of synthetic origin, or deriving from a natural source, that have recently been discovered and for which environmental or public health risks are yet to be established. This is due to limited available information on their interactions and toxicological impacts on receptors. Several types of ECs exist, such as antibiotics, pesticides, pharmaceuticals, personal care products, effluents, certain naturally occurring contaminants and, more recently, nanomaterials. ECs may derive from a known source, for example released directly to the aquatic environment from direct discharges such as those from wastewater treatment plants. Although in most instances the direct source cannot be identified, ECs have been detected in virtually every country’s natural environment and as a consequence they represent a global problem. There is very limited information on the fate and transport of ECs in the environment and their toxicological impact. This lack of information can be attributed to limited financial resources and the lack of analytical techniques for detecting their effects on ecosystems and human health, whether on their own or as mixtures. We do not know how ECs interact with each other or with other contaminants. This paper presents an overview of existing knowledge on ECs, their fate and transport, and a risk-based analysis for EC management and complementary strategies.

Potential corrosivity of untreated groundwater in the United States

Belitz, K., et al., U.S. Geological Survey Scientific Investigations Report 2016-5092, July 2016

Corrosive groundwater, if untreated, can dissolve lead and other metals from pipes and other components in water distribution systems. Two indicators of potential corrosivity—the Langelier Saturation Index (LSI) and the Potential to Promote Galvanic Corrosion (PPGC)—were used to identify which areas in the United States might be more susceptible to elevated concentrations of metals in household drinking water and which areas might be less susceptible. On the basis of the LSI, about one-third of the samples collected from about 21,000 groundwater sites are classified as potentially corrosive. On the basis of the PPGC, about two-thirds of the samples collected from about 27,000 groundwater sites are classified as moderate PPGC, and about one-tenth as high PPGC. Potentially corrosive groundwater occurs in all 50 states and the District of Columbia, and national maps have been prepared to identify its occurrence. Eleven states and the District of Columbia were classified as having a very high prevalence of potentially corrosive groundwater, 14 states as having a high prevalence, 19 states as having a moderate prevalence, and 6 states as having a low prevalence. These findings have the greatest implication for people dependent on untreated groundwater for drinking water, such as the 44 million people who are self-supplied and depend on domestic wells or springs for their water supply.
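
The LSI mentioned above compares a water's actual pH with the pH at which it would be saturated with calcium carbonate. A minimal sketch using the common textbook approximation follows; the USGS report's exact computation may differ, and the sample values are assumed.

```python
# Minimal LSI sketch using the common Langelier approximation.
# LSI > 0 suggests scale-forming water; LSI < 0 suggests corrosive water.
from math import log10

def langelier_index(ph, temp_c, tds_mg_l, ca_hardness_mg_l, alkalinity_mg_l):
    """Hardness and alkalinity are expressed as mg/L CaCO3."""
    a = (log10(tds_mg_l) - 1) / 10
    b = -13.12 * log10(temp_c + 273) + 34.55
    c = log10(ca_hardness_mg_l) - 0.4
    d = log10(alkalinity_mg_l)
    ph_s = (9.3 + a + b) - (c + d)  # saturation pH for calcium carbonate
    return ph - ph_s

# Assumed values for an untreated groundwater sample
print(round(langelier_index(7.1, 15, 300, 120, 100), 2))  # negative -> corrosive
```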

Inadequate Hydration, BMI, and Obesity Among US Adults: NHANES 2009–2012

Chang, T., MD, MPH, MS, et al., Annals of Family Medicine, 14(4):320-324, July 2016

Improving hydration is a strategy commonly used by clinicians to prevent overeating with the goal of promoting a healthy weight among patients. The relationship between weight status and hydration, however, is unclear. Our objective was to assess the relationship between inadequate hydration and BMI, and between inadequate hydration and obesity, among adults in the United States. Our study used a nationally representative sample from the National Health and Nutrition Examination Survey (NHANES) 2009 to 2012, and included adults aged 18 to 64 years. The primary outcome of interest was body mass index (BMI), measured in continuous values and also categorized as obese (BMI ≥30) or not (BMI <30). Individuals with urine osmolality values of 800 mOsm/kg or greater were considered to be inadequately hydrated. Linear and logistic regressions were performed with continuous BMI and obesity status as the outcomes, respectively. Models were adjusted for known confounders including age, race/ethnicity, sex, and income-to-poverty ratio. In this nationally representative sample (n=9,528; weighted n=193.7 million), 50.8% were women, 64.5% were non-Hispanic white, and the mean age was 41 years. Mean urine osmolality was 631.4 mOsm/kg (SD=236.2 mOsm/kg); 32.6% of the sample was inadequately hydrated. In adjusted models, adults who were inadequately hydrated had higher BMIs (1.32 kg/m²; 95% CI, 0.85-1.79; P <.001) and higher odds of being obese (OR=1.59; 95% CI, 1.35-1.88; P <.001) compared with hydrated adults. We found a significant association between inadequate hydration and elevated BMI, and between inadequate hydration and obesity, even after controlling for confounders. This relationship has not previously been shown on a population level and suggests that water, an essential nutrient, may deserve greater focus in weight management research and clinical strategies.
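
The adjusted model described above can be sketched with statsmodels. Everything below is synthetic: the data are simulated, the column names are hypothetical, and the NHANES survey weights that the real analysis requires are omitted.

```python
# Hedged sketch of an adjusted logistic regression of obesity on
# inadequate hydration (urine osmolality >= 800 mOsm/kg). Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "urine_osm": rng.normal(630, 230, n),   # mOsm/kg
    "age":       rng.integers(18, 65, n),
    "female":    rng.integers(0, 2, n),
    "inc_pov":   rng.uniform(0.5, 5.0, n),  # income-to-poverty ratio
})
df["inadequate"] = (df["urine_osm"] >= 800).astype(int)

# Simulate obesity with higher odds among the inadequately hydrated
logit_p = -1.0 + 0.5 * df["inadequate"]
df["obese"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("obese ~ inadequate + age + female + inc_pov", df).fit(disp=0)
print(np.exp(fit.params["inadequate"]))  # adjusted odds ratio for obesity
```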

Human Health Risk Assessment of Chromium in Drinking Water: A Case Study of Sukinda Chromite Mine, Odisha, India

Naz, A., et al., Exposure and Health, 8(2):253-264, June 2016

The present study aims to evaluate the human health risk of Cr(VI) and Cr(III) via oral and dermal exposure to drinking water, using groundwater samples from near the Sukinda chromite mine. The risk assessment for each location was carried out using mathematical models as per IRIS guidelines, and the input parameters were taken according to the Indian context. The concentrations of TCr and Cr(VI) were found in the range of 48.7–250.2 and 21.4–115.2 μg/L, respectively. At most locations, TCr and Cr(VI) concentrations were found to be 2.3–6 times and 2.1–11.5 times higher, respectively, than the permissible limit set by standard statutory bodies. The total cumulative average cancer risk and non-cancer risk (Hazard Quotient) were found to be 2.04E−03 and 1.37 in the male population and 1.73E−03 and 1.16 in the female population, respectively, which indicated ‘very high’ cancer risk and ‘medium’ non-cancer risk as per the USEPA guideline. The male population showed 1.2 times higher cancer and non-cancer risk than the female population, because of the higher water ingestion rate in males. The health risk via the dermal route was found to be six times lower than via oral ingestion, due to the very short dermal exposure time (0.58 h/day). Notably, ‘high’ cancer risk was also recorded at one of the locations where the TCr concentration was within the permissible limit, because of the higher proportion of bioavailable Cr(VI). Sensitivity analysis of input parameters towards cancer and non-cancer risk revealed that Cr(VI) and Cr(III) concentrations were the predominant parameters, followed by exposure duration, body weight, average time, and dermal slope factor.

Vulnerability of drinking water supplies to engineered nanoparticles

Troester, M., et al., Water Research, 96:255-279, June 2016

The production and use of engineered nanoparticles (ENPs) inevitably leads to their release into aquatic environments, with the quantities involved expected to increase significantly in the future. Concerns therefore arise over the possibility that ENPs might pose a threat to drinking water supplies. Investigations into the vulnerability of drinking water supplies to ENPs are hampered by the absence of suitable analytical methods that are capable of detecting and quantifying ENPs in complex aqueous matrices. Analytical data concerning the presence of ENPs in drinking water supplies are therefore scarce. The eventual fate of ENPs in the natural environment and in processes that are important for drinking water production is currently being investigated through laboratory-based experiments and modelling. Although the information obtained from these studies may not, as yet, be sufficient to allow comprehensive assessment of the complete life-cycle of ENPs, it does provide a valuable starting point for predicting the significance of ENPs to drinking water supplies. This review therefore addresses the vulnerability of drinking water supplies to ENPs. The risk of ENPs entering drinking water is discussed and predicted for drinking water produced from groundwater and from surface water. Our evaluation is based on reviewing published data concerning ENP production amounts and release patterns, the occurrence and behavior of ENPs in aquatic systems relevant for drinking water supply, and ENP removability in drinking water purification processes. Quantitative predictions are made based on realistic high-input case scenarios. The results of our synthesis of current knowledge suggest that the risk probability of ENPs being present in surface water resources is generally limited, but that particular local conditions may increase the probability of raw water contamination by ENPs. Drinking water extracted from porous media aquifers is not generally considered to be prone to ENP contamination. In karstic aquifers, however, there is an increased probability that if any ENPs enter the groundwater system they will reach the extraction point of a drinking water treatment plant (DWTP). The ability to remove ENPs during water treatment depends on the specific design of the treatment process. In conventional DWTPs with no flocculation step, a proportion of ENPs, if present in the raw water, may reach the final drinking water. The use of ultrafiltration techniques improves drinking water safety with respect to ENP contamination.

Prevention of Childhood Lead Toxicity

Council on Environmental Health, Pediatrics, June 2016

Blood lead concentrations have decreased dramatically in US children over the past 4 decades, but too many children still live in housing with deteriorated lead-based paint and are at risk for lead exposure with resulting lead-associated cognitive impairment and behavioral problems. Evidence continues to accrue that commonly encountered blood lead concentrations, even those below 5 µg/dL (50 ppb), impair cognition; there is no identified threshold or safe level of lead in blood. From 2007 to 2010, approximately 2.6% of preschool children in the United States had a blood lead concentration ≥5 µg/dL (≥50 ppb), which represents about 535,000 US children 1 to 5 years of age. Evidence-based guidance is available for managing increased lead exposure in children, and reducing sources of lead in the environment, including lead in housing, soil, water, and consumer products, has been shown to be cost-beneficial. Primary prevention should be the focus of policy on childhood lead toxicity.

The Flint Water Crisis Confirms That U.S. Drinking Water Needs Improved Risk Management

Baum, R., et al., Environmental Science & Technology, 50(11):5436-5437, May 2016

This article focuses on existing public health concerns that the current regulatory system has repeatedly failed to address to protect US residents. Water system failures have been extensively analyzed, leading to a conclusion that most could have been prevented with better risk management. Recent research shows that the most commonly reported reasons are time and money constraints, but these may also reflect a lack of policy priority shared by the utility and regulator. U.S. public drinking water systems are focused on meeting nationally defined regulations that target certain maximum contaminant levels (MCLs) and specific treatment techniques.

Hydration and Chronic Kidney Disease Progression: A Critical Review of the Evidence

Clark, W.F., et al., American Journal of Nephrology, 43(4):281-292, May 2016

We performed a comprehensive literature review to examine evidence on the effects of hydration on the kidney. By reducing vasopressin secretion, increasing water intake may have a beneficial effect on renal function in patients with all forms of chronic kidney disease (CKD) and in those at risk of CKD. This potential benefit may be greater when the kidney is still able to concentrate urine (high fluid intake is contraindicated in dialysis-dependent patients). Increasing water intake is a well-accepted method for preventing renal calculi, and current evidence suggests that recurrent dehydration and heat stress from extreme occupational conditions are the most probable cause of an ongoing CKD epidemic in Mesoamerica. In polycystic kidney disease (PKD), increased water intake has been shown to slow renal cyst growth in animals via direct vasopressin suppression, and pharmacologic blockade of renal vasopressin-V2 receptors has been shown to slow cyst growth in patients. However, larger clinical trials are needed to determine if supplemental water can safely slow the loss of kidney function in PKD patients.

Assessing clarity of message communication for mandated USEPA drinking water quality reports

Davy, B.M., et al., Journal of Water and Health, 14(2):223-235, April 2016

The United States Environmental Protection Agency mandates that community water systems (CWSs), or drinking water utilities, provide annual consumer confidence reports (CCRs) reporting on water quality, compliance with regulations, source water, and consumer education. While certain report formats are prescribed, there are no criteria ensuring that consumers understand messages in these reports. To assess clarity of message, trained raters evaluated a national sample of 30 CCRs using seven indices of the Centers for Disease Control Clear Communication Index (Index): (1) Main Message/Call to Action; (2) Language; (3) Information Design; (4) State of the Science; (5) Behavioral Recommendations; (6) Numbers; and (7) Risk. Communication materials are considered qualifying if they achieve a 90% Index score. The overall mean score across CCRs was 50 ± 14%, and none scored 90% or higher. CCRs did not differ significantly by water system size. The State of the Science (3 ± 15%) and Behavioral Recommendations (77 ± 36%) indices were the lowest and highest, respectively. Only 63% of CCRs explicitly stated if the water was safe to drink according to federal and state standards and regulations. None of the CCRs had passing Index scores, signaling that CWSs are not effectively communicating with their consumers; thus, the Index can serve as an evaluation tool for CCR effectiveness and a guide to improve water quality communications.
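
The pass/fail logic of index-style scoring can be illustrated simply: a material's score is the share of applicable items it satisfies. The item names below are invented, and this is a toy version rather than the CDC Index's full instrument.

```python
# Toy index scoring: percent of applicable items passed, 90% to qualify.
def index_score(item_results):
    """item_results maps item -> True/False, or None if not applicable."""
    applicable = [v for v in item_results.values() if v is not None]
    return 100 * sum(applicable) / len(applicable)

ccr = {"main_message": True, "language": True, "info_design": False,
       "state_of_science": False, "behavioral_recs": True,
       "numbers": None, "risk": False}
score = index_score(ccr)
print(f"{score:.0f}% -> {'qualifying' if score >= 90 else 'not qualifying'}")
```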

Inactivation Kinetics and Replication Cycle Inhibition of Adenovirus by Monochloramine

Gall, A.M., et al., Environmental Science & Technology Letters, 3(4):185-189, April 2016

Monochloramine is commonly used as a secondary disinfectant to maintain a residual in drinking water distribution systems in the United States. The mechanism by which waterborne viruses become inactivated by monochloramine remains largely unknown. A more fundamental understanding of how viruses become inactivated is necessary for better detection and control of viruses in drinking water. Human adenovirus (HAdV) is known to be the waterborne virus most resistant to monochloramine disinfection, and this study presents inactivation kinetics over a range of environmental conditions. Several steps in the HAdV replication cycle were investigated to determine which steps become inhibited by monochloramine disinfection. Interestingly, monochloramine-inactivated HAdV could bind to host cells, but genome replication and early and late mRNA transcription were inhibited. We conclude that monochloramine exposure inhibited a replication cycle event after binding but prior to early viral protein synthesis.

Determination of dimethyl selenide and dimethyl sulphide compounds causing off-flavours in bottled mineral waters

Guadayol, M., et al., Water Research, 92:149-155, April 2016

Sales of bottled drinking water have grown substantially during the last two decades due to the general belief that this kind of water is healthier, its flavour is better and its consumption risk is lower than that of tap water. For these reasons, consumers are more demanding about bottled mineral water, especially when dealing with its organoleptic properties, like taste and odour. This work studies the compounds that can generate obnoxious smells, which consumers have described as swampy, rotten eggs, sulphurous, cooked vegetable or cabbage. Closed loop stripping analysis (CLSA) has been used as a pre-concentration method for the analysis of off-flavour compounds in water, followed by identification and quantification by means of GC-MS. Several bottled waters with the aforementioned smells showed the presence of volatile dimethyl selenides and dimethyl sulphides, whose concentrations ranged, respectively, from 4 to 20 ng/L and from 1 to 63 ng/L. The low odour threshold concentrations (OTCs) of both organic selenide and sulphide derivatives prove that several objectionable odours in bottled waters arise from them. Microbial loads inherent to water sources, along with some critical conditions in water processing, could contribute to the formation of these compounds. There are few studies about volatile organic compounds in bottled drinking water and, to the best of our knowledge, this is the first study reporting the presence of dimethyl selenides and dimethyl sulphides causing odour problems in bottled waters.

Multimedia exposures to arsenic and lead for children near an inactive mine tailings and smelter site

Loh, M.M., et al., Environmental Research, 146:331-339, April 2016

Children living near contaminated mining waste areas may have high exposures to metals from the environment. This study investigates whether exposure to arsenic and lead is higher in children in a community near a legacy mine and smelter site in Arizona than in children in other parts of the United States, and the relationship of that exposure to the site. Arsenic and lead were measured in residential soil, house dust, tap water, urine, and toenail samples from 70 children in 34 households up to 7 miles from the site. Soil and house dust were sieved, digested, and analyzed via ICP-MS. Tap water and urine were analyzed without digestion, while toenails were washed, digested and analyzed. Blood lead was analyzed by an independent, certified laboratory. Spearman correlation coefficients were calculated between each environmental medium and urine and toenail concentrations for arsenic and lead. Geometric mean arsenic (standard deviation) concentrations for each matrix were: 22.1 (2.59) ppm and 12.4 (2.27) ppm for soil and house dust.
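
The correlation step above is a standard rank correlation; a minimal SciPy illustration follows, with invented values.

```python
# Spearman rank correlation between an environmental medium and a
# biomarker, as in the exposure study above (data invented).
from scipy.stats import spearmanr

soil_as_ppm    = [22.1, 15.3, 30.7, 9.8, 41.2, 18.4, 27.5]   # residential soil
toenail_as_ppm = [0.31, 0.22, 0.45, 0.12, 0.60, 0.25, 0.38]  # child toenails

rho, p = spearmanr(soil_as_ppm, toenail_as_ppm)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```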

Reduction in horizontal transfer of conjugative plasmid by UV irradiation and low-level chlorination

Lin, W., et al., Water Research, 91:331-338, March 2016

The widespread presence of antibiotic resistance genes (ARGs) and antibiotic resistant bacteria (ARB) in the drinking water system facilitates their horizontal gene transfer among microbiota. In this study, the conjugative gene transfer of the RP4 plasmid after disinfection, including ultraviolet (UV) irradiation and low-level chlorine treatment, was investigated. It was found that both UV irradiation and low-level chlorine treatment reduced the conjugative gene transfer frequency. The transfer frequency gradually decreased from 2.75 × 10⁻³ to 2.44 × 10⁻⁵ after exposure to UV doses ranging from 5 to 20 mJ/cm². With higher UV doses of 50 and 100 mJ/cm², the transfer frequency was reduced to 1.77 × 10⁻⁶ and 2.44 × 10⁻⁸. The RP4 plasmid transfer frequency was not significantly affected by chlorine treatment at dosages ranging from 0.05 to 0.2 mg/L, but treatment with 0.3-0.5 mg/L chlorine induced a decrease in conjugative transfer to 4.40 × 10⁻⁵ or below the detection limit. The mechanisms underlying these phenomena were also explored, and the results demonstrated that UV irradiation and chlorine treatment (0.3 and 0.5 mg/L) significantly reduced the viability of bacteria, thereby lowering the conjugative transfer frequency. Although the lower chlorine concentrations tested (0.05-0.2 mg/L) were not sufficient to damage the cells, exposure to these concentrations may still depress the expression of a flagellar gene (FlgC), an outer membrane porin gene (ompF), and a DNA transport-related gene (TraG). Additionally, fewer pili were scattered on the bacteria after chlorine treatment. These findings are important in assessing and controlling the risk of ARG transfer and dissemination in the drinking water system.
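
Transfer frequencies like those reported above are commonly derived from plate counts, typically as transconjugants per recipient (some studies normalize per donor instead). A minimal sketch with invented counts:

```python
# Conjugative transfer frequency from plate counts (values invented).
transconjugants_cfu_per_ml = 2.2e4
recipients_cfu_per_ml = 8.0e6

frequency = transconjugants_cfu_per_ml / recipients_cfu_per_ml
print(f"transfer frequency = {frequency:.2e}")  # 2.75e-03, cf. baseline above
```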

Water Disinfection Byproducts Induce Antibiotic Resistance-Role of Environmental Pollutants in Resistance Phenomena

Li, D., et al., Environmental Science & Technology, 50(6):3193-3201, March 2016

The spread of antibiotic resistance represents a global threat to public health, and has traditionally been attributed to extensive antibiotic use in clinical and agricultural applications. As a result, researchers have mostly focused on clinically relevant high-level resistance enriched by antibiotics above the minimal inhibitory concentrations (MICs). Here, we report that two common water disinfection byproducts (chlorite and iodoacetic acid) had antibiotic-like effects that led to the evolution of resistant E. coli strains under both high (near-MIC) and low (sub-MIC) exposure concentrations. Sub-inhibitory concentrations of DBPs selected strains with higher resistance than those evolved under above-MIC exposure concentrations. In addition, whole-genome analysis revealed distinct mutations in small sets of genes known to be involved in multiple-drug and drug-specific resistance, as well as in genes not previously known to play a role in antibiotic resistance. The number and identity of the genetic mutations were distinct between the high and sub-MIC exposure scenarios. This study provides evidence of, and mechanistic insight into, the sub-MIC selection of antibiotic resistance by antibiotic-like environmental pollutants such as disinfection byproducts in water, which may be important contributors to the global spread of antibiotic resistance. The results raise an intriguing and profound question about the role that the large number and variety of environmental contaminants play in selecting for and spreading antibiotic resistance in the environment.

Characterization of a Drinking Water Distribution Pipeline Terminally Colonized by Naegleria fowleri

Morgan, M.J., et al., Environmental Science & Technology, 50(6):2890-2898, March 2016

Free-living amoebae, such as Naegleria fowleri, Acanthamoeba spp., and Vermamoeba spp., have been identified as organisms of concern due to their role as hosts for pathogenic bacteria and as agents of human disease. In particular, N. fowleri causes primary amoebic meningoencephalitis (PAM) and can be found in drinking water systems in many countries. Understanding its temporal dynamics in relation to environmental and biological factors is vital for developing management tools to mitigate the risks of PAM. Characterizing drinking water systems in Western Australia with a combination of physical, chemical and biological measurements over the course of a year showed a close association of N. fowleri with free chlorine and with distance from treatment. This information can be used to help design optimal management strategies for the control of N. fowleri in drinking-water-distribution systems.

Viral persistence in surface and drinking water: Suitability of PCR pre-treatment with intercalating dyes

Prevost, B., et al., Water Research, March 2016

Following numerous outbreaks of enteric viruses associated with the consumption of drinking water, the study of enteric viruses in water has increased significantly in recent years. To better understand the dynamics of enteric viruses in environmental water and the associated viral risk, it is necessary to estimate viral persistence under different conditions. In this study, two representative models of human enteric viruses, adenovirus 41 (AdV 41) and coxsackievirus B2 (CV-B2), were used to evaluate the persistence of enteric viruses in environmental water. The persistence of infectious particles, encapsidated genomes and free nucleic acids of AdV 41 and CV-B2 was evaluated in drinking water and surface water at different temperatures (4 °C, 20 °C and 37 °C). The infectivity of AdV 41 and CV-B2 persisted for at least 25 days regardless of water temperature, and for more than 70 days at 4 °C and 20 °C, in both drinking and surface water. Encapsidated genomes persisted beyond 70 days regardless of water temperature. Free nucleic acids (i.e. without capsid) were also able to persist for at least 16 days in drinking and surface water. The usefulness of a detection method based on an intercalating-dye pre-treatment, which specifically targets preserved particles, was investigated for discriminating free from encapsidated genomes, and it was compared with virus infectivity. Further, the resistance of AdV 41 and CV-B2 to two major disinfection treatments applied in drinking water plants (UV and chlorination) was evaluated. Even after the application of UV rays and chlorine at high doses (400 mJ/cm² and 10 mg·min/L, respectively), viral genomes were still detected by molecular biology methods. Although the intercalating-dye pre-treatment was of little use for detecting the effects of UV treatment, it was useful in the case of chlorination, with less than 1 log10 difference from the infectivity measurements. Finally, for the first time, the suitability of intercalating-dye pre-treatment for estimating the quality of water produced by treatment plants was demonstrated using samples from four drinking-water plants and two rivers. Although 55% (27/49) of drinking water samples were positive for enteric viruses by molecular detection, none were positive when the intercalating-dye pre-treatment was used, which could indicate that the viruses detected were not infectious.

Using flow cytometry and Bacteroidales 16S rRNA markers to study the hygienic quality of source water

Baumgartner, A., et al., Journal für Verbraucherschutz und Lebensmittelsicherheit, 11(1):83-88, March 2016

Six source water fountains in the community of Berne, Switzerland, were sampled monthly over a period of 1 year. The samples were tested for total counts by flow cytometry and for fecal contamination using the Bacteroidales 16S rRNA markers HF183, BacR and AllBac. Total counts varied considerably between fountains, ranging from 5,115 to 198,508 counts/L, and the long-term pattern of total counts over the year was characteristic of each fountain. Comparing rainfall data with data for the non-specific fecal marker AllBac proved a suitable approach for highlighting the vulnerability of sources to environmental influences. HF183, indicating contamination of human origin, occurred only sporadically and in insignificant amounts. Furthermore, as indicated by BacR, the studied fountains showed no evidence of contamination by ruminant feces. Further work is suggested to establish threshold values for molecular Bacteroidales markers, which could in the future replace the currently used criteria for fecal indicator bacteria.

Contrasting regional and national mechanisms for predicting elevated arsenic in private wells across the United States using classification and regression trees

Frederick, L., et al., Water Research, March 2016

Arsenic contamination in groundwater is a public health and environmental concern in the United States (U.S.), particularly where monitoring is not required under the Safe Drinking Water Act. Previous studies suggest the influence of regional mechanisms for arsenic mobilization into groundwater; however, no study has examined how the influencing parameters change at a continental scale spanning multiple regions. We herein examine covariates for groundwater in the western, central and eastern U.S. regions, representing mechanisms associated with arsenic concentrations exceeding the U.S. Environmental Protection Agency maximum contaminant level (MCL) of 10 parts per billion (ppb). Statistically significant covariates were identified via classification and regression tree (CART) analysis, and included hydrometeorological and groundwater chemical parameters. The CART analyses were performed at two scales, national and regional, for which three physiographic regions located in the western (Payette Section and the Snake River Plain), central (Osage Plains of the Central Lowlands), and eastern (Embayed Section of the Coastal Plains) U.S. were examined. Validity of each of the three regional CART models was indicated by values >85% for the area under the receiver-operating characteristic curve. Aridity (precipitation minus potential evapotranspiration) was identified as the primary covariate associated with elevated arsenic at the national scale. At the regional scale, aridity and pH were the major covariates in the arid to semi-arid (western) region, whereas dissolved iron (taken to represent chemically reducing conditions) and pH were major covariates in the temperate (eastern) region, although additional important covariates emerged, including elevated phosphate. Analysis in the central U.S. region indicated that elevated arsenic concentrations were driven by a mixture of the mechanisms observed in the western and eastern regions.
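
For readers unfamiliar with classification and regression trees, the sketch below shows the general shape of a CART analysis like the one described, using scikit-learn. The covariate names (aridity, pH, dissolved iron) follow the abstract, but the data are synthetic placeholders, not the study's dataset, so the fitted splits are meaningless beyond illustration.

```python
# Minimal CART sketch with synthetic data (not the study's dataset).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 500
# Synthetic covariates: aridity (P minus PET, mm/yr), pH, dissolved Fe (mg/L)
X = np.column_stack([
    rng.normal(-200, 300, n),   # aridity: more negative = more arid
    rng.normal(7.2, 0.8, n),    # pH
    rng.lognormal(-2, 1, n),    # dissolved iron
])
# Synthetic label: exceedance of the 10 ppb MCL, loosely tied to aridity and pH
y = ((X[:, 0] < -300) & (X[:, 1] > 7.5)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20).fit(X, y)
print(export_text(tree, feature_names=["aridity", "pH", "dissolved_Fe"]))
```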

Effect of a School-Based Water Intervention on Child Body Mass Index and Obesity

Schwartz, A.E., et al., JAMA Pediatrics, March 2016

Decreasing consumption of caloric beverages while increasing water consumption could promote child health and reduce the prevalence of childhood obesity. The aim of this study was to estimate the impact of water jets (electrically cooled, large clear jugs with a push lever for fast dispensing) on standardized body mass index, overweight, and obesity in elementary school and middle school students; milk purchases were explored as a potential mechanism for weight outcomes. This quasi-experimental study used a school-level database of cafeteria equipment deliveries between the 2008-2009 and 2012-2013 school years and included a sample of 1,227 New York, New York, public elementary schools and middle schools and the 1,065,562 students within those schools. The intervention was the installation of water jets in schools. Individual body mass index (BMI) was calculated for all students in the sample using annual student-level height and weight measurements collected as part of New York's FITNESSGRAM initiative. Age- and sex-specific growth charts produced by the Centers for Disease Control and Prevention were used to categorize students as overweight or obese. The hypothesis that water jets would be associated with decreased standardized BMI, overweight, and obesity was tested using a difference-in-difference strategy, comparing outcomes for treated and nontreated students before and after the introduction of a water jet. There was a significant effect of water jets on standardized BMI: adoption of water jets was associated with a 0.025 (95% CI, -0.038 to -0.011) reduction in standardized BMI for boys and a 0.022 (95% CI, -0.035 to -0.008) reduction for girls (P < .01). There was also a significant effect on being overweight: water jets were associated with a 0.9 percentage point reduction (95% CI, 0.015-0.003) in the likelihood of being overweight for boys and a 0.6 percentage point reduction (95% CI, 0.011-0.000) for girls (P < .05). We also found a decrease of 12.3 (95% CI, -19.371 to -5.204) in the number of milk half-pints of all types purchased per student per year (P < .01). These results show an association between a relatively low-cost water availability intervention and decreased student weight. Additional research is needed to examine potential mechanisms for decreased student weight, including reduced milk consumption, and to assess impacts on longer-term outcomes.
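
The difference-in-difference strategy mentioned above compares outcome changes in treated versus untreated schools before and after installation. A minimal sketch of how such an estimate is obtained (synthetic data, not the study's records) is the coefficient on the treated × post interaction in an ordinary least squares regression:

```python
# Difference-in-difference sketch on synthetic data: the DiD estimate is
# the coefficient on the treated:post interaction term.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 4000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # school received a water jet
    "post": rng.integers(0, 2, n),      # observation after installation
})
# Plant a -0.025 effect on zBMI for treated students post-installation
df["zbmi"] = (0.1 * df["treated"] + 0.05 * df["post"]
              - 0.025 * df["treated"] * df["post"]
              + rng.normal(0, 1, n))

model = smf.ols("zbmi ~ treated + post + treated:post", data=df).fit()
print(model.params["treated:post"])     # difference-in-difference estimate
```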

Use of a Cumulative Exposure Index to Estimate the Impact of Tap Water Lead Concentration on Blood Lead Levels in 1- to 5-Year-Old Children (Montréal, Canada)

Ngueta, G., et al., Environmental Health Perspectives, March 2016

Drinking water is recognized as a source of lead (Pb) exposure. However, questions remain about the impact of chronic exposure to lead-contaminated water on internal dose. Our goal was to estimate the relation between a cumulative water Pb exposure index (CWLEI) and blood Pb levels (BPb) in children 1–5 years of age. Between 10 September 2009 and 27 March 2010, individual characteristics and water consumption data were obtained from 298 children. Venous blood samples were collected (one per child) and a total of five 1-L samples of water per home were drawn from the kitchen tap. A second round of water collection was performed between 22 June 2011 and 6 September 2011 on a subsample of houses. Pb analyses used inductively coupled plasma mass spectroscopy. Multiple linear regressions were used to estimate the association between CWLEI and BPb. Each 1-unit increase in CWLEI multiplies the expected value of BPb by 1.10 (95% CI: 1.06, 1.15) after adjustment for confounders. Mean BPb was significantly higher in children in the third and fourth quartiles of CWLEI (0.7–1.9 and ≥ 1.9 μg/kg of body weight) compared with the first (< 0.2 μg/kg) after adjusting for confounders (19%; 95% CI: 0, 42% and 39%; 95% CI: 15, 67%, respectively). The trend analysis yielded a p-value < 0.0001 after adjusting for confounders, suggesting a dose–response relationship between percentiles of CWLEI and BPb. In children 1–5 years of age, BPb was significantly associated with water lead concentration, with an increase starting at a cumulative lead exposure of ≥ 0.7 μg Pb/kg of body weight. In this age group, an increase of 1 μg/L in water lead would result in an increase of 35% of BPb after 150 days of exposure.
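
The reported ratio of 1.10 per CWLEI unit corresponds to a log-linear model, log(BPb) = a + b·CWLEI with exp(b) = 1.10. A small worked illustration of what that multiplier implies across the quartile cut-points quoted above:

```python
# Worked illustration of the multiplicative association reported above:
# each 1-unit increase in CWLEI multiplies expected blood lead by 1.10.
ratio_per_unit = 1.10
# 0.2, 0.7 and 1.9 are the quoted quartile cut-points; 3.0 is an
# arbitrary higher value added for illustration.
for cwlei in (0.2, 0.7, 1.9, 3.0):
    print(f"CWLEI {cwlei}: expected BPb multiplied by "
          f"{ratio_per_unit ** cwlei:.2f} vs CWLEI = 0")
```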

 

Elevated Blood Lead Levels in Children Associated With the Flint Drinking Water Crisis: A Spatial Analysis of Risk and Public Health Response

Hanna-Attisha, M., et al., American Journal of Public Health, 106(2), February 2016

We analyzed differences in pediatric elevated blood lead level incidence before and after Flint, Michigan, introduced a more corrosive water source into an aging water system without adequate corrosion control. We reviewed blood lead levels for children younger than 5 years before (2013) and after (2015) the water source change in Greater Flint, Michigan. We assessed the percentage of elevated blood lead levels in both time periods and identified geographical locations through spatial analysis. Incidence of elevated blood lead levels increased from 2.4% to 4.9% (P < .05) after the water source change, and neighborhoods with the highest water lead levels experienced a 6.6% increase. No significant change was seen outside the city. Geospatial analysis identified disadvantaged neighborhoods as having the greatest increases in elevated blood lead levels and informed response prioritization during the now-declared public health emergency. It was concluded that the percentage of children with elevated blood lead levels increased after the water source change, particularly in socioeconomically disadvantaged neighborhoods. Water is a growing source of childhood lead exposure because of aging infrastructure.

Plain water consumption in relation to energy intake and diet quality among US adults, 2005–2012

An, R. and McCaffrey, J., Journal of Human Nutrition and Dietetics, February 2016

The present study examined plain water consumption in relation to energy intake and diet quality among US adults. A nationally representative sample of 18,311 adults aged ≥18 years, from the National Health and Nutrition Examination Survey 2005–2012, was analysed. The first-difference estimator approach addressed confounding bias from time-invariant unobservables (e.g. eating habits, taste preferences) by using within-individual variations in diet and plain water consumption between two nonconsecutive 24-h dietary recalls. A one percentage point increase in the proportion of daily plain water in total dietary water consumption was associated with a reduction in mean (95% confidence interval) daily total energy intake of 8.58 (7.87–9.29) kcal, energy intake from sugar-sweetened beverages of 1.43 (1.27–1.59) kcal, energy intake from discretionary foods of 0.88 (0.44–1.32) kcal, total fat intake of 0.21 (0.17–0.25) g, saturated fat intake of 0.07 (0.06–0.09) g, sugar intake of 0.74 (0.67–0.82) g, sodium intake of 9.80 (8.20–11.39) mg and cholesterol intake of 0.88 (0.64–1.13) mg. The effects of plain water intake on diet were similar across race/ethnicity, educational attainment, income level and body weight status, whereas they were larger among males and young/middle-aged adults than among females and older adults, respectively. Daily overall diet quality, measured by the Healthy Eating Index-2010, was not found to be associated with the proportion of daily plain water in total dietary water consumption. It was concluded that promoting plain water intake could be a useful public health strategy for reducing energy and targeted nutrient consumption in US adults, which warrants confirmation in future controlled interventions.
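
The first-difference estimator described above removes time-invariant confounders by differencing each person's two recall days before regressing. A minimal sketch with synthetic data (the -8.58 kcal slope is planted to echo the reported estimate; nothing here is the study's data):

```python
# First-difference sketch: differencing the two 24-h recalls within each
# person cancels time-invariant confounders (habits, preferences), so the
# slope is estimated from within-person changes only.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
habit = rng.normal(0, 50, n)                 # unobserved, time-invariant
water_pct = rng.uniform(0, 100, (n, 2))      # % plain water, two recall days
# Energy intake: -8.58 kcal per percentage point, plus the habit confounder
energy = 2000 + habit[:, None] - 8.58 * water_pct + rng.normal(0, 30, (n, 2))

d_water = water_pct[:, 1] - water_pct[:, 0]  # within-person differences
d_energy = energy[:, 1] - energy[:, 0]       # the habit term cancels here
slope = np.polyfit(d_water, d_energy, 1)[0]
print(f"{slope:.2f} kcal per percentage point of plain water")  # ~ -8.58
```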

DVC-FISH and PMA-qPCR techniques to assess the survival of Helicobacter pylori inside Acanthamoeba castellanii

Moreno-Mesonoro, L., et al., Research in Microbiology, 167(1):29-34, January 2016

Free-living amoebae (FLA) are ubiquitous microorganisms commonly found in water. They can act as Trojan horses for some amoeba-resistant bacteria (ARB). Helicobacter pylori is a pathogenic bacterium, suggested to be transmitted through water, which could belong to the ARB group. In this work, a co-culture assay of H. pylori and Acanthamoeba castellanii, one of the most common FLA, was carried out to identify the presence and survival of viable and potentially infective forms of the bacterium internalized by the amoeba. Molecular techniques including FISH, DVC-FISH, qPCR and PMA-qPCR were used to detect internalized and viable H. pylori. After 24 h in co-culture and a disinfection treatment to kill extra-amoebic bacteria, viable H. pylori cells were observed inside A. castellanii. When PMA-qPCR was applied to the co-culture samples, only DNA from internalized H. pylori cells was detected, whereas qPCR amplified total DNA from the sample. By the combined DVC-FISH method, the viability of H. pylori cells within A. castellanii was confirmed. Both techniques provided evidence, for the first time, that the pathogen is able to survive chlorination treatment in the presence of A. castellanii, and they could be very useful for further studies of environmental samples.

Variability in the chemistry of private drinking water supplies and the impact of domestic treatment systems on water quality

Ander, E.L., et al., Environmental Geochemistry and Health, 38(6):1313-1332, January 2016

Tap water from 497 properties using private water supplies, in an area of metalliferous and arsenic mineralisation (Cornwall, UK), was measured to assess the extent of compliance with chemical drinking water quality standards, and how this is influenced by householder water treatment decisions. The proportion of analyses exceeding water quality standards was high, with 65% of tap water samples exceeding one or more chemical standards. The highest exceedances for health-based standards were nitrate (11%) and arsenic (5%). Arsenic had a maximum observed concentration of 440 µg/L. Exceedances were also high for pH (47%), manganese (12%) and aluminium (7%), for which standards are set primarily on aesthetic grounds; however, the highest observed concentrations of manganese and aluminium also exceeded relevant health-based guidelines. Significant reductions in concentrations of aluminium, cadmium, copper, lead and/or nickel were found in tap waters where households were successfully treating low-pH groundwaters, and similar incidental benefits were found for arsenic and nickel where treatment was installed for iron and/or manganese removal; successful treatment specifically to decrease tap water arsenic concentrations was observed at the two properties where it was installed. However, 31% of samples where pH treatment was reported had pH < 6.5 (the minimum value in the drinking water regulations), suggesting widespread problems with system maintenance. Other examples of ineffectual treatment were seen in post-treatment failures, including for nitrate. This demonstrates that even where tap waters are considered to be treated, they may still fail one or more drinking water quality standards. We find that the degree of drinking water standard exceedance warrants further work to understand the environmental controls and the location of high concentrations. We also found that residents were more willing to accept drinking water with high metal (iron and manganese) concentrations than international guidelines assume. These findings point to the need for regulators to reinforce guidance on drinking water quality standards to private water supply users, and the long-term health benefits of complying with these, even in areas where treated mains water is widely available.

Human exposure to thallium through tap water: A study from Valdicastello Carducci and Pietrasanta (northern Tuscany, Italy)

Campanella, B., et al., Science of the Total Environment, January 2016

A geological study revealed the presence of thallium (Tl) at concentrations of concern in groundwaters near Valdicastello Carducci (Tuscany, Italy). The source of contamination was identified as the Tl-bearing pyrite ores occurring in the abandoned mining sites of the area. The strongly acidic internal waters flowing in the mining tunnels can reach exceptional Tl concentrations, up to 9000 μg/L. In September 2014, Tl contamination was also found in the tap water distributed in the same area (from 2 to 10 μg/L), and on October 3, 2014 the local authorities imposed a Do Not Drink order on the population. Here we report the results of the exposure study carried out from October 2014 to October 2015, aimed at quantifying Tl levels in 150 urine and 318 hair samples from the population of Valdicastello Carducci and Pietrasanta. Thallium was quantified by inductively coupled plasma mass spectrometry (ICP-MS). Urine and hair were chosen as model matrices indicative of different periods of exposure (short-term and long-term, respectively). Thallium values found in the biological samples were correlated with Tl concentrations found in tap water in each citizen's living area, and with his or her habits. The Tl concentration ranges found were 1-498 ng/g in hair (values in unexposed subjects 0.1-6 ng/g) and 0.046-5.44 μg/L in urine (reference value for the European population 0.006 μg/L). Results show that Tl levels in biological samples were significantly associated with residency in zones with elevated water Tl levels. The kinetics of decay of Tl concentration in urine samples was also investigated. To the best of our knowledge, this is the first study on human contamination by Tl through water involving such a large number of samples.

Prevalence and characterization of extended-spectrum beta-lactamase-producing Enterobacteriaceae in spring waters

Li, S., et al., Letters in Applied Microbiology, 61(6):544-548, December 2015

The purpose of this study was to investigate the prevalence and characterization of extended-spectrum beta-lactamase (ESBL)-producing Enterobacteriaceae in spring waters from Mount Tai, China. ESBL-producing Enterobacteriaceae were found in four of 50 sampled spring waters (4/50, 8.0%), and a total of 16 non-duplicate ESBL-producing isolates were obtained, including 13 Escherichia coli (E. coli) and three Klebsiella pneumoniae (Kl. pneumoniae). All 16 isolates harboured genes encoding CTX-M ESBLs: six expressed CTX-M-15, five CTX-M-14, three CTX-M-55 and two CTX-M-27. Four multilocus sequence types (STs) were found, and ST131 was the dominant type (8/16, 50.0%). Taken together, contamination with ESBL-producing Enterobacteriaceae was present in the spring waters of Mount Tai. The results indicate that spring waters can act as a reservoir of antibiotic resistant bacteria and contribute to the spread of antimicrobial-resistant bacteria via drinking water or the food chain. In addition, wastewater discharge from restaurants or hotels may be an important source of antibiotic resistant bacteria in spring waters.

Qualitative analysis of water quality deterioration and infection by Helicobacter pylori in a community with high risk of stomach cancer (Cauca, Colombia)

Acosta, C.P., et al., Salud Colectiva, 11(4):575-590, December 2015

This study looks at aspects of the environmental health of the rural population in Timbío (Cauca, Colombia) in relation to the deterioration of water quality. The information was obtained through participatory research methods exploring the management and use of water, the sources of pollution, and the perception of water quality and its relation to Helicobacter pylori infection. The results are part of the qualitative analysis of a first research phase, carried out between November 2013 and August 2014, characterizing water and sanitation problems, their relation to emerging infectious diseases, and possible solutions. The results are discussed from an ecosystemic approach to human health, recognizing the complexity of environmental conflicts related to water resources and their impacts on the health of populations. The methodology used makes it possible to detect and visualize the most urgent problems and the frequent causes of contamination of water resources, so as to propose solutions within a joint agenda of multiple social actors.

Rapid bacteriophage MS2 transport in an oxic sandy aquifer in cold climate: Field experiments and modeling

Kvitsand, H.M.L., Water Resources Research, 51:9725-9745, December 2015

Virus removal during rapid transport in an unconfined, low-temperature (6°C) sand and gravel aquifer was investigated at a riverbank field site 25 km south of Trondheim in central Norway. Data from bacteriophage MS2 inactivation and transport experiments were applied in a two-site kinetic transport model using HYDRUS-1D to evaluate the mechanisms of virus removal and whether these mechanisms were sufficient to protect the groundwater supplies. The results demonstrated that inactivation contributed negligibly to overall removal and that irreversible MS2 attachment to aquifer grains coated with iron precipitates played the dominant role: 4.1 log units of MS2 were removed by attachment over a 38 m travel distance and less than 2 days' residence time. Although total removal was high, pathways capable of allowing virus migration at rapid velocities were present in the aquifer. The risk of rapid transport of viable viruses should be recognized, particularly for water supplies without permanent disinfection.
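
As a back-of-the-envelope illustration of the numbers above (not a substitute for the HYDRUS-1D model), the reported removal can be expressed as a spatial rate and extrapolated under the strong assumption that removal stays log-linear with distance:

```python
# Illustrative arithmetic from the abstract: 4.1 log10 MS2 removal over 38 m.
log_removal = 4.1
distance_m = 38.0
rate = log_removal / distance_m        # log10 removal per metre of travel
print(f"{rate:.3f} log10/m")
# Distance implied for a hypothetical 6-log target, assuming the same rate:
print(f"{6.0 / rate:.0f} m for 6 log10 removal")
```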

 

Evaluation of alternative DNA extraction processes and real-time PCR for detecting Cryptosporidium parvum in drinking water

Kimble, G.H., Water Science and Technology: Water Supply, 15.6:1295-1303, December 2015

USEPA Method 1623 is the standard method in the United States for the detection of Cryptosporidium in water samples, but quantitative real-time polymerase chain reaction (qPCR) is an alternative technique that has been used successfully to detect Cryptosporidium in aqueous matrices. This study examined various modifications to a commercial nucleic acid extraction procedure to enhance PCR detection sensitivity for Cryptosporidium. An alternative DNA extraction buffer allowed qPCR detection at lower seed levels than the commercial extraction kit buffer. In addition, a second spin-column cycle produced significantly better detection (P = 0.031), and the volume of Tris-EDTA buffer significantly affected crossing-threshold values (P = 0.001). The improved extraction procedure was evaluated using 10-L tap water samples processed by ultrafiltration, centrifugation and immunomagnetic separation. Mean recovery for the sample processing method was 41% by microscopy and 49% by real-time PCR (P = 0.013). The results of this study demonstrate that real-time PCR can be an effective alternative for detecting and quantifying Cryptosporidium parvum in drinking water samples.

Potential applications of next generation DNA sequencing of 16S rRNA gene amplicons in microbial water quality monitoring

Vierheilig, J., et al., Water Science and Technology, 72(11):1962-1972, December 2015

The applicability of next generation DNA sequencing (NGS) methods for water quality assessment has so far not been broadly investigated. This study set out to evaluate the potential of an NGS-based approach in a complex catchment of importance for drinking water abstraction. In this multi-compartment investigation, total bacterial communities in water, faeces, soil, and sediment samples were investigated by 454 pyrosequencing of bacterial 16S rRNA gene amplicons to assess the capabilities of this NGS method for (i) the development and evaluation of environmental molecular diagnostics, (ii) direct screening of bulk bacterial communities, and (iii) the detection of faecal pollution in water. The results indicate that NGS methods can highlight potential target populations for diagnostics and will prove useful for evaluating existing, and developing novel, DNA-based detection methods in the field of water microbiology. The approach used unveiled the dominant bacterial populations but failed to detect populations with low abundances, such as faecal indicators in surface waters. In combination with metadata, NGS data will also allow identification of the drivers of bacterial community composition during water treatment and distribution, highlighting the power of this approach for monitoring bacterial regrowth and contamination in technical systems.

Impacts of hydraulic fracturing on water quality: a review of literature, regulatory frameworks and an analysis of information gaps

Gagnon, G.A., et al., Environmental Reviews, 24(2):122-131, November 2015

This paper reviews the available literature and current governance approaches related to the potential impacts of hydraulic fracturing on water quality (including drinking water). It identifies gaps in the literature and (or) current governance approaches that should be addressed to guide decision-makers in developing appropriate regulatory regimes for assessing the impacts of hydraulic fracturing on water quality. The lack of credible and comprehensive data is shown to have been a major obstacle to properly investigating and monitoring hydraulic fracturing activities and their potential risks to the environment and water quality. A review of current governance approaches demonstrates that some jurisdictions have implemented baseline and post-operation water quality monitoring requirements; however, there are large variations in site-specific monitoring requirements across Canada and the United States. In light of recent information, a targeted approach is suggested based on risk priorities, which can prioritize sample collection and frequency, target contaminants, and the needed duration of sampling. The steps outlined in this review help address public concerns associated with water quality and help ensure that public health is protected through appropriate water safety planning.

Estimating Potential Increased Bladder Cancer Risk Due to Increased Bromide Concentrations in Sources of Disinfected Drinking Waters

Regli, S., et al., Environmental Science & Technology, 49(22):13094-13102, November 2015

Public water systems are increasingly facing higher bromide levels in their source waters from anthropogenic contamination through coal-fired power plants, conventional oil and gas extraction, textile mills, and hydraulic fracturing. Climate change is likely to exacerbate this in coming years. We estimate bladder cancer risk from potential increased bromide levels in source waters of disinfecting public drinking water systems in the United States. Bladder cancer is the health end point used by the United States Environmental Protection Agency (EPA) in its benefits analysis for regulating disinfection byproducts in drinking water. We use estimated increases in the mass of the four regulated trihalomethanes (THM4) concentrations (due to increased bromide incorporation) as the surrogate disinfection byproduct (DBP) occurrence metric for informing potential bladder cancer risk. We estimate potential increased excess lifetime bladder cancer risk as a function of increased source water bromide levels. Results based on data from 201 drinking water treatment plants indicate that a bromide increase of 50 μg/L could result in a potential increase of between 10⁻³ and 10⁻⁴ excess lifetime bladder cancer risk in populations served by roughly 90% of these plants.

Extent and Impacts of Unplanned Wastewater Reuse in US Rivers

Rice, J., et al., American Water Works Association Journal, 107(11):571-581, November 2015

A recently developed watershed-scale hydraulic model (De-facto Reuse Incidence in our Nation's Consumptive Supply [DRINCS]) was applied to estimate municipal wastewater treatment plant (WWTP) contributions to downstream water treatment plant (WTP) influent flow. Applying DRINCS to geocoded data for 14,651 WWTPs and 1,320 WTPs showed that the occurrence of treated municipal wastewater in drinking water supplies is geographically widespread and that its magnitude depends largely on the flow condition and size of the source river. Under average streamflow conditions, the median contribution of wastewater flow to drinking water supplies was approximately 1%, increasing to as much as 100% under low-flow conditions (modeled by Q95). Wastewater contributions to nutrient and emerging contaminant loading were estimated and geospatially compared with the findings of the US Environmental Protection Agency's Unregulated Contaminant Monitoring Rule and Long Term 2 Enhanced Surface Water Treatment Rule. In turn, this analysis offers important insights into the challenges facing drinking water treatment facilities across the United States.
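
Conceptually, de facto reuse at an intake is the upstream treated-wastewater discharge divided by the river flow arriving at that intake, which is why the fraction balloons under Q95 low-flow conditions. The helper below is a hypothetical sketch of that ratio, not DRINCS itself, and the flows are illustrative numbers only:

```python
# Hypothetical sketch of the de facto reuse ratio (not the DRINCS model).
def defacto_reuse_pct(wwtp_discharges_mgd, intake_streamflow_mgd):
    """Percent of flow at a drinking water intake that is treated wastewater."""
    ww = sum(wwtp_discharges_mgd)
    return 100.0 * min(ww / intake_streamflow_mgd, 1.0)  # capped at 100%

# Same upstream WWTPs, average vs low (Q95-like) streamflow at the intake:
print(defacto_reuse_pct([3.0, 1.5], 450.0))  # ~1% under average flow
print(defacto_reuse_pct([3.0, 1.5], 4.5))    # 100% under drought low flow
```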

Fountain Autopsy to Determine Lead Occurrence in Drinking Water

McIlwain, B., et al., Journal of Environmental Engineering, November 2015

Exposure to lead in drinking water poses a risk of various adverse health effects, and significant efforts have been made to monitor and eliminate lead exposure from drinking water. This study focused on 71 drinking water fountains in nonresidential buildings, aiming to localize the source of elevated lead and understand how fountain characteristics affect lead concentrations in drinking water. Fountains containing lead-lined cooling tanks and brass fittings were found to release lead concentrations in excess of 10 μg/L, and fountains with low or infrequent usage and those with cooling tanks produced the highest concentrations (in excess of 20 μg/L). One particular fountain model, found at several locations throughout the institution, was associated with some of the highest lead concentrations measured in the study; this model was recalled in the United States, but not in Canada. This article adds to existing research demonstrating that drinking water fountains are a potentially significant and underappreciated source of lead exposure in nonresidential buildings.

Higher plain water intake is associated with lower type 2 diabetes risk: a cross-sectional study in humans

Carroll, H.A., et al., Nutrition Research, 35(10):865-872, October 2015

The aim of this study was to investigate the relationship between plain water intake and type 2 diabetes (T2D) risk. It was hypothesized that higher plain water intake would be associated with a lower T2D risk score. One hundred thirty-eight adults from Southwest and Southeast England answered a cross-sectional online survey assessing T2D risk (using the Diabetes UK risk assessment); physical activity (using the short International Physical Activity Questionnaire); and consumption of fruits, vegetables, and beverages (using an adapted version of the Cambridge European Prospective Investigation into Cancer and Nutrition Food Frequency Questionnaire). There was a trend toward differences in mean plain water intake between those stratified as having low, increased, moderate, or high risk of T2D, but this did not reach significance (P = .084). However, plain water intake was significantly negatively correlated with T2D risk score (τ = -0.180, P = .005), and for every 240-mL cup of water consumed per day, T2D risk score (range, 0-47) was reduced by 0.72 point (B = -0.03, 95% confidence interval = -0.06 to -0.01, P = .014). The current study provides preliminary results that are supported by theory; the underlying mechanisms need to be explored further to determine the true effect of plain water intake on disease risk. As increasing plain water intake is a simple and cost-effective dietary modification, its impact on T2D risk warrants further investigation in a randomized controlled trial. Overall, this study found that plain water intake had a significant negative correlation with T2D risk score, and regression analysis suggested that water may have a role in reducing T2D risk.

Solar Disinfection of Viruses in Polyethylene Terephthalate Bottles

Carratala, A., et al., Applied and Environmental Microbiology, 82(1):279-288, October 2015

Solar disinfection (SODIS) of drinking water in polyethylene terephthalate (PET) bottles is a simple, efficient point-of-use technique for the inactivation of many bacterial pathogens. In contrast, the efficiency of SODIS against viruses is not well known. In this work, we studied the inactivation of bacteriophages (MS2 and ϕX174) and human viruses (echovirus 11 and adenovirus type 2) by SODIS. We conducted experiments in PET bottles exposed to (simulated) sunlight at different temperatures (15, 22, 26, and 40°C) and in water sources of diverse compositions and origins (India and Switzerland). Good inactivation of MS2 (>6-log inactivation after exposure to a total fluence of 1.34 kJ/cm²) was achieved in Swiss tap water at 22°C, while less-efficient inactivation was observed in Indian waters and for echovirus (1.5-log inactivation at the same fluence). The DNA viruses studied, ϕX174 and adenovirus, were resistant to SODIS, and the inactivation observed was equivalent to that occurring in the dark. High temperatures enhanced MS2 inactivation substantially; at 40°C, 3-log inactivation was achieved in Swiss tap water after exposure to a fluence of only 0.18 kJ/cm². Overall, our findings demonstrate that SODIS may reduce the load of single-stranded RNA (ssRNA) viruses, such as echoviruses, particularly at high temperatures and in photoreactive matrices. In contrast, complementary measures may be needed to ensure efficient inactivation during SODIS of DNA viruses resistant to oxidation.

Regulation of non-relevant metabolites of plant protection products in drinking and groundwater in the EU: Current status and way forward

Laabs, V., et al., Regulatory Toxicology and Pharmacology, 73(1):276-286, October 2015

Non-relevant metabolites are defined in the EU regulation for plant protection product authorization, and a detailed definition is given in an EU Commission DG Sanco (now DG SANTE, Health and Food Safety) guidance document. However, in water legislation at EU and member state level, non-relevant metabolites of pesticides are either not specifically regulated or diverse threshold values are applied. Based on their inherent properties, non-relevant metabolites should be regulated in drinking and groundwater through substance-specific, toxicity-based limit values, like other anthropogenic chemicals. Yet, if a general limit value for non-relevant metabolites in drinking and groundwater is favored, applying the Threshold of Toxicological Concern (TTC) concept for Cramer class III compounds leads to a threshold value of 4.5 μg/L. This general value is shown, by way of example, to be protective for non-relevant metabolites, based on individual drinking water limit values derived for a set of 56 non-relevant metabolites. A consistent definition of non-relevant metabolites of plant protection products, as well as their uniform regulation in drinking and groundwater in the EU, is important to achieve legal clarity for all stakeholders and to establish planning security for the development of plant protection products for the European market.
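
One plausible reconstruction of the 4.5 μg/L figure (hedged: these are the conventional TTC inputs, and the paper should be consulted for its exact derivation) combines the Cramer class III TTC of 1.5 μg/kg body weight per day, a 60 kg adult, 2 L/day of water consumption, and a 10% allocation of the TTC to drinking water:

```python
# Hedged reconstruction of the 4.5 ug/L threshold; all inputs below are
# conventional assumptions, not values quoted in the abstract.
ttc_ug_per_kg_day = 1.5   # Cramer class III TTC
body_weight_kg = 60.0     # assumed adult body weight
allocation = 0.10         # assumed share of the TTC assigned to drinking water
water_l_per_day = 2.0     # assumed daily water consumption

limit_ug_per_l = ttc_ug_per_kg_day * body_weight_kg * allocation / water_l_per_day
print(limit_ug_per_l)     # 4.5
```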

Investigation of Cost and Energy Optimization of Drinking Water Distribution Systems

Cherci, C., et al., Environmental Science & Technology, 49(22):13724-13732, October 2015

Holistic management of water and energy resources through energy and water quality management systems (EWQMSs) has traditionally aimed at energy cost reduction, with limited or no emphasis on energy efficiency or greenhouse gas minimization. This study expanded the existing EWQMS framework and determined the impact of different management strategies for energy cost and energy consumption (e.g., carbon footprint) reduction on system performance at two drinking water utilities in California (United States). Optimizing for cost led to cost reductions of 4% (Utility B, summer) to 48% (Utility A, winter). The energy optimization strategy successfully found the lowest-energy operation, achieving energy usage reductions of 3% (Utility B, summer) to 10% (Utility A, winter). The findings revealed a trade-off between cost optimization (dollars) and energy use (kilowatt-hours), particularly in the summer, when minimizing energy use incurred cost increases of 64% and 184% compared with the cost optimization scenario. Water age simulations through hydraulic modeling did not reveal any adverse effects on water quality in the distribution system or in tanks from pump schedule optimization targeting either cost or energy minimization.

A Systematic Review of Waterborne Disease Outbreaks Associated with Small Non-Community Drinking Water Systems in Canada and the United States

Pons, W., et al., PLoS ONE, October 2015

Reports of outbreaks in Canada and the United States (U.S.) indicate that approximately 50% of all waterborne disease outbreaks occur in small non-community drinking water systems (SDWSs). Summarizing these investigations to identify the factors and conditions contributing to outbreaks is needed in order to help prevent future outbreaks. The objectives of this study were to: 1) identify published reports of waterborne disease outbreaks involving SDWSs in Canada and the U.S. since 1970; 2) summarize reported factors contributing to outbreaks, including water system characteristics and events surrounding the outbreaks; and 3) identify terminology used to describe SDWSs in outbreak reports. Three electronic databases and grey literature sources were searched for outbreak reports involving SDWSs throughout Canada and the U.S. from 1970 to 2014. Two reviewers independently screened and extracted data related to water system characteristics and outbreak events. The data were analyzed descriptively with 'outbreak' as the unit of analysis. From a total of 1,995 citations, we identified 50 relevant articles reporting 293 unique outbreaks. Failure of an existing water treatment system (22.7%) and lack of water treatment (20.2%) were the leading causes of waterborne outbreaks in SDWSs. A seasonal trend was observed, with 51% of outbreaks occurring in summer months (p < 0.001). There was large variation in the terminology used to describe SDWSs, and a large number of variables were not reported, including water source and whether water treatment was used (missing in 31% and 66% of reports, respectively). More consistent reporting and descriptions of SDWSs in future outbreak reports are needed to understand the epidemiology of these outbreaks and to inform the development of targeted interventions for SDWSs. Additional monitoring of water systems that are used on a seasonal or infrequent basis would be worthwhile to inform future protection efforts.

Incidence of waterborne lead in private drinking water systems in Virginia

Pieper, K.J., et al., Journal of Water and Health, 13(3):897-908, September 2015

Although recent studies suggest contamination by bacteria and nitrate in private drinking water systems is of increasing concern, data describing contaminants associated with the corrosion of onsite plumbing are scarce. This study reports on the analysis of 2,146 samples submitted by private system homeowners. Almost 20% of first draw samples submitted contained lead concentrations above the United States Environmental Protection Agency action level of 15 μg/L, suggesting that corrosion may be a significant public health problem. Correlations between lead, copper, and zinc suggested brass components as a likely lead source, and dug/bored wells had significantly higher lead concentrations as compared to drilled wells. A random subset of samples selected to quantify particulate lead indicated that, on average, 47% of lead in the first draws was in the particulate form, although the occurrence was highly variable. While flushing the tap reduced lead below 15 μg/L for most systems, some systems experienced an increase, perhaps attributable to particulate lead or lead-bearing components upstream of the faucet (e.g., valves, pumps). Results suggest that without including a focus on private as well as municipal systems it will be very difficult to meet the existing national public health goal to eliminate elevated blood lead levels in children.

An evaluation of the readability of drinking water quality reports: a national assessment

Siddhartha, R., et al., Journal of Water and Health, 13(3):645-653, September 2015

The United States Environmental Protection Agency mandates that community water systems (water utilities) provide annual consumer confidence reports (CCRs), or water quality reports, to their consumers. These reports encapsulate information on sources of water, detected contaminants, regulatory compliance, and educational material. They have excellent potential for providing the public with accurate information on the safety of tap water, but there is little research on the degree to which the information can be understood by a large proportion of the population. This study evaluated the readability of a nationally representative sample of 30 CCRs released between 2011 and 2013. Readability (or 'comprehension difficulty') was evaluated using Flesch-Kincaid readability tests. The analysis revealed that CCRs were written at the 11th-14th grade level, well above the recommended 6th-7th grade level for public health communications; CCR readability was found to be equivalent to that of the Harvard Law Review. These findings expose a wide gap between current water quality reports and the goal of being understandable to US residents. Suggestions are offered for reorienting the language and scientific information in CCRs to be easily comprehensible to the public.
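
For reference, the Flesch-Kincaid grade level named above is a simple formula over words, sentences, and syllables. The sketch below uses the published formula with a crude vowel-group syllable heuristic; readability studies use more careful tokenization, so outputs will differ slightly:

```python
# Flesch-Kincaid grade level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
import re

def syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels, minimum 1.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

print(fk_grade("The annual report describes regulated contaminant "
               "concentrations detected in your drinking water system."))
```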

National Cost Implications of Potential Long-Term LCR Requirement

Slabaugh, R.M., et al., American Water Works Association Journal, 107(8):389-400, August 2015

Concerns that the current Lead and Copper Rule (LCR) may not adequately protect public health have prompted the US Environmental Protection Agency (USEPA) to consider restructuring existing monitoring requirements by targeting a redefined pool of high-risk sites or altering the sampling protocol. Analysis of historical lead and copper monitoring data from 18 public water systems (PWSs) verified that a significant percentage of PWSs with lead service lines are likely to be affected by potential Long-Term Lead and Copper Rule (LT-LCR) revisions. Data were used to facilitate a national cost-of-compliance estimate for additional implementation of corrosion control treatment (CCT) necessary to comply with the LT-LCR and potential unintended consequences associated with those treatment changes. Cost estimates presented here can be used by USEPA to shape the upcoming rule and also by PWSs to assess potential costs associated with optimizing CCT for LT-LCR compliance.

Presence of antibiotic resistant bacteria and antibiotic resistance genes in raw source water and treated drinking water

Bergeron, S., et al., International Biodeterioration & Biodegradation, 102:370-374, August 2015

Antibiotic resistance is becoming a major problem throughout the world, and the spread of antibiotic resistant bacteria (ARB) and antibiotic resistance genes (ARGs) in the environment is a major public health issue. Aquatic ecosystems are a significant source of ARB and ARGs. Drinking water treatment systems are designed specifically to eliminate bacteria and pathogens from drinking water, so the presence of ARB and ARGs in source water and drinking water may affect public health and is an emerging issue for the drinking water industry. This study therefore examined the presence of ARB and ARGs in source water, treated drinking water (finished water), and the distribution line (tap water) at a rural water treatment plant in Louisiana. The results showed the presence of several ARB in the source water, including Enterobacter cloacae, Klebsiella pneumoniae, Escherichia coli, Pseudomonas, Enterococcus, Staphylococcus and Bacillus spp. However, the treatment plant effectively removed these bacteria: none were found in the finished water at the plant or in the tap water. Bacterial DNA, including 16S rRNA and ARGs for sulfonamide and tetracycline antibiotics, was observed in raw water. 16S rRNA was found consistently in every month of sampling in raw water, finished water, and tap water, suggesting that the plant's filtration system was ineffective at removing small fragments of bacterial DNA. Biofilms may also be present in the water pipeline, and these may develop antibiotic resistance under the selective pressure of chlorination in drinking water.

Surveillance of perchlorate in ground water, surface water and bottled water in Kerala, India

Nadaraja, A.V., et al., Journal of Environmental Health Science and Engineering, July 2015

Perchlorate is an emerging water contaminant that disrupts normal functioning of the human thyroid gland and poses a serious threat to health, especially for pregnant women, fetuses and children. This study reports high levels of perchlorate contamination in freshwater sources near places where ammonium perchlorate (rocket fuel) is handled in bulk. Of 160 groundwater samples analyzed from 27 locations in the state of Kerala, 58% had perchlorate above the detection limit (2 μg/L); the highest concentration observed, 7270 μg/L at Ernakulam district, is ~480 times the USEPA drinking water equivalent level (15 μg/L). Perchlorate was detected in all surface water samples analyzed (n = 10), with the highest value, 355 μg/L, in the Periyar river (a major river in the state). The bottled drinking waters tested (n = 5) were free of perchlorate. The present study underlines the need for frequent screening of water sources for perchlorate contamination around places where the chemical is handled in bulk, to help avoid human exposure to high levels of perchlorate.

Waterborne outbreaks in the Nordic countries, 1998 to 2012

Guzman-Herrador, B., et al., Eurosurveillance, 20(24), June 2015

A total of 175 waterborne outbreaks affecting 85,995 individuals were notified to the national outbreak surveillance systems in Denmark, Finland and Norway from 1998 to 2012, and in Sweden from 1998 to 2011. Between 4 and 18 outbreaks were reported each year during this period. Outbreaks occurred throughout the countries in all seasons, but were most common (75/169, 44%) between June and August. Viruses belonging to the Caliciviridae family and Campylobacter were the pathogens most frequently involved, accounting for 51 (41%) and 36 (29%), respectively, of the 123 outbreaks with known aetiology. Although only a few outbreaks were caused by parasites (Giardia and/or Cryptosporidium), these were the largest outbreaks reported during the study period, affecting up to 53,000 persons. Most outbreaks with a known water source (124/163, 76%) were linked to groundwater. A large proportion of outbreaks (130/170, 76%) affected fewer than 100 people each and were linked to single-household water supplies. However, in 11 (6%) of the outbreaks, more than 1,000 people became ill. Although outbreaks of this size are rare, they highlight the need for increased awareness, particularly of parasites, correct water treatment regimens, and vigilant management and maintenance of water supply and distribution systems.

Microbial Health Risks of Regulated Drinking Waters in the United States — A Comparative Microbial Safety Assessment of Public Water Supplies and Bottled Water

Edberg, S.C., Topics in Public Health, June 2015

The quality of drinking water in the United States (U.S.) is extensively monitored and regulated by federal, state and local agencies, yet there is increasing public concern and confusion about the safety and quality of drinking water, both from public water systems and from bottled water products. In the U.S., tap water and bottled water are regulated by two different agencies: the Environmental Protection Agency (EPA) regulates public water system (tap) water and the Food and Drug Administration (FDA) regulates bottled water. Federal law requires that the FDA's regulations for bottled water be at least as protective of public health as EPA standards for tap water.

Performance Evaluation of an Italian Reference Method, the ISO Reference Method and a Chromogenic Rapid Method for the Detection of E. coli and Coliforms in Bottled Water

Di Pasquale, S. and De Medici, D., Food Analytical Methods, 8(10):2417-2426, April 2015

Bottled water can be contaminated by coliforms and/or Escherichia coli (E. coli). These bacteria are considered indicators of faecal pollution, and their detection in bottled water indicates potential contamination by pathogenic enteric microorganisms. In recent decades, different methods have been developed for the detection of coliforms and E. coli in drinking water and in bottled water, including mineral water. Since 1976, Italian regulation has defined the microbiological methods used to evaluate the microbiological characteristics of mineral waters. Three methods for the detection of coliforms and E. coli in bottled water were compared in this study: the Italian reference method, according to the "Italian Ministerial Rule"; the ISO 9308-1:2002 method; and a new rapid chromogenic method. The results demonstrated that the ISO 9308-1:2002 method and the new rapid method are as sensitive and specific as the Italian reference method, and that both could be used to evaluate contamination by coliforms and E. coli in drinking water and in bottled water, including mineral water.

Microbial diversity and dynamics of a groundwater and a still bottled natural mineral water

Franca, L., et al., Environmental Microbiology, 17.3:577-593, March 2015

The microbial abundance and diversity at source, after bottling and through 6 months of storage of a commercial still natural mineral water were assessed by culture-dependent and culture-independent methods. The results revealed clear shifts of the dominant communities present in the three different stages. The borehole waters displayed low cell densities that increased 1.5-fold upon bottling and storage, reaching a maximum (6.2 × 10⁸ cells L⁻¹) within 15 days after bottling, but experienced a significant decrease in diversity. In all cases, communities were largely dominated by Bacteria. The culturable heterotrophic community was characterized by recovering 3626 isolates, which were primarily affiliated with the Alphaproteobacteria, Betaproteobacteria and Gammaproteobacteria. This study indicates that bottling and storage induce quantitative and qualitative changes in the microbial assemblages, and these changes appeared similar across the two sample batches collected in two consecutive years. To our knowledge, this is the first study to combine culture-independent and culture-dependent methods with repeated tests to reveal the microbial dynamics occurring from source to stored bottled water.

Molecular detection of Helicobacter pylori in a large Mediterranean river, by direct viable count fluorescent in situ hybridization (DVC-FISH)

Tirodimos, L., et al., Journal of Water and Health, 12.4:868-873, December 2014

Although the precise route and mode of transmission of Helicobacter pylori are still unclear, molecular methods have been applied for the detection of H. pylori in environmental samples. In this study, we used the direct viable count fluorescent in situ hybridization (DVC-FISH) method to detect viable cells of H. pylori in the River Aliakmon, Greece. This is the longest river in Greece, and it provides potable water to metropolitan areas. H. pylori was detected in 23 of 48 water samples (47.9%); no seasonal variation was found, and no correlation was observed between the presence of H. pylori and indicators of fecal contamination. Our findings strengthen the evidence that H. pylori is waterborne, and its presence adds to the potential health hazards of the River Aliakmon.

Chromium in drinking water: association with biomarkers of exposure and effect

Sazakli, E., et al., International Journal of Environmental Research and Public Health, 11.10:10125-10145, October 2014

An epidemiological cross-sectional study was conducted in Greece to investigate health outcomes associated with long-term exposure to chromium via drinking water. The study population consisted of 304 participants. Socio-demographics, lifestyle, drinking water intake, dietary habits, occupational and medical history data were recorded through a personal interview. Physical examination and a motor test were carried out on the individuals. Total chromium concentrations were measured in blood and hair of the study subjects. Hematological, biochemical and inflammatory parameters were determined in blood. Chromium in drinking water ranged from <0.5 to 90 μg·L⁻¹ in all samples but one (220 μg·L⁻¹), with a median concentration of 21.2 μg·L⁻¹. Chromium levels in blood (median 0.32 μg·L⁻¹, range <0.18-0.92 μg·L⁻¹) and hair (median 0.22 μg·g⁻¹, range 0.03-1.26 μg·g⁻¹) were found within the “normal range” according to the literature. Personal lifetime chromium exposure dose via drinking water, calculated from the results of the water analyses and the questionnaire data, showed associations with blood and hair chromium levels and certain hematological and biochemical parameters. Groups of subjects whose hematological or biochemical parameters were outside the normal range showed no correlation with chromium exposure dose, except for the groups with high triglycerides or low sodium. Motor impairment score was not associated with exposure to chromium.
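
The “personal lifetime chromium exposure dose” referred to above is typically computed as a lifetime average daily dose (LADD); the study’s exact formula is not given in the abstract, so the sketch below uses the standard form with hypothetical intake, duration and body-weight values (only the 21.2 μg·L⁻¹ median concentration comes from the abstract).

# Standard lifetime average daily dose (LADD) via drinking water:
# LADD (ug/kg/day) = C (ug/L) x IR (L/day) x ED (y) / (BW (kg) x AT (y))
# All inputs except the median concentration are hypothetical.

def lifetime_avg_daily_dose(conc_ug_l, intake_l_day, exposure_years,
                            body_weight_kg, averaging_years):
    return (conc_ug_l * intake_l_day * exposure_years
            / (body_weight_kg * averaging_years))

# 21.2 ug/L for 30 years at 2 L/day, 70 kg adult, averaged over 70 years:
print(f"{lifetime_avg_daily_dose(21.2, 2.0, 30, 70, 70):.2f} ug/kg/day")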

Obese children and adolescents need increased gastric volumes in order to perceive satiety

Mack, I., et al., Obesity, 22.10:2123-2125, October 2014

In order to develop effective weight management strategies, it is important to identify factors that influence energy intake. Portion size has been discussed as one such factor. To date, most studies focusing on the relationship between portion size, energy intake, and weight have analyzed questionnaire data and 24-h records. In this study, we assessed the onset of satiety using the water-load test in normal-weight and obese children and adolescents. Sixty obese and 27 normal-weight children and adolescents aged between 9 and 17 years participated in the water-load test, which involved drinking water for 3 min or until feeling full. The amount of water consumed was recorded. It was found that obese children and adolescents drank 20% more water until the onset of satiety when compared with normal-weight participants (478 ± 222 ml vs. 385 ± 115 ml, P < 0.05). Thus, it was concluded that obese children and adolescents need to ingest greater volumes to feel full, which may predispose them toward the consumption of larger portion sizes. This may easily lead to overeating if predominantly energy-dense foods are consumed. A reduction in energy-dense foods in the diet of obese children and adolescents appears to be a necessary strategy for managing body weight.

Naegleria fowleri: An emerging drinking water pathogen

Bartrand, T., et al., American Water Works Association Journal, 106.10:418-432, October 2014

Naegleria fowleri (N. fowleri) is a free-living, trophic amoeba that is nearly ubiquitous in the environment and can be present in high numbers in warm waters. It is the causative agent of primary amoebic meningoencephalitis (PAM), a rare but particularly lethal disease with a very low survival rate. Although N. fowleri was isolated from drinking water supplies in Australia in the 1980s, it was not considered a drinking water threat in the United States until recent cases were associated with a groundwater system in Arizona and surface water systems in Louisiana. N. fowleri in drinking water treatment and distribution systems can be managed using disinfectant concentrations typically encountered in well-run plants, although nitrification and attendant low disinfectant residuals may pose a challenge for some systems. The greatest challenge for N. fowleri control is in premise plumbing systems, where conditions are largely outside the control of utilities, residuals might be low or nonexistent, and water temperatures could be high enough to support rapid growth of the amoebae. This article reviews published studies describing the environmental occurrence, survival, pathogenicity, and disinfection of N. fowleri. In addition, this article provides information about this little-known and poorly understood parasite with respect to its occurrence in the environment; how the amoeba amplifies in water systems such that it can cause infection; how N. fowleri has been successfully controlled for decades in water systems through treatment and distribution system management in Australia; and the knowledge gaps and information needed to address N. fowleri as an emerging pathogen in US water supplies.
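
Disinfection adequacy in treatment plants such as those described above is commonly expressed as a “CT” value, the residual disinfectant concentration multiplied by contact time. The sketch below shows the arithmetic only; the target CT is a hypothetical placeholder, since required values depend on the organism, disinfectant and water temperature.

# CT check: residual disinfectant concentration (mg/L) x contact time (min).
# The target below is hypothetical; real targets are organism-specific.

def ct_value(residual_mg_per_l, contact_time_min):
    return residual_mg_per_l * contact_time_min

REQUIRED_CT = 25.0                 # hypothetical target, mg*min/L
ct = ct_value(0.5, 60)             # 0.5 mg/L free chlorine held for 60 min
status = "adequate" if ct >= REQUIRED_CT else "insufficient"
print(f"CT = {ct:.0f} mg*min/L ({status})")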

Emerging Trends in Groundwater Pollution and Quality

Kurwadkar, S., Water Environment Research, 86.10:1677-1691, October 2014

Groundwater pollution due to anthropogenic activities may impact overall groundwater quality. Organic and inorganic pollutants have been routinely detected at unsafe levels in groundwater, rendering this important drinking water resource practically unusable. The vulnerability of groundwater to pollution and its subsequent impacts have been documented in various studies across the globe. Field studies as well as mathematical models have demonstrated increasing levels of pollutants in both shallow and deep aquifer systems. New emerging pollutants such as organic micro-pollutants have also been detected in some industrialized as well as developing countries. Increased vulnerability coupled with ever-growing demand for groundwater may pose a greater threat of pollution due to induced recharge and a lack of environmental safeguards to protect groundwater sources. This review paper documents a comprehensive assessment of groundwater quality impacts from human activities, such as improper management of organic and inorganic waste, and from natural sources. A detailed review of published reports and peer-reviewed journal papers from across the world clearly demonstrates that groundwater quality is declining over time. A proactive approach is needed to prevent adverse human health and ecological consequences from ingestion of contaminated groundwater.

Prenatal drinking-water exposure to tetrachloroethylene and ischemic placental disease: a retrospective cohort study

Carwile, J.L., et al., Environmental Health, September 2014

Prenatal drinking water exposure to tetrachloroethylene (PCE) has been previously related to intrauterine growth restriction and stillbirth. Pathophysiologic and epidemiologic evidence linking these outcomes to certain other pregnancy complications, including placental abruption, preeclampsia, and small-for-gestational-age (SGA) (i.e., ischemic placental diseases), suggests that PCE exposure may also be associated with these events. We examined whether prenatal exposure to PCE-contaminated drinking water was associated with overall or individual ischemic placental diseases. Using a retrospective cohort design, we compared 1,091 PCE-exposed and 1,019 unexposed pregnancies from 1,766 Cape Cod, Massachusetts women. Exposure between 1969 and 1990 was estimated using water distribution system modeling software. Data on birth weight and gestational age were obtained from birth certificates; mothers self-reported pregnancy complications. Of 2,110 eligible pregnancies, 9% (N = 196) were complicated by ≥1 ischemic placental disease. PCE exposure was not associated with overall ischemic placental disease (for PCE ≥ sample median vs. no exposure, risk ratio (RR): 0.90; 95% confidence interval (CI): 0.65, 1.24), preeclampsia (RR: 0.36; 95% CI: 0.12-1.07), or SGA (RR: 0.98; 95% CI: 0.66-1.45). However, pregnancies with PCE exposure ≥ the sample median had 2.38 times the risk of stillbirth ≥27 weeks gestation (95% CI: 1.01, 5.59), and 1.35 times the risk of placental abruption (95% CI: 0.68, 2.67) relative to unexposed pregnancies. We concluded that prenatal PCE exposure was not associated with overall ischemic placental disease, but may increase risk of stillbirth.
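
As a reader’s aid, the risk ratios and confidence intervals quoted above follow standard cohort-study arithmetic; the sketch below uses the usual log-normal approximation, with hypothetical counts rather than the study’s data.

import math

def risk_ratio(cases_exp, n_exp, cases_unexp, n_unexp):
    """Risk ratio with 95% CI from cohort counts (log-normal approximation)."""
    rr = (cases_exp / n_exp) / (cases_unexp / n_unexp)
    se = math.sqrt(1/cases_exp - 1/n_exp + 1/cases_unexp - 1/n_unexp)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical: 100 of 1,091 exposed and 96 of 1,019 unexposed with the outcome.
rr, lo, hi = risk_ratio(100, 1091, 96, 1019)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")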

Evaluation of long-term (1960-2010) groundwater fluoride contamination in Texas

Chaudhuri, S., and Ale, S., Journal of Environmental Quality, 43.4:1404-1416, August 2014

Groundwater quality degradation is a major threat to sustainable development in Texas. The aim of this study was to elucidate spatiotemporal patterns of groundwater fluoride (F) contamination in different water use classes in 16 groundwater management areas in Texas between 1960 and 2010. Groundwater F concentration data were obtained from the Texas Water Development Board and aggregated over a decadal scale. Our results indicate that observations exceeding the drinking water quality threshold of the World Health Organization (1.5 mg F L⁻¹) and the secondary maximum contaminant level (SMCL) (2 mg F L⁻¹) of the USEPA increased from 26 and 19% in the 1960s to 37 and 23%, respectively, in the 2000s. In the 2000s, F observations > SMCL among different water use classes followed the order: irrigation (39%) > domestic (20%) > public supply (17%). Extent and mode of interaction between F and other water quality parameters varied regionally. In western Texas, high F concentrations were prevalent at shallower depths (<50 m) and were positively correlated with bicarbonate (HCO₃⁻) and sulfate anions. In contrast, in southern and southeastern Texas, higher F concentrations occurred at greater depths (>50 m) and were correlated with HCO₃⁻ and chloride anions. A spatial pattern has become apparent, marked by “excess” F in western Texas groundwaters as compared with “inadequate” F contents in the rest of the state. Groundwater F contamination in western Texas was largely influenced by groundwater mixing and evaporative enrichment as compared with water-rock interaction and mineral dissolution in the rest of the state.

Sugar-Sweetened Beverage Consumption Among Adults — 18 States, 2012

Kumar, G.S., et al., CDC’s Morbidity and Mortality Weekly Report, August 2014

Reducing consumption of calories from added sugars is a recommendation of the 2010 Dietary Guidelines for Americans and an objective of Healthy People 2020. Sugar-sweetened beverages (SSB) are major sources of added sugars in the diets of U.S. residents (1). Daily SSB consumption is associated with obesity and other chronic health conditions, including diabetes and cardiovascular disease (2). U.S. adults consumed an estimated average of 151 kcal/day of SSB during 2009–2010, with regular (i.e., nondiet) soda and fruit drinks representing the leading sources of SSB energy intake (3,4). However, there is limited information on state-specific prevalence of SSB consumption. To assess regular soda and fruit drink consumption among adults in 18 states, CDC analyzed data from the 2012 Behavioral Risk Factor Surveillance System (BRFSS). Among the 18 states surveyed, 26.3% of adults consumed regular soda or fruit drinks or both ≥1 time daily. By state, the prevalence ranged from 20.4% to 41.4%. Overall, consumption of regular soda or fruit drinks was most common among persons aged 18‒34 years (24.5% for regular soda and 16.6% for fruit drinks), men (21.0% and 12.3%), non-Hispanic blacks (20.9% and 21.9%), and Hispanics (22.6% and 18.5%). Persons who want to reduce added sugars in their diets can decrease their consumption of foods high in added sugars such as candy, certain dairy and grain desserts, sweetened cereals, regular soda, fruit drinks, sweetened tea and coffee drinks, and other SSBs. States and health departments can collaborate with worksites and other community venues to increase access to water and other healthful beverages.

Water Distribution System Deficiencies and Gastrointestinal Illness: A Systematic Review and Meta-Analysis

Ercumen, A., et al., Environmental Health Perspectives, 122.7:651-660, July 2014

Water distribution systems are vulnerable to performance deficiencies that can cause (re)contamination of treated water and plausibly lead to increased risk of gastrointestinal illness (GII) in consumers. It is well established that large system disruptions in piped water networks can cause GII outbreaks. We hypothesized that routine network problems can also contribute to background levels of waterborne illness and conducted a systematic review and meta-analysis to assess the impact of distribution system deficiencies on endemic GII. We reviewed published studies that compared direct tap water consumption to consumption of tap water re-treated at the point of use (POU) and studies of specific system deficiencies such as breach of physical or hydraulic pipe integrity and lack of disinfectant residual. In settings with network malfunction, consumers of tap water versus POU-treated water had increased GII [incidence density ratio (IDR) = 1.34; 95% CI: 1.00, 1.79]. The subset of nonblinded studies showed a significant association between GII and tap water versus POU-treated water consumption (IDR = 1.52; 95% CI: 1.05, 2.20), but there was no association based on studies that blinded participants to their POU water treatment status (IDR = 0.98; 95% CI: 0.90, 1.08). Among studies focusing on specific network deficiencies, GII was associated with temporary water outages (relative risk = 3.26; 95% CI: 1.48, 7.19) as well as chronic outages in intermittently operated distribution systems (odds ratio = 1.61; 95% CI: 1.26, 2.07). It was concluded that tap water consumption is associated with GII in malfunctioning distribution networks. System deficiencies such as water outages are also associated with increased GII, suggesting a potential health risk for consumers served by piped water networks.
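
The pooled ratio estimates above are the product of meta-analysis. As a rough illustration of how study-level ratios are combined, the sketch below performs fixed-effect inverse-variance pooling on invented values; the actual review may well have used random-effects models, which add a between-study variance term.

import math

def pool_fixed_effect(estimates):
    """Fixed-effect inverse-variance pooling of (ratio, ci_lo, ci_hi) tuples."""
    num = den = 0.0
    for est, lo, hi in estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of the log estimate
        weight = 1.0 / se ** 2
        num += weight * math.log(est)
        den += weight
    pooled, pooled_se = num / den, math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * pooled_se),
            math.exp(pooled + 1.96 * pooled_se))

# Invented study-level incidence density ratios with 95% CIs:
print(pool_fixed_effect([(1.20, 0.90, 1.60), (1.50, 1.05, 2.14), (1.10, 0.80, 1.51)]))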

Ground water contamination with (238)U, (234)U, (235)U, (226)Ra and (210)Pb from past uranium mining: cove wash, Arizona

da Cunha, K.M.D., et al., Environmental Geochemistry and Health, 36.3:477-487, June 2014

The objectives of the study are to present a critical review of the (238)U, (234)U, (235)U, (226)Ra and (210)Pb levels in water samples from the EPA studies (U.S. EPA in Abandoned uranium mines and the Navajo Nation: Red Valley chapter screening assessment report. Region 9 Superfund Program, San Francisco, 2004, Abandoned uranium mines and the Navajo Nation: Northern AUM region screening assessment report. Region 9 Superfund Program, San Francisco, 2006, Health and environmental impacts of uranium contamination, 5-year plan. Region 9 Superfund Program, San Francisco, 2008) and the dose assessment for the population due to ingestion of water containing (238)U and (234)U. The water quality data were taken from Sect. “Data analysis” of the published report, titled Abandoned Uranium Mines Project Arizona, New Mexico, Utah-Navajo Lands 1994-2000, Project Atlas. Total uranium concentration was above the maximum concentration level for drinking water (7.4 × 10⁻¹ Bq/L) in 19% of the water samples, while (238)U and (234)U concentrations were above it in 14% and 17% of the water samples, respectively. (226)Ra and (210)Pb concentrations in water samples were in the range of 3.7 × 10⁻¹ to 5.55 × 10² Bq/L and 1.11 to 4.33 × 10² Bq/L, respectively. For only two samples, the (226)Ra concentrations exceeded the MCL for total Ra for drinking water (0.185 Bq/L). However, the (210)Pb/(226)Ra ratios varied from 0.11 to 47.00, and ratios above 1.00 were observed in 71% of the samples. Secular equilibrium of the natural uranium series was not observed in the data record for most of the water samples. Moreover, the (235)U/(total)U mass ratios ranged from 0.06 to 5.9%, and the natural mass ratio of (235)U to (total)U (0.72%) was observed in only 16% of the water samples; ratios above or below the natural ratio could not be explained based on the data reported by the U.S. EPA. In addition, statistical evaluations showed no correlations among the distribution of the radionuclide concentrations in the majority of the water samples, indicating that more than one source of contamination could contribute to the sampled sources. The effective doses due to ingestion of the minimum uranium concentrations in water samples exceed the average dose from inhalation and ingestion of a regular diet for other populations around the world (1 μSv/year). The maximum doses due to ingestion of (238)U or (234)U were above the international limit for effective dose for members of the public (1 mSv/year), except for inhabitants of two chapters. The highest effective dose was estimated for inhabitants of Cove, and it was almost 20 times the international limit for members of the public. These results indicate that ingestion of water from some of the sampled sources poses health risks.
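
The effective doses discussed above combine activity concentration, annual water intake and a radionuclide-specific ingestion dose coefficient. The sketch below uses the ICRP Publication 72 adult ingestion coefficients for (238)U and (234)U (approximately 4.5 × 10⁻⁸ and 4.9 × 10⁻⁸ Sv/Bq); the 2 L/day intake and the concentrations are illustrative assumptions, not the report’s values.

# Committed effective dose from ingestion:
# dose (Sv/y) = concentration (Bq/L) x annual intake (L/y) x coefficient (Sv/Bq)

ANNUAL_INTAKE_L = 2.0 * 365                  # assumes 2 L of water per day
DOSE_COEFF_SV_PER_BQ = {"U-238": 4.5e-8,     # ICRP 72 adult ingestion values
                        "U-234": 4.9e-8}     # (approximate)

conc_bq_per_l = {"U-238": 0.5, "U-234": 0.6}  # hypothetical concentrations

dose_sv = sum(conc_bq_per_l[n] * ANNUAL_INTAKE_L * DOSE_COEFF_SV_PER_BQ[n]
              for n in conc_bq_per_l)
print(f"{dose_sv * 1000:.3f} mSv/y (public dose limit: 1 mSv/y)")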

Leaching of bisphenol A and F from new and old epoxy coatings: laboratory and field studies

Bruchet, A., et al., Water Science & Technology: Water Supply, 14.3:383-389, June 2014

Laboratory tests were carried out with three types of new epoxy resins to assess the release of bisphenol A and F (BPA and BPF) and potential halogenated phenolic by-products. Tests ran over 6 months in the presence and absence of disinfectants (chlorine and chlorine dioxide) at realistic doses and contact times. None of the three systems exhibited Fickian-type diffusion for BPA. Leaching was quite low for two epoxies, while the third showed a trend of increasing leaching during the first 5 months of immersion. BPA was only observed in the absence of disinfectant, while no BPF was observed under any condition. 2,4,6-trichlorophenol (TCP), a BPA chlorination by-product, was sporadically observed in the chlorinated water during the first months of contact. Following discontinuation of the disinfectants, its release was significantly enhanced in the water previously exposed to chlorination. Laboratory leaching tests also indicated rapid oxidation of epoxies by chlorine and chlorine dioxide. Analysis of 27 epoxy-coated drinking water storage tanks did not reveal any BPA, BPF or TCP. On the other hand, a large-scale examination of about 200 pipe sections rehabilitated with epoxies during the 1990s led to a high frequency of BPA and BPF detection, sometimes with maximum values around 1 μg/L.

Contamination of Groundwater Systems in the US and Canada by Enteric Pathogens, 1990–2013: A Review and Pooled-Analysis

Hynds, P.D., Thomas, M.K., Pintar, K.D.M., PLOS ONE, Vol. 9, Issue 5, e93301, May 2014

A combined review and pooled-analysis approach was used to investigate groundwater contamination in Canada and the US from 1990 to 2013; fifty-five studies met eligibility criteria. Four study types were identified. It was found that study location affects study design, sample rate and studied pathogen category. Approximately 15% (316/2210) of samples from Canadian and US groundwater sources were positive for enteric pathogens, with no difference observed based on system type. Knowledge gaps exist, particularly in exposure assessment for attributing disease to groundwater supplies. Furthermore, there is a lack of consistency in risk factor reporting (local hydrogeology, well type, well use, etc.). The widespread use of fecal indicator organisms in reported studies does not inform the assessment of human health risks associated with groundwater supplies. This review illustrates how groundwater study design and location are critical for subsequent data interpretation and use. Knowledge gaps exist related to data on bacterial, viral and protozoan pathogen prevalence in Canadian and US groundwater systems, as well as a need for standardized approaches for reporting study design and results. Fecal indicators are examined as a surrogate for health risk assessments; caution is advised in their widespread use. Study findings may be useful during suspected waterborne outbreaks linked with a groundwater supply to identify the likely etiological agent and potential transport pathway.

Drinking water biofilm cohesiveness changes under chlorination or hydrodynamic stress

Mathieu, L., et al., Water Research, May 2014

Attempts at removal of drinking water biofilms rely on various preventive and curative strategies such as nutrient reduction in drinking water, disinfection or water flushing, which have demonstrated limited efficiency. The main reason for these failures is the cohesiveness of the biofilm, driven by the physico-chemical properties of its exopolymeric matrix (EPS). Effective cleaning procedures should break up the matrix and/or change the elastic properties of bacterial biofilms. The aim of this study was to evaluate the change in the cohesive strength of two-month-old drinking water biofilms under increasing hydrodynamic shear stress τw (from ∼0.2 to ∼10 Pa) and shock chlorination (applied concentration at T0: 10 mg Cl2/L; 60 min contact time). Biofilm erosion (cell loss per unit surface area) and cohesiveness (changes in the detachment shear stress and cluster volumes measured by atomic force microscopy (AFM)) were studied. When rapidly increasing the hydrodynamic constraint, biofilm removal was found to be dependent on a dual process of erosion and coalescence of the biofilm clusters. Indeed, 56% of the biofilm cells were removed with, concomitantly, a decrease in the number of the 50–300 μm³ clusters and an increase in the number of smaller ones. Moreover, AFM showed strengthening of the biofilm structure, along with a doubling of the number of contact points, NC, per cluster volume unit following the hydrodynamic disturbance. This suggests that the compactness of the biofilm exopolymers increases with hydrodynamic stress. Shock chlorination removed cells (−75%) from the biofilm while reducing the volume of biofilm clusters. Oxidation stress resulted in a decrease in the cohesive strength profile of the remaining drinking water biofilms, linked to a reduction in the number of contact points within the biofilm network structure, in particular for the largest biofilm cluster volumes (>200 μm³). Changes in the cohesive strength of drinking water biofilms subsequent to cleaning/disinfection operations call into question the effectiveness of cleaning-in-place procedures. The combined alternating use of oxidation and shear stress sequences needs to be investigated as it could be an important adjunct to improving biofilm removal/reduction procedures.

Large Outbreak of Cryptosporidium hominis Infection Transmitted through the Public Water Supply, Sweden

Widerström, M., et al., Emerging Infectious Diseases, Vol. 20, No. 4, April 2014

In November 2010, ≈27,000 (≈45%) inhabitants of Östersund, Sweden, were affected by a waterborne outbreak of cryptosporidiosis. The outbreak was characterized by a rapid onset and high attack rate, especially among young and middle-aged persons. Young age, number of infected family members, amount of water consumed daily, and gluten intolerance were identified as risk factors for acquiring cryptosporidiosis. Also, chronic intestinal disease and young age were significantly associated with prolonged diarrhea. Identification of Cryptosporidium hominis subtype IbA10G2 in human and environmental samples and consistently low numbers of oocysts in drinking water confirmed insufficient reduction of parasites by the municipal water treatment plant. The current outbreak shows that use of inadequate microbial barriers at water treatment plants can have serious consequences for public health. This risk can be minimized by optimizing control of raw water quality and employing multiple barriers that remove or inactivate all groups of pathogens.

Microbial Contamination Detection in Water Resources: Interest of Current Optical Methods, Trends and Needs in the Context of Climate Change

Jung, A.V., et al., International Journal of Environmental Research and Public Health, 11(4), 4292-4310, April 2014

Microbial pollution in aquatic environments is one of the crucial issues with regard to the sanitary state of water bodies used for drinking water supply, recreational activities and harvesting seafood, due to potential contamination by pathogenic bacteria, protozoa or viruses. To address this risk, microbial contamination monitoring is usually assessed by turbidity measurements performed at drinking water plants. Some recent studies have shown significant correlations of microbial contamination with the risk of endemic gastroenteritis. However, the relevance of turbidimetry may be limited, since the presence of colloids in water interferes with the nephelometric response. Thus there is a need for a more relevant, simple and fast indicator for detecting microbial contamination in water, especially in the context of climate change and the expected increase in heavy rainfall events. This review focuses, on the one hand, on the sources, fate and behavior of microorganisms in water and the factors influencing pathogens’ presence, transport and mobilization, and on the other hand, on the existing optical methods used for monitoring microbiological risks. Finally, the paper proposes new directions for research.

Added Sugar Intake and Cardiovascular Diseases Mortality Among US Adults

Yang, Q., et al., JAMA Internal Medicine, April 2014

Epidemiologic studies have suggested that higher intake of added sugar is associated with cardiovascular disease (CVD) risk factors. Few prospective studies have examined the association of added sugar intake with CVD mortality. Our objective was to examine time trends of added sugar consumption as percentage of daily calories in the United States and investigate the association of this consumption with CVD mortality. We studied the National Health and Nutrition Examination Survey (NHANES, 1988-1994 [III], 1999-2004, and 2005-2010 [n = 31 147]) for the time trend analysis and NHANES III Linked Mortality cohort (1988-2006 [n = 11 733]), a prospective cohort of a nationally representative sample of US adults, for the association study. We measured cardiovascular disease mortality. We found that among US adults, the adjusted mean percentage of daily calories from added sugar increased from 15.7% (95% CI, 15.0%-16.4%) in 1988-1994 to 16.8% (16.0%-17.7%; P = .02) in 1999-2004 and decreased to 14.9% (14.2%-15.5%; P < .001) in 2005-2010. Most adults consumed 10% or more of calories from added sugar (71.4%) and approximately 10% consumed 25% or more in 2005-2010. During a median follow-up period of 14.6 years, we documented 831 CVD deaths during 163 039 person-years. Age-, sex-, and race/ethnicity–adjusted hazard ratios (HRs) of CVD mortality across quintiles of the percentage of daily calories consumed from added sugar were 1.00 (reference), 1.09 (95% CI, 1.05-1.13), 1.23 (1.12-1.34), 1.49 (1.24-1.78), and 2.43 (1.63-3.62; P < .001), respectively. After additional adjustment for sociodemographic, behavioral, and clinical characteristics, HRs were 1.00 (reference), 1.07 (1.02-1.12), 1.18 (1.06-1.31), 1.38 (1.11-1.70), and 2.03 (1.26-3.27; P = .004), respectively. Adjusted HRs were 1.30 (95% CI, 1.09-1.55) and 2.75 (1.40-5.42; P = .004), respectively, comparing participants who consumed 10.0% to 24.9% or 25.0% or more calories from added sugar with those who consumed less than 10.0% of calories from added sugar. These findings were largely consistent across age group, sex, race/ethnicity (except among non-Hispanic blacks), educational attainment, physical activity, healthy eating index, and body mass index. We concluded that most US adults consume more added sugar than is recommended for a healthy diet. We observed a significant relationship between added sugar consumption and increased risk for CVD mortality.
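
To translate the percentage-of-calories figures above into everyday quantities, the short calculation below assumes a 2,000 kcal reference diet and the standard 4 kcal per gram of sugar; both assumptions are illustrative.

DAILY_KCAL = 2000                 # assumed reference diet
KCAL_PER_GRAM_SUGAR = 4           # standard energy value of carbohydrate

for share in (0.10, 0.15, 0.25):  # 10%, 15% and 25% of daily calories
    kcal = share * DAILY_KCAL
    grams = kcal / KCAL_PER_GRAM_SUGAR
    print(f"{share:.0%} of calories = {kcal:.0f} kcal = {grams:.0f} g added sugar/day")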

Opportunistic pathogens in roof-captured rainwater samples, determined using quantitative PCR

Ahmed, W., Water Research, April 2014

In this study, quantitative PCR (qPCR) was used for the detection of four opportunistic bacterial pathogens in water samples collected from 72 rainwater tanks in Southeast Queensland, Australia. Tank water samples were also tested for fecal indicator bacteria (Escherichia coli and Enterococcus spp.) using culture-based methods. Among the 72 tank water samples tested, 74% and 94% of samples contained E. coli and Enterococcus spp., respectively, and the numbers of E. coli and Enterococcus spp. in tank water samples ranged from 0.3 to 3.7 log₁₀ colony forming units (CFU) per 100 mL of water. In all, 29%, 15%, 13%, and 6% of tank water samples contained Aeromonas hydrophila, Staphylococcus aureus, Pseudomonas aeruginosa and Legionella pneumophila, respectively. The genomic units (GU) of opportunistic pathogens in tank water samples ranged from 1.5 to 4.6 log₁₀ GU per 100 mL of water. A significant correlation was found between E. coli and Enterococcus spp. numbers in pooled tank water samples data (Spearman’s rs = 0.50; P < 0.001). In contrast, fecal indicator bacteria numbers did not correlate with the presence/absence of opportunistic pathogens tested in this study. Based on the results of this study, it would be prudent to undertake a Quantitative Microbial Risk Assessment (QMRA) analysis of opportunistic pathogens to determine associated health risks for potable and nonpotable uses of tank water.
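
The log₁₀ concentration scale and the Spearman rank correlation reported above are both easy to reproduce; the sketch below applies them to invented paired counts (it assumes SciPy is available).

import math
from scipy.stats import spearmanr

# Invented paired indicator counts (CFU per 100 mL) from ten tanks:
e_coli      = [2, 15, 40, 300, 8, 120, 55, 900, 30, 5]
enterococci = [5, 10, 80, 200, 20, 60, 150, 700, 25, 9]

log10_e_coli = [math.log10(c) for c in e_coli]  # e.g., 2 CFU -> 0.3 log10 units
rs, p = spearmanr(e_coli, enterococci)          # rank-based, so logs are optional
print(f"Spearman rs = {rs:.2f}, P = {p:.4f}")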

Assessing Exposure and Health Consequences of Chemicals in Drinking Water: Current State of Knowledge and Research Needs

Villanueva, C.M., et al., Environmental Health Perspectives, 122.3:213-221, March 2014

Safe drinking water is essential for well-being. Although microbiological contamination remains the largest cause of water-related morbidity and mortality globally, chemicals in water supplies may also cause disease, and evidence of the human health consequences is limited or lacking for many of them. We aimed to summarize the state of knowledge, identify gaps in understanding, and provide recommendations for epidemiological research relating to chemicals occurring in drinking water. Assessing exposure and the health consequences of chemicals in drinking water is challenging. Exposures are typically at low concentrations, measurements in water are frequently insufficient, chemicals are present in mixtures, exposure periods are usually long, multiple exposure routes may be involved, and valid biomarkers reflecting the relevant exposure period are scarce. In addition, the magnitude of the relative risks tends to be small. Research should include well-designed epidemiological studies covering regions with contrasting contaminant levels and sufficient sample size; comprehensive evaluation of contaminant occurrence in combination with bioassays integrating the effect of complex mixtures; sufficient numbers of measurements in water to evaluate geographical and temporal variability; detailed information on personal habits resulting in exposure (e.g., ingestion, showering, swimming, diet); collection of biological samples to measure relevant biomarkers; and advanced statistical models to estimate exposure and relative risks, considering methods to address measurement error. Last, the incorporation of molecular markers of early biological effects and genetic susceptibility is essential to understand the mechanisms of action. There is a particular knowledge gap and need to evaluate human exposure and the risks of a wide range of emerging contaminants.

Spatial analysis of boil water advisories issued during an extreme weather event in the Hudson River Watershed, USA

Vedachalam, S., et al., Applied Geography, 48:112-121, March 2014

Water infrastructure in the United States is aging and vulnerable to extreme weather. In August 2011, Tropical Storm Irene hit the eastern part of New York and surrounding states, causing great damage to public drinking water systems. Several water supply districts issued boil water advisories (BWAs) to their customers as a result of the storm. This study seeks to identify the major factors that lead water supply systems to issue BWAs by assessing watershed characteristics, water supply system characteristics and treatment plant parameters of water districts in the Mohawk-Hudson River watershed in New York. A logistic regression model suggests that the probability of a BWA being issued by a water supply district is increased by higher precipitation during the storm, high density of septic systems, lack of recent maintenance and low population density. Interviews with water treatment plant operators suggested that physical damage to water distribution systems was the main cause of boil water advisories during storms. BWAs result in additional costs to residents and communities, and public compliance with advisory instructions is low, so efforts must be made to minimize their occurrence. Prior investments in infrastructure management can proactively address municipal water supply and quality issues.
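
A minimal sketch of the kind of logistic model described above, fit with scikit-learn on invented district-level data; the predictor names follow the abstract, but every value is hypothetical.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: storm precipitation (cm), septic-system density (per km2),
# years since last maintenance, population density (per km2).
# Outcome: 1 = boil water advisory issued. All values invented.
X = np.array([
    [18.0, 45.0, 12.0,  90.0],
    [ 9.0, 10.0,  2.0, 800.0],
    [22.0, 60.0, 15.0,  60.0],
    [ 7.5,  8.0,  1.0, 950.0],
    [16.0, 30.0,  9.0, 120.0],
    [10.0, 12.0,  3.0, 700.0],
])
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.coef_)                    # coefficient signs give effect direction
print(model.predict_proba(X)[:, 1])   # fitted advisory probabilities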

Epidemiology and estimated costs of a large waterborne outbreak of norovirus infection in Sweden

Larsson, C., et al., Epidemiology and Infection, 142(3):592-600, March 2014

A large outbreak of norovirus (NoV) gastroenteritis caused by contaminated municipal drinking water occurred in Lilla Edet, Sweden, in 2008. Epidemiological investigations performed using a questionnaire survey showed an association between consumption of municipal drinking water and illness (odds ratio 4.73, 95% confidence interval 3.53-6.32), and a strong correlation between the risk of being sick and the number of glasses of municipal water consumed. Diverse NoV strains were detected in stool samples from patients, with NoV genotype I strains predominating. Although NoVs were not detected in water samples, coliphages were identified as a marker of viral contamination. About 2,400 (18.5%) of the 13,000 inhabitants in Lilla Edet became ill. Costs associated with the outbreak were collected via a questionnaire survey given to organizations and municipalities involved in or affected by the outbreak. Total costs, including sick leave, were estimated to be ∼8,700,000 Swedish kronor (∼€0.87 million).
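
The odds ratio quoted above comes from a standard two-by-two tabulation of questionnaire responses; the sketch below shows the arithmetic with hypothetical counts chosen only to give a similar magnitude.

import math

def odds_ratio(ill_exp, well_exp, ill_unexp, well_unexp):
    """Odds ratio with 95% CI from a 2x2 table (log-normal approximation)."""
    or_ = (ill_exp * well_unexp) / (well_exp * ill_unexp)
    se = math.sqrt(1/ill_exp + 1/well_exp + 1/ill_unexp + 1/well_unexp)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: ill/well among municipal-water drinkers and non-drinkers.
or_, lo, hi = odds_ratio(450, 600, 120, 760)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")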

Methyl Tertiary Butyl Ether (MTBE) and Other Volatile Organic Compounds (VOCs) in Public Water Systems, Private Wells, and Ambient Groundwater Wells in New Jersey Compared to Regulatory and Human-Health Benchmarks

Williams, P.R.D., Environmental Forensics, Volume 15, Issue 1, February 2014

Potential threats to drinking water and water quality continue to be a major concern in many regions of the United States. New Jersey, in particular, has been at the forefront of assessing and managing potential contamination of its drinking water supplies from hazardous substances. The purpose of the current analysis is to provide an up-to-date evaluation of the occurrence and detected concentrations of methyl tertiary butyl ether (MTBE) and several other volatile organic compounds (VOCs) in public water systems, private wells, and ambient groundwater wells in New Jersey based on the best available data, and to put these results into context with federal and state regulatory and human-health benchmarks. Analyses are based on the following three databases that contain water quality monitoring data for New Jersey: Safe Drinking Water Information System (SDWIS), Private Well Testing Act (PWTA), and National Water Information System (NWIS). For public water systems served by groundwater in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 30 (2%), 21 (1.4%), and five (0.3%) of sampled systems from 1997 to 2011, respectively. For private wells in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 385 (0.5%), 183 (0.2%), and 46 (0.05%) of sampled wells from 2001 to 2011, respectively. For ambient groundwater wells in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 14 (2.1%), 9 (1.3%), and 4 (0.6%) of sampled wells from 1993 to 2012, respectively. Average detected concentrations of MTBE, as well as detected concentrations at upper-end percentiles, were less than corresponding benchmarks for all three datasets. The available data show that MTBE is rarely detected in various source waters in New Jersey at a concentration that exceeds the State’s health-based drinking water standard or other published benchmarks, and there is no evidence of an increasing trend in the detection frequency of MTBE. Other VOCs, such as tetrachloroethylene (PCE), trichloroethylene (TCE), and benzene, are detected more often above corresponding regulatory or human-health benchmarks due to their higher detected concentrations in water and/or greater toxicity values. The current analysis provides useful data for evaluating the nature and extent of historical and current contamination of water supplies in New Jersey and potential opportunities for public exposures and health risks due to MTBE and other VOCs on a statewide basis. Additional forensic or forecasting analyses are required to identify the sources or timing of releases of individual contaminants at specific locations or to predict potential future water contamination in New Jersey.
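
Detection-frequency summaries like those above reduce to counting sampled systems whose maximum result meets each benchmark; a small sketch on invented concentrations:

# Fraction of sampled systems with at least one result >= each benchmark.
# Values (ug/L) are hypothetical per-system maxima, not New Jersey data.
max_by_system = [0.5, 12.0, 3.0, 25.0, 0.2, 75.0, 8.0, 1.0, 15.0, 0.4]

for threshold in (10, 20, 70):
    n = sum(1 for c in max_by_system if c >= threshold)
    pct = 100 * n / len(max_by_system)
    print(f">={threshold} ug/L: {n} of {len(max_by_system)} systems ({pct:.0f}%)")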

Widespread Molecular Detection of Legionella pneumophila Serogroup 1 in Cold Water Taps across the United States

Donohue, M.J., Environmental Science and Technology, 48 (6), pp 3145–3152, February 2014

In the United States, 6,868 cases of legionellosis were reported to the Centers for Disease Control and Prevention in 2009–2010. Of these reports, it is estimated that 84% are caused by the microorganism Legionella pneumophila Serogroup (Sg) 1. Legionella spp. have been isolated and recovered from a variety of natural freshwater environments. Human exposure to L. pneumophila Sg1 may occur from aerosolization and subsequent inhalation of household and facility water. In this study, two primer/probe sets (one able to detect L. pneumophila and the other L. pneumophila Sg1) were determined to be highly sensitive and selective for their respective targets. In total, 272 water samples, collected in 2009 and 2010 from 68 public and private water taps across the United States, were analyzed using the two qPCR assays to evaluate the incidence of L. pneumophila Sg1. Nearly half of the taps showed the presence of L. pneumophila Sg1 in one sampling event, and 16% of taps were positive in more than one sampling event. This study is the first United States survey to document the occurrence and colonization of L. pneumophila Sg1 in cold water delivered from point of use taps.
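
qPCR surveys of this kind quantify targets by mapping the quantification cycle (Cq) onto a log-linear standard curve. The sketch below shows the usual conversion; the slope and intercept are hypothetical, not the study’s calibration.

# Standard curve: Cq = slope * log10(GU) + intercept, so
# GU = 10 ** ((Cq - intercept) / slope). Parameters are hypothetical.

SLOPE, INTERCEPT = -3.32, 38.0   # a slope of -3.32 implies ~100% PCR efficiency

def genomic_units(cq):
    return 10 ** ((cq - INTERCEPT) / SLOPE)

print(f"Cq 31.4 -> {genomic_units(31.4):.0f} genomic units per reaction")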

Perspectives on drinking water monitoring for small scale water systems

Roig, B., Baures, E., Thomas, O., Water Science & Technology: Water Supply, Vol. 14, Issue 1, p. 1, January 2014

Drinking water (DW) is increasingly subject to environmental and human threats that alter the quality of the resource and, potentially, of the distributed water. These threats can be both biological and chemical in nature, and often act in combination. The expanding technical framework for water quality monitoring, which has followed the evolution of water quality standards, generally guarantees regulatory compliance but is not sufficient for assessing the performance of small scale water systems. The existing monitoring is not well suited to ensuring good quality of distributed water, especially in the event of a sudden change in quality. This article aims to propose alternative solutions, drawn from an examination of monitoring practices, in a bid to limit the risk of deterioration of DW quality.

Drinking Water Microbial Myths

Martin, J.A., et al., Critical Reviews in Microbiology, November 2013

Accounts of drinking water-borne disease outbreaks have always captured the interest of the public, elected and health officials, and the media. During the twentieth century, the drinking water community and public health organizations have endeavored to craft regulations and guidelines on treatment and management practices that reduce risks from drinking water, specifically human pathogens. During this period, misunderstandings also evolved about the potential health risks associated with microorganisms that may be present in drinking water. These misunderstandings, or “myths,” have led to confusion among the many stakeholders. The purpose of this article is to provide a scientifically and clinically based discussion of these “myths” and recommendations for better ensuring the microbial safety of drinking water and valid public health decisions.

Assessing the impact of chlorinated-solvent sites on metropolitan groundwater resources

Brusseau, M.L. and Narter, M., Ground Water, November 2013

Chlorinated-solvent compounds are among the most common groundwater contaminants in the United States. A majority of the many sites contaminated by chlorinated-solvent compounds are located in metropolitan areas, and most such areas have one or more chlorinated-solvent contaminated sites. Thus, contamination of groundwater by chlorinated-solvent compounds may pose a potential risk to the sustainability of potable water supplies for many metropolitan areas. The impact of chlorinated-solvent sites on metropolitan water resources was assessed for Tucson, Arizona, by comparing the aggregate volume of extracted groundwater for all pump-and-treat systems associated with contaminated sites in the region to the total regional groundwater withdrawal. The analysis revealed that the aggregate volume of groundwater withdrawn for the pump-and-treat systems operating in Tucson, all of which are located at chlorinated-solvent contaminated sites, was 20% of the total groundwater withdrawal in the city for the study period. The treated groundwater was used primarily for direct delivery to local water supply systems or for reinjection as part of the pump-and-treat system. The volume of the treated groundwater used for potable water represented approximately 13% of the total potable water supply sourced from groundwater, and approximately 6% of the total potable water supply. This case study illustrates the significant impact chlorinated-solvent contaminated sites can have on groundwater resources and regional potable water supplies.

Radon-contaminated drinking water from private wells: an environmental health assessment examining a rural Colorado mountain community’s exposure

Cappello, M.A., et al., Journal of Environmental Health, November 2013

In the study discussed in this article, 27 private drinking water wells located in a rural Colorado mountain community were sampled for radon contamination and compared against (a) the U.S. Environmental Protection Agency’s (U.S. EPA’s) proposed maximum contaminant level (MCL), (b) the U.S. EPA proposed alternate maximum contaminant level (AMCL), and (c) the average radon level measured in the local municipal drinking water system. The data from the authors’ study found that 100% of the wells within the study population had radon levels in excess of the U.S. EPA MCL, 37% were in excess of the U.S. EPA AMCL, and 100% of wells had radon levels greater than that found in the local municipal drinking water system. Radon contamination in one well was found to be 715 times greater than the U.S. EPA MCL, 54 times greater than the U.S. EPA AMCL, and 36,983 times greater than that found in the local municipal drinking water system. According to the research data and the reviewed literature, the results indicate that this population has a unique and elevated contamination profile and suggest that radon-contaminated drinking water from private wells can present a significant public health concern.

Water and beverage consumption among adults in the United States: cross-sectional study using data from NHANES 2005–2010

Drewnowski, A., et al., BMC Public Health, November 2013

Few studies have examined plain water consumption among US adults. This study evaluated the consumption of plain water (tap and bottled) and total water among US adults by age group (20-50y, 51-70y, and ≥71y), gender, income-to-poverty ratio, and race/ethnicity. Data from up to two non-consecutive 24-hour recalls from the 2005–2006, 2007–2008 and 2009–2010 National Health and Nutrition Examination Survey (NHANES) were used to evaluate usual intake of water and water as a beverage among 15,702 US adults. The contribution of different beverage types (e.g., water as a beverage [tap or bottled], milk [including flavored], 100% fruit juice, soda/soft drinks [regular and diet], fruit drinks, sports/energy drinks, coffee, tea, and alcoholic beverages) to total water and energy intakes was examined. Total water intakes from plain water, beverages, and food were compared to the Adequate Intake (AI) values from the US Dietary Reference Intakes (DRI). Total water volume per 1,000 kcal was also examined. Water and other beverages contributed 75-84% of dietary water, with 17-25% provided by water in foods, depending on age. Plain water, from tap or bottled sources, contributed 30-37% of total dietary water. Overall, 56% of drinking water volume was from tap water while bottled water provided 44%. Older adults (≥71y) consumed much less bottled water than younger adults. Non-Hispanic whites consumed the most tap water, whereas Mexican-Americans consumed the most bottled water. Plain water consumption (bottled and tap) tended to be associated with higher incomes. On average, younger adults exceeded or came close to satisfying the DRIs for water. Older men and women failed to meet the Institute of Medicine (IOM) AI values, with a shortfall in daily water intakes of 1,218 mL and 603 mL, respectively. Eighty-three percent of women and 95% of men ≥71y failed to meet the IOM AI values for water. However, average water volume per 1,000 kcal was 1.2-1.4 L/1,000 kcal for most population sub-groups, higher than suggested levels of 1.0 L/1,000 kcal. It was concluded that water intakes below IOM-recommended levels may be a cause for concern, especially for older adults.
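
The litres-per-1,000-kcal hydration criterion used above is a simple ratio of total dietary water to energy intake; a minimal sketch with one hypothetical day’s intake:

def water_per_1000_kcal(total_water_ml, energy_kcal):
    """Water volume (L) per 1,000 kcal of energy intake."""
    return (total_water_ml / 1000.0) / (energy_kcal / 1000.0)

# Hypothetical day: 2,600 mL total dietary water and 2,100 kcal.
print(f"{water_per_1000_kcal(2600, 2100):.2f} L/1,000 kcal")  # vs. the 1.0 guide value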

Microbial Health Risks of Regulated Drinking Water in the United States

Edberg, S.C., DWRF, September 2013

Drinking water regulations are designed to protect the public health. In the United States, the Environmental Protection Agency (EPA) is tasked with developing and maintaining drinking water regulations for the 276,607,387 people served by the country’s 54,293 community water systems. The Food and Drug Administration (FDA) regulates bottled water as a food product. By federal law, the FDA’s regulations for bottled water must be at least as protective of public health as the EPA’s regulations for public water system drinking water. Despite many similarities in EPA and FDA regulations, consumer perception regarding the safety of drinking waters varies widely. This paper examines and compares the microbial health risks of tap water and bottled water, specifically examining differences in quality monitoring, regulatory standards violations, advisories, and distribution system conditions. It also includes comparison data on the number of waterborne illness outbreaks caused by both tap and bottled water. Based on a review of existing research, it is clear that as a consequence of the differences in regulations, distribution systems, operating (manufacturing) practices, and microbial standards of quality, public drinking water supplies present a substantially higher human risk than do bottled waters for illness due to waterborne organisms.

The mineral content of tap water in United States households

Patterson, K.Y., et al., Journal of Food Composition and Analysis, August 2013

The composition of tap water contributes to dietary intake of minerals. The Nutrient Data Laboratory (NDL) of the United States Department of Agriculture (USDA) conducted a study of the mineral content of residential tap water, to generate current data for the USDA National Nutrient Database. Sodium, potassium, calcium, magnesium, iron, copper, manganese, phosphorus, and zinc content of drinking water were determined in a nationally representative sampling. The statistically designed sampling method identified 144 locations for water collection in winter and spring from home taps. Assuming a daily consumption of 1 L of tap water, only four minerals (Cu, Ca, Mg, and Na), on average, provided more than 1% of the US dietary reference intake. Significant decreases in calcium were observed with chemical water softeners, and between seasonal pickups for Mg and Ca. The variance of sodium was significantly different among regions (p < 0.05) but no differences were observed as a result of collection time, water source or treatment. Based on the weighted mixed model results, there were no significant differences in overall mineral content between municipal and well water. These results, which constitute a nationally representative dataset of mineral values for drinking water available from home taps, provide valuable additional information for assessment of dietary mineral intake.
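
The percent-of-reference-intake framing follows directly from concentration, volume consumed and the DRI. In the sketch below, the tap-water concentrations are invented and the DRI values are approximate adult figures, so the output is illustrative only.

# % of dietary reference intake from 1 L of tap water:
# percent = concentration (mg/L) x 1 L / DRI (mg) x 100

dri_mg = {"Ca": 1000, "Mg": 420, "Na": 1500, "Cu": 0.9}           # approximate adult DRIs
water_mg_per_l = {"Ca": 30.0, "Mg": 8.0, "Na": 25.0, "Cu": 0.05}  # invented concentrations

for mineral, conc in water_mg_per_l.items():
    pct = conc * 1.0 / dri_mg[mineral] * 100
    print(f"{mineral}: {pct:.1f}% of DRI from 1 L")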

Quantitative analysis of microbial contamination in private drinking water supply systems

Allevi, R.P., et al., Journal of Water and Health, June 2013

Over one million households rely on private water supplies (e.g. well, spring, cistern) in the Commonwealth of Virginia, USA. The present study tested 538 private wells and springs in 20 Virginia counties for total coliforms (TCs) and Escherichia coli along with a suite of chemical contaminants. A logistic regression analysis was used to investigate potential correlations between TC contamination and chemical parameters (e.g. NO3(-), turbidity), as well as homeowner-provided survey data describing system characteristics and perceived water quality. Of the 538 samples collected, 41% (n = 221) were positive for TCs and 10% (n = 53) for E. coli. Chemical parameters were not statistically predictive of microbial contamination. Well depth, water treatment, and farm location proximate to the water supply were factors in a regression model that predicted presence/absence of TCs with 74% accuracy. Microbial and chemical source tracking techniques (Bacteroides gene Bac32F and HF183 detection via polymerase chain reaction and optical brightener detection via fluorometry) identified four samples as likely contaminated with human wastewater.

Water and beverage consumption among children age 4-13y in the United States: analyses of 2005–2010 NHANES data

Drewnowski, A., Rehm, C., and Constant, F., Nutrition Journal, June 2013

Few studies have examined water consumption patterns among US children. Additionally, recent data on total water consumption as it relates to the Dietary Reference Intakes (DRI) are lacking. This study evaluated the consumption of plain water (tap and bottled) and other beverages among US children by age group, gender, income-to-poverty ratio, and race/ethnicity. Comparisons were made to DRI values for water consumption from all sources. Data from two non-consecutive 24-hour recalls from 3 cycles of NHANES (2005–2006, 2007–2008 and 2009–2010) were used to assess water and beverage consumption among 4,766 children age 4-13y. Beverages were classified into 9 groups: water (tap and bottled), plain and flavored milk, 100% fruit juice, soda/soft drinks (regular and diet), fruit drinks, sports drinks, coffee, tea, and energy drinks. Total water intakes from plain water, beverages, and food were compared to DRIs for the US. Total water volume per 1,000 kcal was also examined. It was found that water and other beverages contributed 70-75% of dietary water, with 25-30% provided by moisture in foods, depending on age. Plain water, tap and bottled, contributed 25-30% of total dietary water. In general, tap water represented 60% of drinking water volume whereas bottled water represented 40%. Non-Hispanic white children consumed the most tap water, whereas Mexican-American children consumed the most bottled water. Plain water consumption (bottled and tap) tended to be associated with higher incomes. No group of US children came close to satisfying the DRIs for water. At least 75% of children 4-8y, 87% of girls 9-13y, and 85% of boys 9-13y did not meet DRIs for total water intake. Water volume per 1,000 kcal, another criterion of adequate hydration, was 0.85-0.95 L/1,000 kcal, short of the desirable levels of 1.0-1.5 L/1,000 kcal. It was concluded that water intakes at below-recommended levels may be a cause for concern. Data on water and beverage intake for the population and by socio-demographic group provides useful information to target interventions for increasing water intake among children.

Resolved: there is sufficient scientific evidence that decreasing sugar-sweetened beverage consumption will reduce the prevalence of obesity and obesity-related diseases

Hu, F.B., Obesity Reviews, June 2013

Sugar-sweetened beverages (SSBs) are the single largest source of added sugar and the top source of energy intake in the U.S. diet. In this review, we evaluate whether there is sufficient scientific evidence that decreasing SSB consumption will reduce the prevalence of obesity and its related diseases. Because prospective cohort studies address dietary determinants of long-term weight gain and chronic diseases, whereas randomized clinical trials (RCTs) typically evaluate short-term effects of specific interventions on weight change, both types of evidence are critical in evaluating causality. Findings from well-powered prospective cohorts have consistently shown a significant association, established temporality and demonstrated a direct dose–response relationship between SSB consumption and long-term weight gain and risk of type 2 diabetes (T2D). A recently published meta-analysis of RCTs commissioned by the World Health Organization found that decreased intake of added sugars significantly reduced body weight (0.80 kg, 95% confidence interval [CI] 0.39–1.21; P < 0.001), whereas increased sugar intake led to a comparable weight increase (0.75 kg, 0.30–1.19; P = 0.001). A parallel meta-analysis of cohort studies also found that higher intake of SSBs among children was associated with 55% (95% CI 32–82%) higher risk of being overweight or obese compared with those with lower intake. Another meta-analysis of eight prospective cohort studies found that one to two servings per day of SSB intake was associated with a 26% (95% CI 12–41%) greater risk of developing T2D compared with occasional intake (less than one serving per month). Recently, two large RCTs with a high degree of compliance provided convincing data that reducing consumption of SSBs significantly decreases weight gain and adiposity in children and adolescents. Taken together, the evidence that decreasing SSBs will decrease the risk of obesity and related diseases such as T2D is compelling. Several additional issues warrant further discussion. First, prevention of long-term weight gain through dietary changes such as limiting consumption of SSBs is more important than short-term weight loss in reducing the prevalence of obesity in the population. This is due to the fact that once an individual becomes obese, it is difficult to lose weight and keep it off. Second, we should consider the totality of evidence rather than selective pieces of evidence (e.g. from short-term RCTs only). Finally, while recognizing that the evidence of harm on health against SSBs is strong, we should avoid the trap of waiting for absolute proof before allowing public health action to be taken.

Impact of fluid intake in the prevention of urinary system diseases: a brief review

Lotan, et al., Current Opinion in Nephrology and Hypertension, Vol. 22, sup. 1, May 2013

We are often told that we should be drinking more water, but the rationale for this remains unclear and no recommendations currently exist for a healthy fluid intake supported by rigorous scientific evidence. Crucially, the same lack of evidence precludes the claim that a high fluid intake has no clinical benefit. The aim of this study is to describe the mechanisms by which chronic low fluid intake may play a crucial role in the pathologies of four key diseases of the urinary system: urolithiasis, urinary tract infection, chronic kidney disease and bladder cancer. Although primary and secondary intervention studies evaluating the impact of fluid intake are lacking, published data from observational studies appear to suggest that chronic low fluid intake may be an important factor in the pathogenesis of these diseases.

Relation between urinary hydration biomarkers and total fluid intake in healthy adults

Perrier, E., et al., European Journal of Clinical Nutrition,  May 2013

In sedentary adults, hydration is mostly influenced by total fluid intake and not by sweat losses; moreover, low daily fluid intake is associated with adverse health outcomes. This study aimed to model the relation between total fluid intake and urinary hydration biomarkers. During 4 consecutive weekdays, 82 adults (age, 31.6±4.3 years; body mass index, 23.2±2.7 kg/m2; 52% female) recorded food and fluid consumed, collected one first morning urine (FMU) void and three 24-h (24hU) samples. The strength of linear association between urinary hydration biomarkers and fluid intake volume was evaluated using simple linear regression and Pearson’s correlation. Multivariate partial least squares (PLS) modeled the association between fluid intake and 24hU hydration biomarkers. Strong associations (|r| ≥ 0.6; P<0.001) were found between total fluid intake volume and 24hU osmolality, color, specific gravity (USG), volume and solute concentrations. Many 24hU biomarkers were collinear (osmolality versus color: r=0.49–0.76; USG versus color: r=0.46–0.78; osmolality versus USG: r=0.86–0.97; P<0.001). Measures in FMU were not strongly correlated to intake. Multivariate PLS and simple linear regression using urine volume explained >50% of the variance in fluid intake volume (r2=0.59 and 0.52, respectively); however, the error in both models was high and the limits of agreement very large. It was concluded that hydration biomarkers in 24hU are strongly correlated with daily total fluid intake volume in sedentary adults in free-living conditions; however, the margin of error in the present models limits the applicability of estimating fluid intake from urinary biomarkers.
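
For readers who want to see the mechanics of the association step, the following is a minimal sketch of Pearson's correlation and a simple linear regression predicting intake from urine volume, as described above; the synthetic data, slopes, and noise levels are assumptions, not the study's measurements.

```python
# Sketch of the association step: Pearson's r between total fluid intake
# and 24-h urine osmolality, plus a simple linear regression predicting
# intake from 24-h urine volume. All data here are synthetic stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 82
intake_ml = rng.uniform(1000, 3500, n)                        # total fluid intake, mL/day
osmolality = 1400 - 0.3 * intake_ml + rng.normal(0, 100, n)   # 24hU osmolality, mOsm/kg
urine_ml = 0.6 * intake_ml + rng.normal(0, 250, n)            # 24-h urine volume, mL

r, p = stats.pearsonr(intake_ml, osmolality)
print(f"intake vs osmolality: r = {r:.2f}, p = {p:.1e}")

res = stats.linregress(urine_ml, intake_ml)   # predict intake from urine volume
print(f"intake = {res.intercept:.0f} + {res.slope:.2f} * urine volume "
      f"(r^2 = {res.rvalue**2:.2f})")
```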

Strontium Concentrations in Corrosion Products from Residential Drinking Water Distribution Systems

Gerke, et al., Environmental Science and Technology, April 22, 2013.

The United States Environmental Protection Agency (US EPA) will require some U.S. drinking water distribution systems (DWDS) to monitor nonradioactive strontium (Sr2+) in drinking water in 2013. Iron corrosion products from four DWDS were examined to assess the potential for Sr2+ binding and release. Average Sr2+ concentrations in the outermost layer of the corrosion products ranged from 3 to 54 mg kg–1 and the Sr2+ drinking water concentrations were all ≤0.3 mg L–1. Micro-X-ray absorption near edge structure spectroscopy and linear combination fitting determined that Sr2+ was principally associated with CaCO3. Sr2+ was also detected as a surface complex associated with α-FeOOH. Iron particulates deposited on a filter inside a home had an average Sr2+ concentration of 40.3 mg kg–1 and the associated drinking water at a tap was 210 μg L–1. The data suggest that elevated Sr2+ concentrations may be associated with iron corrosion products that, if disturbed, could increase Sr2+ concentrations above the 0.3 μg L–1 US EPA reporting threshold. Dissociation of very small particulates could result in drinking water Sr2+ concentrations that exceed the US EPA health reference limit (4.20 mg kg–1 body weight).

Association between Water Intake, CKD, and Cardiovascular Disease: A Cross-Sectional Analysis of NHANES Data

Sontrop, et al., American Journal of Nephrology, 37:434-442, April 2013

Recent evidence from animal and human studies suggests that a higher water intake may have a protective effect on kidney function and cardiovascular disease. We examined the association between water intake, chronic kidney disease (CKD) and cardiovascular disease in a cross-sectional analysis of the 2005-2006 National Health and Nutrition Examination Survey population. Total water intake from food and beverages was categorized as low (less than 2 litres per day), moderate (2 to 4.3 litres per day) and high (greater than 4.3 litres per day). We examined the associations between low total water intake and CKD and self-reported cardiovascular disease. Key findings: of the 3427 adults (mean age 46 years; mean eGFR 95 ml/min/1.73 m2), 13% had CKD and 18% had cardiovascular disease. CKD was more prevalent among those with the lowest total water intake (less than 2 litres per day) than among those with the highest (greater than 4.3 litres per day) (odds ratio 2.52; 95% confidence interval 0.91-6.96). Once stratified by intake of plain water and other beverages, CKD was associated with a low intake of plain water (odds ratio 2.36; 95% confidence interval 1.10-5.06) but not with other beverages. There was no association between low water intake and cardiovascular disease.
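
As an illustration of how odds ratios like those above are computed, here is a minimal sketch of an unadjusted odds ratio with a Woolf-type 95% confidence interval; the 2x2 counts are invented for the example and are not the NHANES data.

```python
# Sketch of an unadjusted odds-ratio calculation behind comparisons like
# "CKD in low vs. high total water intake". Counts are made up for illustration.
import math

# 2x2 table: rows = exposure (low vs. high water intake), columns = CKD yes/no
a, b = 45, 255    # low intake:  CKD cases, non-cases
c, d = 10, 140    # high intake: CKD cases, non-cases

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)        # Woolf standard error
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```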

Evaluating violations of drinking water regulations

Rubin, S.J., Journal, American Water Works Association, March 2013

US Environmental Protection Agency data were analyzed for violations by community water systems (CWSs). Several characteristics were evaluated, including size, source water, and violation type. The data show that: (1) 55% of CWSs, collectively serving more than 95 million people, violated at least one regulation under the Safe Drinking Water Act; (2) the presence of violations was no different for groundwater and surface water systems; (3) fewer than 20% of CWSs with violations exceeded an allowable level of a contaminant in drinking water; (4) smaller water systems are no more likely than larger systems, except very large systems, to violate health-related requirements; and (5) smaller CWSs appear more likely than larger systems to violate monitoring, reporting, and notification requirements. An evaluation was also conducted of four contaminants that had health-related violations by more than 1% of CWSs: total coliform, stage 1 disinfection by-products, arsenic, and lead and copper.

Lead (Pb) quantification in potable water samples: implications for regulatory compliance and assessment of human exposure

Triantafyllidou, S., et al., Environmental Monitoring and Assessment, February 2013

Assessing the health risk from lead (Pb) in potable water requires accurate quantification of the Pb concentration. Under worst-case scenarios of highly contaminated water samples, representative of public health concerns, up to 71-98% of the total Pb was not quantified if water samples were not mixed thoroughly after standard preservation (i.e., addition of 0.15% (v/v) HNO3). Thorough mixing after standard preservation improved recovery in all samples, but 35-81% of the total Pb was still un-quantified in some samples. Transfer of samples from one bottle to another also created high errors (40-100% of the total Pb was un-quantified in transferred samples). Although the United States Environmental Protection Agency’s standard protocol avoids most of these errors, certain methods considered EPA-equivalent allow these errors for regulatory compliance sampling. Moreover, routine monitoring for assessment of human Pb exposure in the USA has no standardized protocols for water sample handling and pre-treatment. Overall, while there is no reason to believe that sample handling and pre-treatment dramatically skew regulatory compliance with the US Pb action level, slight variations from one approved protocol to another may cause Pb-in-water health risks to be significantly underestimated, especially for unusual situations of “worst case” individual exposure to highly contaminated water.

Blood pressure hyperreactivity: an early cardiovascular risk in normotensive men exposed to low-to-moderate inorganic arsenic in drinking water

Kunrath, J., et al., Journal of Hypertension, February 2013

Essential hypertension is associated with chronic exposure to high levels of inorganic arsenic in drinking water. However, early signs of risk for developing hypertension remain unclear in people exposed to chronic low-to-moderate inorganic arsenic. We evaluated cardiovascular stress reactivity and recovery in healthy, normotensive, middle-aged men living in an arsenic-endemic region of Romania. Unexposed (n = 16) and exposed (n = 19) participants were sampled from communities based on WHO limits for inorganic arsenic in drinking water. Blood pressure hyperreactivity was defined as stress-induced increases in SBP (>20 mmHg) and DBP (>15 mmHg). We found that drinking water inorganic arsenic averaged 40.2 ± 30.4 and 1.0 ± 0.2 μg/l for the exposed and unexposed groups, respectively (P < 0.001). Compared to the unexposed group, the exposed group expressed a greater probability of blood pressure hyperreactivity to both anticipatory stress (47.4 vs. 12.5%; P = 0.035) and cold stress (73.7 vs. 37.5%; P = 0.044). Moreover, the exposed group exhibited attenuated blood pressure recovery from stress and a greater probability of persistent hypertensive responses (47.4 vs. 12.5%; P = 0.035). We concluded that inorganic arsenic exposure increased stress-induced blood pressure hyperreactivity and impaired blood pressure recovery, including persistent hypertensive responses, in otherwise healthy, clinically normotensive men. Drinking water containing even low-to-moderate inorganic arsenic may act as a sympathetic nervous system trigger for hypertension risk.

Water consumption, not expectancies about water consumption, affects cognitive performance in adults

Edmonds, C.J., et al., Appetite, January 2013

Research has shown that water supplementation positively affects cognitive performance in children and adults. The present study considered whether this could be a result of expectancies that individuals have about the effects of water on cognition. Forty-seven participants were recruited and told the study was examining the effects of repeated testing on cognitive performance. They were assigned either to a condition in which positive expectancies about the effects of drinking water were induced, or a control condition in which no expectancies were induced. Within these groups, approximately half were given a drink of water, while the remainder were not. Performance on a thirst scale, letter cancellation, digit span forwards and backwards and a simple reaction time task was assessed at baseline (before the drink) and 20 min and 40 min after water consumption. Effects of water, but not expectancy, were found on subjective thirst ratings and letter cancellation task performance, but not on digit span or reaction time. This suggests that water consumption effects on letter cancellation are due to the physiological effects of water, rather than expectancies about the effects of drinking water.

Changes in water and beverage intake and long-term weight changes: results from three prospective cohort studies

Pan, A., et al., International Journal of Obesity, January 2013

We aimed to examine the long-term relationship between changes in water and beverage intake and weight change. Our subjects were participants in three prospective cohort studies: 50 013 women aged 40–64 years in the Nurses’ Health Study (NHS, 1986–2006), 52 987 women aged 27–44 years in the NHS II (1991–2007) and 21 988 men aged 40–64 years in the Health Professionals Follow-up Study (1986–2006), all without obesity and chronic diseases at baseline. We assessed the association of weight change within each 4-year interval with changes in beverage intakes and other lifestyle behaviors during the same period. Multivariate linear regression with robust variance, accounting for within-person repeated measures, was used to evaluate the association. Results across the three cohorts were pooled by an inverse-variance-weighted meta-analysis. We found participants gained an average of 1.45 kg (5th to 95th percentile: −1.87 to 5.46) within each 4-year period. After controlling for age, baseline body mass index and changes in other lifestyle behaviors (diet, smoking habits, exercise, alcohol, sleep duration, TV watching), each 1 cup per day increment of water intake was inversely associated with weight gain within each 4-year period (−0.13 kg; 95% confidence interval (CI): −0.17 to −0.08). The associations for other beverages were: sugar-sweetened beverages (SSBs) (0.36 kg; 95% CI: 0.24–0.48), fruit juice (0.22 kg; 95% CI: 0.15–0.28), coffee (−0.14 kg; 95% CI: −0.19 to −0.09), tea (−0.03 kg; 95% CI: −0.05 to −0.01), diet beverages (−0.10 kg; 95% CI: −0.14 to −0.06), low-fat milk (0.02 kg; 95% CI: −0.04 to 0.09) and whole milk (0.02 kg; 95% CI: −0.06 to 0.10). We estimated that replacement of 1 serving per day of SSBs by 1 cup per day of water was associated with 0.49 kg (95% CI: 0.32–0.65) less weight gain over each 4-year period, and the replacement estimate of fruit juices by water was 0.35 kg (95% CI: 0.23–0.46). Substitution of SSBs or fruit juices by other beverages (coffee, tea, diet beverages, low-fat and whole milk) was also significantly and inversely associated with weight gain. Our results suggest that increasing water intake in place of SSBs or fruit juices is associated with lower long-term weight gain.
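
The pooling step, inverse-variance-weighted meta-analysis, is compact enough to sketch; the three per-cohort estimates below are placeholders rather than the actual NHS, NHS II, and HPFS results.

```python
# Sketch of inverse-variance-weighted pooling, the method used above to
# combine per-cohort estimates. The (estimate, SE) pairs are illustrative.
import math

# (beta, standard error) for a 1 cup/day increase in water intake, kg per 4 y
cohorts = [(-0.14, 0.03), (-0.12, 0.04), (-0.13, 0.05)]

weights = [1 / se**2 for _, se in cohorts]                 # weight = 1 / variance
pooled = sum(w * b for (b, _), w in zip(cohorts, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
print(f"pooled = {pooled:.2f} kg "
      f"(95% CI {pooled - 1.96*pooled_se:.2f} to {pooled + 1.96*pooled_se:.2f})")
```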

Influence of progressive fluid restriction on mood and physiological markers of dehydration in women

Pross, N., et al., British Journal of Nutrition, 109, 313-321, January 2013

The present study evaluated, using a well-controlled dehydration protocol, the effects of 24 h fluid deprivation (FD) on selected mood and physiological parameters. In the present cross-over study, twenty healthy women (age 25 (SE 0·78) years) participated in two randomised sessions: FD-induced dehydration v. a fully hydrated control condition. In the FD period, the last water intake was between 18.00 and 19.00 hours and no beverages were allowed until 18.00 hours on the next day (23–24 h). Water intake was only permitted at fixed periods during the control condition. Physiological parameters in the urine, blood and saliva (osmolality) as well as mood and sensations (headache and thirst) were compared across the experimental conditions. Safety was monitored throughout the study. The FD protocol was effective as indicated by a significant reduction in urine output. No clinical abnormalities of biological parameters or vital signs were observed, although heart rate was increased by FD. Increased urine specific gravity, darker urine colour and increased thirst were early markers of dehydration. Interestingly, dehydration also induced a significant increase in saliva osmolality at the end of the 24 h FD period but plasma osmolality remained unchanged. The significant effects of FD on mood included decreased alertness and increased sleepiness, fatigue and confusion. The most consistent effects of mild dehydration on mood are on sleep/wake parameters. Urine specific gravity appears to be the best physiological measure of hydration status in subjects with a normal level of activity; saliva osmolality is another reliable and noninvasive method for assessing hydration status.

Hydration, Morbidity, and Mortality in Vulnerable Populations

Maughan, R.J., Nutrition Reviews, 70(2):152-155, November 2012

Both acute and chronic fluid deficits have been shown to be associated with a number of adverse health outcomes. At the extreme, deprivation of water for more than a few days inevitably leads to death, but even modest fluid deficits may precipitate adverse events, especially in young children, in the frail elderly and in those with poor health. Epidemiological studies have shown an association, although not necessarily a causal one, between a low habitual fluid intake and some chronic diseases, including urolithiasis, constipation, asthma, cardiovascular disease, diabetic hyperglycemia, and some cancers. Acute hypohydration may be a precipitating factor in a number of acute medical conditions in elderly persons. Increased mortality, especially in vulnerable populations, is commonly observed during periods of abnormally warm weather, with at least part of this effect due to failure to increase water intake, and this may have some important implications for those responsible for forward planning in healthcare facilities.

The need for congressional action to finance arsenic reductions in drinking water

Levine, R.L., Journal of Environmental Health, November 2012

Many public water systems in the U.S. are unsafe because the communities cannot afford to comply with the current 10 parts per billion (ppb) federal arsenic standard for drinking water. Communities unable to afford improvements remain vulnerable to adverse health effects associated with higher levels of arsenic exposure. Scientific and bipartisan political consensus exists that the arsenic standard should not be less stringent than 10 ppb, and new data suggest additional adverse health effects related to arsenic exposure through drinking water. Congress has failed to reauthorize the Drinking Water State Revolving Fund program to provide reliable funding to promote compliance and reduce the risk of adverse health effects. Congress’s recent ad hoc appropriations do not allow long-term planning and ongoing monitoring and maintenance. Investing in water infrastructure will lower health care costs and create American jobs. Delaying necessary upgrades will only increase the costs of improvements over time.

Direct healthcare costs of selected diseases primarily or partially transmitted by water

Collier, S.A., et al., Epidemiology and Infection, November 2012

Despite US sanitation advancements, millions of waterborne disease cases occur annually, although the precise burden of disease is not well quantified. Estimating the direct healthcare cost of specific infections would be useful in prioritizing waterborne disease prevention activities. Hospitalization and outpatient visit costs per case and total US hospitalization costs for ten waterborne diseases were calculated using large healthcare claims and hospital discharge databases. The five primarily waterborne diseases in this analysis (giardiasis, cryptosporidiosis, Legionnaires’ disease, otitis externa, and non-tuberculous mycobacterial infection) were responsible for over 40 000 hospitalizations at a cost of $970 million per year, including at least $430 million in hospitalization costs for Medicaid and Medicare patients. An additional 50 000 hospitalizations for campylobacteriosis, salmonellosis, shigellosis, haemolytic uraemic syndrome, and toxoplasmosis cost $860 million annually ($390 million in payments for Medicaid and Medicare patients), a portion of which can be assumed to be due to waterborne transmission.

Clues to the Future of the Park Doctrine

Burroughs, A.D., and Rin, D., Food and Drug Law Institute, November/December 2012

This article examines three recent cases brought under the controversial Park doctrine in search of clues to the doctrine’s future. The responsible corporate officer (RCO) doctrine, also known as the Park doctrine, allows for criminal prosecution of individuals, typically high-ranking corporate executives of pharmaceutical companies, for violations of the Food, Drug and Cosmetic Act (FDCA), even absent any proof of the individual defendant’s knowledge of or participation in the violation. It is relevant to drinking water because the FDCA, and with it the Park doctrine, applies to bottled water but not to tap water.

Arcobacter in Lake Erie Beach Waters: an Emerging Gastrointestinal Pathogen Linked with Human-Associated Fecal Contamination

Lee, C., et al., Applied and Environmental Microbiology, September 2012

The genus Arcobacter has been associated with human illness and fecal contamination by humans and animals. To better characterize the health risk posed by this emerging waterborne pathogen, we investigated the occurrence of Arcobacter spp. in Lake Erie beach waters. During the summer of 2010, water samples were collected 35 times from the Euclid, Villa Angela, and Headlands (East and West) beaches, located along Ohio’s Lake Erie coast. After sample concentration, Arcobacter was quantified by real-time PCR targeting the Arcobacter 23S rRNA gene. Other fecal genetic markers (Bacteroides 16S rRNA gene [HuBac], Escherichia coli uidA gene, Enterococcus 23S rRNA gene, and tetracycline resistance genes) were also assessed. Arcobacter was detected frequently at all beaches, and both the occurrence and densities of Arcobacter spp. were higher at the Euclid and Villa Angela beaches (with higher levels of fecal contamination) than at the East and West Headlands beaches. The Arcobacter density in Lake Erie beach water was significantly correlated with the human-specific fecal marker HuBac according to Spearman’s correlation analysis (r = 0.592; P < 0.001). Phylogenetic analysis demonstrated that most of the identified Arcobacter sequences were closely related to Arcobacter cryaerophilus, which is known to cause gastrointestinal diseases in humans. Since human-pathogenic Arcobacter spp. are linked to human-associated fecal sources, it is important to identify and manage the human-associated contamination sources for the prevention of Arcobacter-associated public health risks at Lake Erie beaches.

Sugar-Sweetened Beverages and Genetic Risk of Obesity

Qi, Q. PhD, et al., The New England Journal of Medicine, September 2012

Temporal increases in the consumption of sugar-sweetened beverages have paralleled the rise in obesity prevalence, but whether the intake of such beverages interacts with the genetic predisposition to adiposity is unknown. We analyzed the interaction between genetic predisposition and the intake of sugar-sweetened beverages in relation to body-mass index (BMI; the weight in kilograms divided by the square of the height in meters) and obesity risk in 6934 women from the Nurses’ Health Study (NHS) and in 4423 men from the Health Professionals Follow-up Study (HPFS) and also in a replication cohort of 21,740 women from the Women’s Genome Health Study (WGHS). The genetic-predisposition score was calculated on the basis of 32 BMI-associated loci. The intake of sugar-sweetened beverages was examined prospectively in relation to BMI. In the NHS and HPFS cohorts, the genetic association with BMI was stronger among participants with higher intake of sugar-sweetened beverages than among those with lower intake. In the combined cohorts, the increases in BMI per increment of 10 risk alleles were 1.00 for an intake of less than one serving per month, 1.12 for one to four servings per month, 1.38 for two to six servings per week, and 1.78 for one or more servings per day.

The Quality of Drinking Water in North Carolina Farmworker Camps

Bischoff, W.E., MD, PhD, et al., American Journal of Public Health, August 2012

The purpose of this study was to assess water quality in migrant farmworker camps in North Carolina and determine associations of water quality with migrant farmworker housing characteristics. Researchers collected data from 181 farmworker camps in eastern North Carolina during the 2010 agricultural season. Water samples were tested using the Total Coliform Rule (TCR) and housing characteristics were assessed using North Carolina Department of Labor standards. A total of 61 (34%) of 181 camps failed the TCR. Total coliform bacteria were found in all 61 camps, with Escherichia coli also being detected in 2. Water quality was not associated with farmworker housing characteristics or with access to registered public water supplies. Multiple official violations of water quality standards had been reported for the registered public water supplies. They concluded that water supplied to farmworker camps often does not comply with current standards and poses a great risk to the physical health of farmworkers and surrounding communities. Expansion of water monitoring to more camps and changes to the regulations such as testing during occupancy and stronger enforcement are needed to secure water safety.

Chemical mixtures in untreated water from public-supply wells in the U.S. — Occurrence, composition, and potential toxicity

Toccalino, P.L., Norman, J.E., Scott, J.C., Science of The Total Environment, August 2012

Chemical mixtures are prevalent in groundwater used for public water supply, but little is known about their potential health effects. As part of a large-scale ambient groundwater study, we evaluated chemical mixtures across multiple chemical classes, and included more chemical contaminants than in previous studies of mixtures in public-supply wells. We (1) assessed the occurrence of chemical mixtures in untreated source-water samples from public-supply wells, (2) determined the composition of the most frequently occurring mixtures, and (3) characterized the potential toxicity of mixtures using a new screening approach. The U.S. Geological Survey collected one untreated water sample from each of 383 public wells distributed across 35 states, and analyzed the samples for as many as 91 chemical contaminants. Concentrations of mixture components were compared to individual human-health benchmarks; the potential toxicity of mixtures was characterized by addition of benchmark-normalized component concentrations. Most samples (84%) contained mixtures of two or more contaminants, each at concentrations greater than one-tenth of individual benchmarks. The chemical mixtures that most frequently occurred and had the greatest potential toxicity primarily were composed of trace elements (including arsenic, strontium, or uranium), radon, or nitrate. Herbicides, disinfection by-products, and solvents were the most common organic contaminants in mixtures. The sum of benchmark-normalized concentrations was greater than 1 for 58% of samples, suggesting that there could be potential for mixtures toxicity in more than half of the public-well samples. Our findings can be used to help set priorities for groundwater monitoring and suggest future research directions for drinking-water treatment studies and for toxicity assessments of chemical mixtures in water resources.
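
The screening approach described, summing benchmark-normalized component concentrations, can be illustrated in a few lines; the contaminants, concentrations, and benchmarks below are invented for the example and are not the study's data.

```python
# Sketch of the mixtures screening approach: divide each contaminant's
# concentration by its health benchmark and sum the ratios. A sum > 1 flags
# the sample for potential mixture toxicity. All values are illustrative,
# with concentration and benchmark expressed in matching units.

sample = {                   # contaminant: (measured concentration, benchmark)
    "arsenic": (7.0, 10.0),
    "nitrate": (6.0, 10.0),
    "radon":   (150.0, 300.0),
}

ratios = {name: conc / bench for name, (conc, bench) in sample.items()}
hazard_index = sum(ratios.values())

for name, ratio in ratios.items():
    print(f"{name:8s} benchmark quotient = {ratio:.2f}")
flag = "screen further" if hazard_index > 1 else "below screening level"
print(f"sum = {hazard_index:.2f} -> {flag}")
```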

Risk of Viral Acute Gastrointestinal Illness from Nondisinfected Drinking Water Distribution Systems

Lambertini, E., et al., Environmental Science and Technology, July 2012

Acute gastrointestinal illness (AGI) resulting from pathogens directly entering the piping of drinking water distribution systems is insufficiently understood. Here, we estimate AGI incidence from virus intrusions into the distribution systems of 14 nondisinfecting, groundwater-source, community water systems. Water samples for virus quantification were collected monthly at wells and households during four 12-week periods in 2006–2007. Ultraviolet (UV) disinfection was installed on the communities’ wellheads during one study year; UV was absent the other year. UV was intended to eliminate virus contributions from the wells and without residual disinfectant present in these systems, any increase in virus concentration downstream at household taps represented virus contributions from the distribution system (Approach 1). During no-UV periods, distribution system viruses were estimated by the difference between well water and household tap virus concentrations (Approach 2). For both approaches, a Monte Carlo risk assessment framework was used to estimate AGI risk from distribution systems using study-specific exposure–response relationships. Depending on the exposure–response relationship selected, AGI risk from the distribution systems was 0.0180–0.0661 and 0.001–0.1047 episodes/person-year estimated by Approaches 1 and 2, respectively. These values represented 0.1–4.9% of AGI risk from all exposure routes, and 1.6–67.8% of risk related to drinking water exposure. Virus intrusions into nondisinfected drinking water distribution systems can contribute to sporadic AGI.
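
A minimal Monte Carlo sketch can illustrate the kind of risk framework described above: sample exposure inputs, convert dose to infection probability with a dose-response model, and summarize annual risk. Every distribution and parameter below is an assumption for illustration, not a value from the study.

```python
# Minimal Monte Carlo sketch of a drinking-water risk framework: sample
# virus concentration and water consumption, convert the daily dose to
# infection probability with an exponential dose-response model, then
# summarize annual risk across iterations. All parameters are assumed.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

conc = rng.lognormal(mean=np.log(0.01), sigma=1.0, size=n)   # virus/L at the tap
volume = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)  # unboiled tap water, L/day

daily_dose = conc * volume
k = 0.02                                    # assumed dose-response parameter
p_daily = 1 - np.exp(-k * daily_dose)       # P(infection) per day
p_annual = 1 - (1 - p_daily) ** 365         # P(at least one infection per year)

print(f"median annual risk: {np.median(p_annual):.4f}")
print(f"95th percentile:    {np.percentile(p_annual, 95):.4f}")
```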

Methodological Aspects of Fluid Intake Records and Surveys

Vergne, S. PhD, Nutrition Today, July/August 2012

Assessing the fluid intake level of different populations has, to date, attracted very little interest. The comparison of existing data based on food surveys reveals notable differences between countries and within different surveys in 1 country. Methodological issues seem to account to a large extent for these differences. Recent studies conducted using specifically designed diaries to record fluid and water intake over a 7-day period tend to give more accurate results. These recent studies could potentially lead to the revision of the values of adequate intakes of water in numerous countries.

 

First Findings of the United Kingdom Fluid Intake Study

Gandy, J. PhD, RD, Nutrition Today, July/August 2012

Many factors influence the rising levels of obesity including changes in activity and dietary patterns. National dietary surveys are valuable tools for identifying sources of energy, including sugar, in a population. However, these surveys use diaries that aim to capture food intake rather than fluid intake and may underestimate beverage and therefore energy intakes. A study of 1456 children and adults was conducted in the United Kingdom using a 7-day fluid specific intake diary. Total daily intakes of beverages were higher in all ages compared with previous surveys. However, 30% of adults and more than 50% of children did not meet European adequate intake for total water. Children consumed on average 175 kcal/d as still or carbonated soft drinks. This fluid-specific survey raises concerns about the total and type of fluids consumed by both adults and children in the United Kingdom.

 

Fluids, Water, and Nutrients and the Risk of Renal Diseases: Where We Stand and a Research Agenda

Strippoli, G.F.M. MD, PhD, MPH, MM, Nutrition Today, July/August 2012

Chronic kidney disease (CKD) is a major public health challenge. Despite identification of several established cardiovascular and renal risk factors and addressing them with multiple pharmacological interventions, people with CKD continue to die, and the rate of progression of kidney disease continues to increase. In this article, we review existing evidence on the role of fluid (total fluid including fluid from water and fluid from food) and nutrient intake and the risk of kidney disease and its progression and propose a research agenda for future studies in the area. It is plausible that water and nutrient intake is an easy-to-implement strategy to reduce the risk of CKD and its progression and adverse outcomes at a population level. Cross-sectional and prospective cohort studies first and subsequently randomized trials are needed to establish the strength of association between fluid/water/nutrient intake and risk of CKD and adverse outcomes and whether a causal link exists between these exposures and the adverse outcomes.

 

Promotion and Provision of Drinking Water in Schools for Overweight Prevention: Randomized, Controlled Cluster Trial

Muckelbauer, R. MSc, et al., Nutrition Today, July/August 2012

The prevention of childhood overweight is a major public health challenge. Intervention trials have shown that schools are a promising setting for overweight prevention. To date, no particular intervention has been proved to be effective in overweight prevention. This study showed that a simple intervention with the sole focus of promoting water consumption effectively prevented overweight among children in elementary schools in socially deprived urban areas. The study tested whether a combined environmental and educational intervention solely promoting water consumption was effective in preventing overweight among children in elementary school. The participants in this randomized, controlled cluster trial were second- and third-graders from 32 elementary schools in socially deprived areas of 2 German cities. Water fountains were installed and teachers presented 4 prepared classroom lessons in the intervention group schools (N = 17) to promote water consumption. Control group schools (N = 15) did not receive any intervention. The prevalence of overweight (defined according to the International Obesity Task Force criteria), BMI SD scores, and beverage consumption (in glasses per day; 1 glass was defined as 200 mL) self-reported in 24-hour recall questionnaires, were determined before (baseline) and after the intervention. In addition, the water flow of the fountains was measured during the intervention period of 1 school year (August 2006 to June 2007). Data on 2950 children (intervention group: N = 1641; control group: N = 1309; age, mean ± SD: 8.3 ± 0.7 years) were analyzed. After the intervention, the risk of overweight was reduced by 31% in the intervention group, compared with the control group, with adjustment for baseline prevalence of overweight and clustering according to school. Changes in BMI SD scores did not differ between the intervention group and the control group. Water consumption after the intervention was 1.1 glasses per day greater in the intervention group. No intervention effect on juice and soft drink consumption was found. Daily water flow of the fountains indicated lasting use during the entire intervention period, but to varying extent. Our environmental and educational, school-based intervention proved to be effective in the prevention of overweight among children in elementary school, even in a population from socially deprived areas.


 

Hydration Biomarkers During Daily Life: Recent Advances and Future Potential

Armstrong, L.E. PhD, Nutrition Today, July/August 2012

Human fluid-electrolyte regulation involves multiple neuroendocrine responses to the hourly loss and gain of body water. Such dynamic complexity complicates hydration assessment and minimizes the likelihood that any single biomarker will validly and precisely describe hydration status in all life situations. This article describes the biomarkers that are currently used during daily living to assess mild dehydration, plus recent advances in our understanding of invasive and noninvasive techniques. This article also suggests directions for future exploration of novel hydration indices, in the belief that superior biomarkers exist but have not been discovered.

 

Hydration Status in Active Youth

Kavouras, S. A. PhD, Arnaoutis, G. PhD, Nutrition Today, July/August 2012

Fluid balance is crucial for maintaining health. It is well documented that dehydration increases physiologic strain and decreases athletic performance, especially in hot environments. Although there are numerous studies evaluating hydration status in adults, limited data concerning hydration levels in athletic youth exist. Nevertheless, most of these studies clearly indicate that (a) dehydration is a major and common problem among children exercising in the heat; and (b) children do not have the capacity to translate hydration awareness into successful hydration strategies. Further research is needed, and constant efforts must be made to develop more efficient hydration strategies and to educate young people about the benefits of optimal hydration status.

 

Effect of a 24-Hour Fluid Deprivation on Mood and Physiological Hydration Markers in Women

Pross, N. PhD, Nutrition Today, July/August 2012

The study aim was to evaluate the effect of acute fluid deprivation (FD) on mood and physiological parameters. Twenty healthy women (aged 25 ± 3.5 years) participated in a randomized 2-period (dehydrated vs control) crossover study. In the FD period, the last water intake was between 6 PM and 7 PM, and no fluid intake was allowed up to 6 PM on the next day. The FD resulted in increased sleepiness and fatigue, decreased alertness, and increased confusion. In this rigorously controlled protocol, the early noninvasive markers of dehydration were a reduced urine volume, increased urine specific gravity, darker urine color, and increased thirst. Interestingly, dehydration also induced a significant increase in saliva osmolality at the end of the FD period. Plasma osmolality did not differ between experimental conditions.

 

Impact of Beverage Content on Health and the Kidneys

Johnson, R.J. MD, et al., Nutrition Today, July/August 2012

The last 50 years have witnessed an epidemic rise in obesity, diabetes, high blood pressure, and chronic kidney disease. Some animal research suggests the epidemic may in part be triggered by sugar. Sugar contains glucose and fructose, and studies suggest it is the fructose component that may have a role in chronic disease development. Animal studies indicate that fructose is distinct from other sugars by its ability to cause transient adenosine triphosphate (ATP) depletion in the cell with uric acid generation. The administration of fructose, or the raising of uric acid, can induce kidney disease and accelerate established kidney disease in animals. Therefore, we believe that the greatest risk from sugar is when it is given as a soft drink, as the rapidity of ingestion relates directly to the concentration of fructose that the cells are exposed to and hence governs the degree of ATP depletion and uric acid generation. Restricting sugar-sweetened beverages may be one strategy to combat obesity, diabetes, high blood pressure, and kidney disease, but human intervention studies are needed to support the theory.

 

Hydration: What Is Known and What Is Unknown

Rosenbloom, C. PhD RD, Nutrition Today, July/August 2012

In a 2007 article, Lawrence Armstrong of the Human Performance Institute at the University of Connecticut remarked that, when it came to assessing hydration status, we were still searching for the “elusive gold standard.” Hydration is one of those topics, like the weather, that everyone talks about but no one can make accurate predictions about. This seems to hold true for water in the form of rain… and also the water in our bodies that hydrates us. Nevertheless, athletes want to know exactly how much water they need under various exercise conditions, parents want to know if certain sugar-containing beverages will make their children obese, clinicians want to know how hydration affects chronic disease risk, and everyone wants to know if they should be carrying around a water bottle all day long! But all we can often do as nutrition scientists and exercise physiologists is to give general responses because the topic remains so “elusive.”

 

Hydration biomarkers and dietary fluid consumption of women

Armstrong, L., et al., Journal of the Academy of Nutrition and Dietetics, July 2012

Normative values and confidence intervals for the hydration indices of women do not exist. Also, few publications have precisely described the fluid types and volumes that women consume. This investigation computed seven numerical reference categories for widely used hydration biomarkers (eg, serum and urine osmolality) and the dietary fluid preferences of self-reported healthy, active women. Participants (n=32; age 20±1 years; body mass 59.6±8.5 kg; body mass index [calculated as kg/m(2)] 21.1±2.4) were counseled in the methods to record daily food and fluid intake on 2 consecutive days. To reduce day-to-day body water fluctuations, participants were tested only during the placebo phase of the oral contraceptive pill pack. Euhydration was represented by the following ranges: serum osmolality=293 to 294 mOsm/kg; mean 24-hour total fluid intake=2,109 to 2,506 mL/24 hours; mean 24-hour total beverage intake=1,300 to 1,831 mL/24 hours; urine volume=951 to 1,239 mL/24 hours; urine specific gravity=1.016 to 1.020; urine osmolality=549 to 705 mOsm/kg; and urine color=5. However, only 3% of women experienced a urine specific gravity <1.005, and only 6% exhibited a urine color of 1 or 2. Water (representing 45.3% and 47.9% of 24-hour total fluid intake), tea, milk, coffee, and fruit juice were consumed in largest volumes. In conclusion, these data provide objective normative values for hyperhydration, euhydration, and dehydration that can be used by registered dietitians and clinicians to counsel women about their hydration status.
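
The euhydration reference ranges above lend themselves to a simple screening rule. A toy sketch follows, using only the urine-specific-gravity range quoted in the abstract; the labels for values outside that range are illustrative assumptions, not categories defined by the study.

```python
# Toy screen built on the euhydration reference range quoted above
# (urine specific gravity 1.016-1.020). Labels outside that range are
# illustrative assumptions, not categories defined by the study.

def classify_hydration(usg: float) -> str:
    """Rough hydration screen from urine specific gravity (USG)."""
    if usg < 1.016:
        return "below euhydration reference range (likely well hydrated)"
    if usg <= 1.020:
        return "within euhydration reference range"
    return "above euhydration reference range (possible dehydration)"

for usg in (1.004, 1.018, 1.027):
    print(f"USG {usg:.3f}: {classify_hydration(usg)}")
```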

Observations of drinking water access in school food service areas before implementation of federal and state school water policy, California, 2011

Patel, A., et al., Preventing Chronic Disease, July 2012

Recent legislation requires schools to provide free drinking water in food service areas (FSAs). Our objective was to describe access to water at baseline and student water intake in school FSAs and to examine barriers to and strategies for implementation of drinking water requirements. We randomly sampled 24 California Bay Area public schools. We interviewed 1 administrator per school to assess knowledge of water legislation and barriers to and ideas for policy implementation. We observed water access and students’ intake of free water in school FSAs. Wellness policies were examined for language about water in FSAs. We found that fourteen of 24 schools offered free water in FSAs; 10 offered water via fountains, and 4 provided water through a nonfountain source. Four percent of students drank free water at lunch; intake at elementary schools (11%) was higher than at middle or junior high schools (6%) and high schools (1%). In secondary schools when water was provided by a nonfountain source, the percentage of students who drank free water doubled. Barriers to implementation of water requirements included lack of knowledge of legislation, cost, and other pressing academic concerns. No wellness policies included language about water in FSAs. We concluded that approximately half of schools offered free water in FSAs before implementation of drinking water requirements, and most met requirements through a fountain. Only 1 in 25 students drank free water in FSAs. Although schools can meet regulations through installation of fountains, more appealing water delivery systems may be necessary to increase students’ water intake at mealtimes.

French children start their school day with a hydration deficit

Bonnet, F., et al., Annals of Nutrition & Metabolism, June 2012

Fluid requirements of children vary as a function of gender and age. To our knowledge, there is very little literature on the hydration status of French children. We assessed the morning hydration status in a large sample of 529 French schoolchildren aged 9–11 years. Methods: Recruited children completed a questionnaire on fluid and food intake at breakfast and collected a urine sample the very same day after breakfast. Breakfast food and fluid nutritional composition was analyzed and urine osmolality was measured using a cryoscopic osmometer. More than a third of the children had a urine osmolality between 801 and 1,000 mosm/kg while 22.7% had a urine osmolality over 1,000 mosm/kg. This was more frequent in boys than in girls (p < 0.001). A majority of children (73.5%) drank less than 400 ml at breakfast. Total water intake at breakfast was significantly and inversely correlated with high osmolality values. It was concluded that almost two thirds of the children in this large cohort had evidence of a hydration deficit when they went to school in the morning, despite breakfast intake. Children’s fluid intake at breakfast does not suffice to maintain an adequate hydration status for the whole morning.

Lead (Pb) in Tap Water and in Blood: Implications for Lead Exposure in the United States (PDF Download Available)

Triantafyllidou, S. and Edwards, M., Critical Reviews in Environmental Science and Technology, June 2012

Lead is widely recognized as one of the most pervasive environmental health threats in the United States, and there is increased concern over adverse health impacts at levels of exposure once considered safe. Lead contamination of tap water was once a major cause of lead exposure in the United States and, as other sources have been addressed, the relative contribution of lead in water to lead in blood is expected to become increasingly important. Moreover, prior research suggests that lead in water may be more important as a source than is presently believed. The authors describe sources of lead in tap water, chemical forms of the lead, and relevant U.S. regulations/guidelines, while considering their implications for human exposure. Research that examined associations between water lead levels and blood lead levels is critically reviewed, and some of the challenges in making such associations, even if lead in water is the dominant source of lead in blood, are highlighted. Better protecting populations at risk from this and from other lead sources is necessary, if the United States is to achieve its goal of eliminating elevated blood lead levels in children by 2020.

Protective Nutrients: Are They Here to Stay?

Walker, W.A. MD, Heintz, K. MS, Nutrition Today, May/June 2012

Protective nutrients benefit health in various ways beyond their conventionally established nutrient function such as by enhancing immune function, promoting gastrointestinal integrity, impacting metabolism, and preventing disease. Certain of these key nutrients have taken center stage as emerging research is showing that they can play a significant role throughout the life span. Study of an infant’s first natural nutrition, breast milk, has led to an improved understanding of how different compounds can beneficially affect physiological processes and act as protective nutrients. Probiotics, or “healthy bacteria,” are living microorganisms that confer a benefit when consumed in sufficient quantities. For example, certain strains help maintain the balance of the intestinal microbiota, a complex ecosystem that can be influenced by many factors such as stress, antibiotics, and diet. Research suggests that, when the intestinal microbiota is unbalanced, overall health may be affected. Prebiotics are nondigestible carbohydrates that can be used as an energy source by certain probiotics, thereby helping them grow and flourish to further promote a healthy ecosystem. Additional nutrients such as choline, vitamin D, and omega-3 fatty acids have also gained attention as being protective beyond normal growth and development, possessing functional effects that may be vital to future recommendations for health.

 

Screening-Level Risk Assessment of Coxiella burnetii (Q Fever) Transmission via Aeration of Drinking Water

Sales-Ortells, H., Medema, G., Environmental Science and Technology, April 2012

A screening-level risk assessment was performed of Q fever transmission through drinking water produced, with aeration, from groundwater in the vicinity of infected goat barnyards. Quantitative data from scientific literature were collected and a Quantitative Microbial Risk Assessment approach was followed. An exposure model was developed to calculate the dose to which consumers of aerated groundwater are exposed through aerosol inhalation during showering. The exposure assessment and hazard characterization were integrated in a screening-level risk characterization using a dose-response model for inhalation to determine the risk of Q fever through tap water. A nominal range sensitivity analysis was performed. The estimated risk of disease was lower than 10^-4 per person per year (pppy); hence, the risk of transmission of C. burnetii through inhalation of drinking water aerosols is very low. The sensitivity analysis shows that the most uncertain parameters are the aeration process, the transport of C. burnetii in bioaerosols via the air, the aerosolization of C. burnetii in the shower, and the air filtration efficiency. The risk was compared to direct airborne exposure of persons in the vicinity of infected goat farms; the relative risk of exposure through inhalation of drinking water aerosols was 0.002%.
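
A nominal range sensitivity analysis of the kind mentioned above is easy to sketch: hold all inputs at nominal values, swing one parameter at a time across its plausible range, and record the spread in estimated risk. The toy exposure model and every bound below are assumptions, not the study's values.

```python
# Sketch of a nominal-range sensitivity analysis: hold inputs at nominal
# values, vary one parameter at a time between its bounds, and record the
# resulting range of estimated risk. The toy model and bounds are assumed.
import math

def risk_per_shower(aerosol_conc: float, inhaled_air_l: float, k: float) -> float:
    dose = aerosol_conc * inhaled_air_l          # organisms inhaled per shower
    return 1 - math.exp(-k * dose)               # exponential dose-response

nominal = {"aerosol_conc": 1e-4, "inhaled_air_l": 100.0, "k": 0.9}
bounds = {
    "aerosol_conc":  (1e-5, 1e-3),    # organisms per litre of inhaled air
    "inhaled_air_l": (50.0, 200.0),   # litres of air inhaled per shower
    "k":             (0.5, 1.0),      # dose-response parameter
}

base = risk_per_shower(**nominal)
print(f"nominal risk per shower: {base:.2e}")
for name, (lo, hi) in bounds.items():
    vals = [risk_per_shower(**{**nominal, name: v}) for v in (lo, hi)]
    print(f"{name:14s} -> risk {min(vals):.2e} to {max(vals):.2e}")
```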

Waterborne Pathogens: Emerging Issues in Monitoring, Treatment and Control

Reynolds, K.A., MSPH, Ph.D., Water Conditioning & Purification, March 2012

Microbial threats to water quality continue to emerge; however, technologies for monitoring, treating and controlling emerging waterborne pathogens are also evolving. Understanding the range of factors that lead to the contamination of water are important for developing appropriate tools to manage human health risks.

Health Risks of Limited-Contact Water Recreation

Dorevitch, S., et al., Environmental Health Perspectives, February 2012

Wastewater-impacted waters that do not support swimming are often used for boating, canoeing, fishing, kayaking, and rowing. Little is known about the health risks of these limited-contact water recreation activities. We evaluated the incidence of illness, severity of illness, associations between water exposure and illness, and risk of illness attributable to limited-contact water recreation on waters dominated by wastewater effluent and on waters approved for general use recreation (such as swimming). The Chicago Health, Environmental Exposure, and Recreation Study was a prospective cohort study that evaluated five health outcomes among three groups of people: those who engaged in limited-contact water recreation on effluent-dominated waters, those who engaged in limited-contact recreation on general-use waters, and those who engaged in non–water recreation. Data analysis included survival analysis, logistic regression, and estimates of risk for counterfactual exposure scenarios using G-computation. Telephone follow-up data were available for 11,297 participants. With non–water recreation as the reference group, we found that limited-contact water recreation was associated with the development of acute gastrointestinal illness in the first 3 days after water recreation at both effluent-dominated waters [adjusted odds ratio (AOR) 1.46; 95% confidence interval (CI): 1.08, 1.96] and general-use waters (1.50; 95% CI: 1.09, 2.07). For every 1,000 recreators, 13.7 (95% CI: 3.1, 24.9) and 15.1 (95% CI: 2.6, 25.7) cases of gastrointestinal illness were attributable to limited-contact recreation at effluent-dominated waters and general-use waters, respectively. Eye symptoms were associated with use of effluent-dominated waters only (AOR 1.50; 95% CI: 1.10, 2.06). Among water recreators, our results indicate that illness was associated with the amount of water exposure. Limited-contact recreation, both on effluent-dominated waters and on waters designated for general use, was associated with an elevated risk of gastrointestinal illness.

Planning for Sustainability: A Handbook for Water and Wastewater Utilities

U.S. Environmental Protection Agency, February 2012

This handbook is intended to provide information about how to enhance current planning processes by building in sustainability considerations. It is designed to be useful for various types and scales of planning efforts, such as: long-range integrated water resource planning; strategic planning; capital planning; system-wide planning to meet regulatory requirements (e.g., combined sewer overflow upgrades and new stormwater permitting requirements); and specific infrastructure project planning (e.g., for repair, rehabilitation, or replacement of specific infrastructure).

Replacing caloric beverages with water or diet beverages for weight loss in adults: main results of the Choose Healthy Options Consciously Everyday (CHOICE) randomized clinical trial

Tate, D.F., et al., The American Journal of Clinical Nutrition, February 2012

Replacement of caloric beverages with noncaloric beverages may be a simple strategy for promoting modest weight reduction; however, the effectiveness of this strategy is not known. We compared the replacement of caloric beverages with water or diet beverages (DBs) as a method of weight loss over 6 mo in adults and attention controls (ACs). Results: In an intent-to-treat analysis, a significant reduction in weight and waist circumference and an improvement in systolic blood pressure were observed from 0 to 6 mo. Mean (±SEM) weight losses at 6 mo were −2.5 ± 0.45% in the DB group, −2.03 ± 0.40% in the Water group, and −1.76 ± 0.35% in the AC group; there were no significant differences between groups. The chance of achieving a 5% weight loss at 6 mo was greater in the DB group than in the AC group (OR: 2.29; 95% CI: 1.05, 5.01; P = 0.04). A significant reduction in fasting glucose at 6 mo (P = 0.019) and improved hydration at 3 (P = 0.0017) and 6 (P = 0.049) mo was observed in the Water group relative to the AC group. In a combined analysis, participants assigned to beverage replacement were 2 times as likely to have achieved a 5% weight loss (OR: 2.07; 95% CI: 1.02, 4.22; P = 0.04) than were the AC participants. Conclusions: Replacement of caloric beverages with noncaloric beverages as a weight-loss strategy resulted in average weight losses of 2% to 2.5%. This strategy could have public health significance and is a simple, straightforward message. This trial was registered at clinicaltrials.gov as NCT01017783.

Mild Dehydration Affects Mood in Healthy Young Women

Armstrong, L., et al., The Journal of Nutrition, February 2012

Limited information is available regarding the effects of mild dehydration on cognitive function. Therefore, mild dehydration was produced by intermittent moderate exercise without hyperthermia and its effects on cognitive function of women were investigated. Twenty-five females (age 23.0 ± 0.6 y) participated in three 8-h, placebo-controlled experiments involving a different hydration state each day: exercise-induced dehydration with no diuretic (DN), exercise-induced dehydration plus diuretic (DD; furosemide, 40 mg), and euhydration (EU). Cognitive performance, mood, and symptoms of dehydration were assessed during each experiment, 3 times at rest and during each of 3 exercise sessions. The DN and DD trials in which a volunteer attained a ≥1% level of dehydration were pooled and compared to that volunteer’s equivalent EU trials. Mean dehydration achieved during these DN and DD trials was −1.36 ± 0.16% of body mass. Significant adverse effects of dehydration were present at rest and during exercise for vigor-activity, fatigue-inertia, and total mood disturbance scores of the Profile of Mood States and for task difficulty, concentration, and headache as assessed by questionnaire. Most aspects of cognitive performance were not affected by dehydration. Serum osmolality, a marker of hydration, was greater in the mean of the dehydrated trials in which a ≥1% level of dehydration was achieved (P = 0.006) compared to EU. In conclusion, degraded mood, increased perception of task difficulty, lower concentration, and headache symptoms resulted from 1.36% dehydration in females. Increased emphasis on optimal hydration is warranted, especially during and after moderate exercise.

Atrazine Exposure in Public Drinking Water and Preterm Birth

Rinsky, J.L., et al., Public Health Reports, January/February 2012

Approximately 13% of all births occur prior to 37 weeks gestation in the U.S. Some established risk factors exist for preterm birth, but the etiology remains largely unknown. Recent studies have suggested an association with environmental exposures. We examined the relationship between preterm birth and exposure to a commonly used herbicide, atrazine, in drinking water. We reviewed Kentucky birth certificate data for 2004-2006 to collect duration of pregnancy and other individual-level covariates. We assessed existing data sources for atrazine levels in public drinking water for the years 2000-2008, classifying maternal county of residence into three atrazine exposure groups. We used logistic regression to analyze the relationship between atrazine exposure and preterm birth, controlling for maternal age, race/ethnicity, education, smoking, and prenatal care. An increase in the odds of preterm birth was found for women residing in the counties included in the highest atrazine exposure group compared with women residing in counties in the lowest exposure group, while controlling for covariates. Analyses using the three exposure assessment approaches produced odds ratios ranging from 1.20 (95% confidence interval [CI] 1.14, 1.27) to 1.26 (95% CI 1.19, 1.32), for the highest compared with the lowest exposure group. Suboptimal characterization of environmental exposure and variables of interest limited the analytical options of this study. Still, our findings suggest a positive association between atrazine and preterm birth, and illustrate the need for an improved assessment of environmental exposures to accurately address this important public health issue.

Source Water Protection Vision and Roadmap

Water Research Foundation, January 2012

In 2007, a group of source water protection experts met, under the auspices of the Water Research Foundation and the Water Environment Research Foundation, to develop a research agenda that would ultimately provide information to help drinking water suppliers design and implement effective source water protection programs. A key result of that effort identified the need for a national vision and roadmap that would guide U.S. water utilities and supporting groups with a unified strategy for coherent, consistent, cost-effective, and socially-acceptable source water protection programs. This brief document presents the vision and roadmap and focuses on how to move forward on source water protection. The roadmap is intended to serve as a feasible, focused path toward promoting source water protection for U.S. drinking water utilities. It is not intended to serve as an official directive, but rather is a collection of observations and recommendations organized to form a path to achieving the vision. The companion document Developing a Vision and Roadmap for Drinking Water Source Protection comprehensively covers the project team’s findings regarding the various building blocks to make source water protection a reality. That document includes an annotated bibliography of source water protection resources, a summation of a literature review, and helpful water utility case studies. Both documents are meant to be used in concert to help water utilities move forward with their source water protection efforts and proactively improve and/or maintain the quality of their drinking water sources.

Source water protection has been discussed and promoted in an ad hoc fashion by different organizations at the national, regional, state, and local levels. It is essential to increase the awareness of source water protection at the national level. Education of decision makers, utility managers, stakeholders, and the general public should be the first step in moving source water protection up a path to success. Leadership is needed to make this a national priority.

In order to ensure the various actions recommended in the roadmap can be carried out, it is recommended that both a top-down and a bottom-up approach be taken. A top-down approach would establish a flexible framework to guide local entities (e.g., water systems, watershed organizations, and regional planning agencies) to work together to protect source water. Due to the variability of source waters and the areas from which they are derived, along with technical, social, political, financial, and regulatory differences across jurisdictions, it is unlikely that two source water protection programs would be the same. A bottom-up approach is therefore also needed, which would use local information and broad stakeholder involvement to produce a “tailored” source water protection program that addresses unique issues at the local level.

Migration of Bisphenol-A into the Natural Spring Water Packaged in Polycarbonate Carboys

Erdem, Y.K., Furkan, A., International Journal of Applied Science and Technology, January 2012

Bisphenol-A (BPA) is a chemical widely used in epoxy resins, polycarbonate packaging, and the lacquer coatings of metal food packages worldwide. Its weak estrogenic character and possible health effects are well known. For this reason, the use of BPA in food packaging is limited and daily human intake is strictly controlled: the declared specific migration limit is 0.6 ppm, and the tolerable daily intake set by EFSA and other authorities is 0.05 mg/kg body weight per day. In 2010, EFSA and other authorities banned the manufacture and use of BPA in baby bottles. In Turkey, 70% of the population lives in five metropolitan cities, and drinking water consumption is mostly supplied by the packaged drinking water industry; household and bulk demand is covered by natural spring and natural mineral water packaged in 19-liter polycarbonate carboys. The possible migration of BPA into drinking water packaged in polycarbonate carboys was therefore investigated. First, a screening test was carried out on samples supplied from two major cities. Then, packaged water samples from five different trademarks were stored at 4, 25, and 35 °C for 60 days, and BPA content was determined at set intervals. The BPA migration detected was at least 450 times lower than the EFSA specific migration limit during 60 days of storage under these conditions.

What is the cell hydration status of healthy children in the USA? Preliminary data on urine osmolality and water intake

Stookey, J.D., Brass, B., Holliday, A., Arieff, A., Public Health Nutrition, January 2012

Hyperosmotic stress on cells limits many aspects of cell function, metabolism and health. International data suggest that schoolchildren may be at risk of hyperosmotic stress on cells because of suboptimal water intake. The present study explored the cell hydration status of two samples of children in the USA, in Los Angeles (LA) and New York City (NYC). Elevated urine osmolality (>800 mmol/kg) was observed in 63% and 66% of participants in LA and NYC, respectively. In multivariable-adjusted logistic regression models, elevated urine osmolality was associated with not reporting intake of drinking water in the morning (LA: OR = 2.1, 95% CI 1.2, 3.5; NYC: OR = 1.8, 95% CI 1.0, 3.5). Although over 90% of both samples had breakfast before giving the urine sample, 75% did not drink water. Research is warranted to confirm these results and pursue their potential health implications.
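
For orientation, the unadjusted version of such an odds ratio is simply the cross-product of a 2x2 table; the counts below are hypothetical placeholders, not the study’s data:

    # Hypothetical 2x2 table: elevated urine osmolality (>800 mmol/kg)
    # by whether morning water intake was reported.
    no_water_elevated, no_water_normal = 90, 60   # placeholder counts
    water_elevated, water_normal = 45, 65         # placeholder counts

    odds_ratio = (no_water_elevated * water_normal) / (no_water_normal * water_elevated)
    print(f"Unadjusted OR = {odds_ratio:.2f}")    # the study's models also adjust for covariates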

Bottled Water & Tap Water: Just the Facts

Drinking Water Research Foundation, October 2011

The information presented in this report supports the fact that drinking water, whether from the tap or a bottle, is generally safe, and that regulatory requirements for both tap water and bottled water provide Americans with clean, safe drinking water. There are some differences in the regulations for each, but those differences reflect the distinct ways the two products reach the consumer: delivery by a public water system versus delivery in a sealed container. Perhaps the most notable difference between tap water and bottled water is therefore the method of delivery. Community water systems deliver water to consumers (businesses and private residences) through miles of underground iron (unlined and poly-lined), PVC, and lead service lines that can be subject to leakage as the system ages and to accidental failures, creating a risk of post-treatment contamination of the water delivered to consumers. Bottled water is delivered to consumers in sanitary, sealed containers filled in a bottling facility under controlled fill-room conditions.

The effects of water shortages on health and human development

Tarrass, F., and Benjelloun, M., Perspectives in Public Health, April 2011

Shortages of water could become a major obstacle to public health and development. Currently, the United Nations Children’s Fund (UNICEF) and the World Health Organization (WHO) estimate that 1.1 billion people lack access to a water supply and 2.6 billion people lack adequate sanitation. The global health burden associated with these conditions is staggering, with an estimated 1.6 million deaths every year from diseases associated with lack of access to safe drinking water, inadequate sanitation and poor hygiene. In this paper we review the impact of water shortages on health and human development.

Water, Water Everywhere… But, How Much Water Do We Really Need for Optimal Health and Wellness?

Rosenbloom, C. PhD, RD, CSSD, Nutrition Today, November/December 2010

“Water, taken in moderation, cannot hurt anybody” is a quote attributed to US author and humorist Mark Twain. The series of articles in this special issue suggests that water in moderation may not be enough for optimal health and wellness, and the authors push the boundaries of what is currently known about water in maintaining health and preventing disease. The Hydration for Health Conference, sponsored by Danone Waters, brought together international experts to review what is known about water consumption and health or, more appropriately, what is not known about water intake and health. One of the key drivers of the interest in water and health appears to be the global obesity epidemic affecting developed and developing countries, young and old alike, but there are other health problems that might be reduced or eliminated if optimal water consumption were known and practiced.

The Mexican Experience: From Public Health Concern Toward National Beverage Guidelines

Barquera, S. MD, PhD, Nutrition Today, November/December 2010

The paper describes the process experienced in Mexico from the characterizations of beverage consumption to the development of national beverage recommendation guidelines. Mexico is one of the countries with the highest prevalence of obesity in the world. Depending on the information source, it is often ranked as second after the United States. In addition, Mexicans are the second greatest consumers of soft drinks in the world. Currently, there is some ecological evidence that associates the trends in soft-drink consumption and overall diet with the increase in the prevalence of obesity.

Healthy Hydration for Physical Activity

Péronnet, F. PhD, Nutrition Today, November/December 2010

Water is the first ingredient of life. In the comfortable environment in which we live, with an ample supply of water, we forget that our ancestors lived in an environment where water was scarce and the weather was hot. We therefore developed a very powerful cooling system in which water plays a major role. The importance of this system is best illustrated when we are exposed to exercise and heat, separately and even more so when both are combined. In these situations, the primary way to get rid of the heat generated or received from the environment is through the secretion and evaporation of sweat, which is mainly water. Thanks to this cooling system, we can sustain prolonged exposure to heat and we can work in the heat. However, if not properly replaced, fluid lost in the form of sweat results in dehydration. This reduces the ability to regulate body temperature as well as the ability to perform exercise. Under extreme circumstances, which fortunately are not often encountered, dehydration and the increase in body temperature can result in heat stroke, which can be fatal.

 

Understanding Fluid Consumption Patterns to Improve Healthy Hydration

Le Bellego, L. PhD, et al., Nutrition Today, November/December 2010

Water is quantitatively by far the No. 1 nutrient in our diet. Of course, this can vary, depending on the amount and the quality of food and drink one consumes, but approximately 50% of what we eat and drink every day is water (CIQUAL, Table CIQUAL 2008, composition nutritionnelle des aliments, 2008, Centre d’Information sur la Qualité des Aliments, http://www.afssa.fr/TableCIQUAL/; US Department of Agriculture, Agricultural Research Service, 2005, USDA National Nutrient Database for Standard Reference, Release 18. Nutrient Data Laboratory Home Page, http://www.nal.usda.gov/fnic/foodcomp; NUTTAB, 2006, Food Standards Australia New Zealand [FSANZ], http://www.foodstandards.gov.au/monitoringandsurveillance/nuttab2006/). It is also the No. 1 component of the human body by mass. This varies from one person to another, depending on individual characteristics such as body weight, ratio between lean and adipose tissues, and physiological state (pregnancy, etc), but approximately 60% of the adult body is composed of water. [Nutr Rev 2005;63(6 pt 2):S40-S54]. Finally, no biological reaction or function in the body would be possible without water. In other words, life is not possible without water. This makes the quantity and the quality of the fluids we have to drink every day quite an important issue both nutritionally and physiologically. From this perspective, it is interesting to discuss available recommendations for water intake and their reliability. This is very challenging, because no study is available on the long-term health effects of the quantity and/or the quality of fluids ingested.

 

Role of Sugar Intake in Beverages on Overweight and Health

Lafontan, M. PhD, Nutrition Today, November/December 2010

Epidemiological data have demonstrated an association between sugar intake in beverages and overweight. Cross-sectional studies are the most common but are rather limited, and many points remain a matter of debate. Results of intervention trials are more promising, although such trials remain quite rare; they provide the best arguments to infer causality. This overview is limited to the analysis of the putative impact of sugar in beverages on health, obesity, and diabetes risk. Mechanisms of action and physiological end points are highlighted to clarify the differences in the health impact of various kinds of sugars. When considering weight changes and obesity-related questions related to sugar-sweetened beverage consumption, it is important to take into account population differences and genetic parameters. Lifestyle influences (eg, other components of the diet and physical activity) must also be considered in the studies.

Hydration and Human Cognition

Lieberman, H.R. PhD, Nutrition Today, November/December 2010

Although adequate hydration is essential for optimal brain function, research addressing relationships between hydration status and human behavior and cognitive function is limited. The few published studies in this area are inconclusive and contradictory. The impact of variations in hydration status, which can be substantial as humans go about their daily activities, on brain function and behavior is not known and may impact quality of life. Furthermore, vulnerable populations such as children, elderly people, and individuals with illnesses may be at higher risk of degradation in cognitive function from dehydration. A variety of difficult methodological issues have impeded progress in this area. For example, there are several methods to achieve dehydration in humans, each with different strengths and weaknesses. Accurately assessing and modifying human hydration status and consistently achieving desired levels of dehydration in a controlled manner are problematic. It is difficult to select appropriate behavioral tasks that detect relatively subtle changes in cognitive performance and mood resulting from moderate levels of dehydration. Generating experimental designs that include hydrated control conditions and double-blind testing poses substantial challenges to investigators. Additional well-controlled research is essential if progress is to be made and understanding gained of the effects of dehydration on cognitive function. Key elements of research should include accurate methods of assessing and modifying hydration state, an adequate number of subjects, appropriate behavioral tasks to detect subtle effects of dehydration, and inclusion of rigorous control conditions.

Drinking Water and Weight Management

Stookey, J.D. PhD, Nutrition Today, November/December 2010

This review summarizes the evidence base for recommending drinking water for weight management. Crossover experiments consistently report that drinking water results in lower total energy intake when consumed instead of caloric beverages, because individuals do not eat less food to compensate for calories in beverages. Crossover experiments also consistently report that drinking water results in greater fat oxidation compared with other beverages, because drinking water does not stimulate insulin. In intervention studies, advice to drink water is associated with reduced weight gain in children and greater weight loss in dieting adults. Although gaps in knowledge remain about specific effects of drinking water on weight loss in children and obesity prevention in adults, there is a strong evidence base for recommending drinking water for weight management.

Water Physiology: Essentiality, Metabolism, and Health Implications

Kavouras, S.A. PhD, Anastasiou, C.A. PhD, Nutrition Today, November/December 2010

Water is the most abundant molecule in the human body that undergoes continuous recycling. Numerous functions have been recognized for body water, including its function as a solvent, as a means to remove metabolic heat, and as a regulator of cell volume and overall function. Tight control mechanisms have evolved for precise control of fluid balance, indicative of its biological importance. However, water is frequently overlooked as a nutrient. This article reviews the basic elements of water physiology in relation to health, placing emphasis on the assessment of water requirements and fluid balance. Current recommendations are also discussed.

Effects of Water Consumption on Kidney Function and Excretion

Tack, I., Nutrition Today, November/December 2010

Water homeostasis depends on fluid intake and maintenance of body water balance by adjustment of renal excretion under the control of the hormone arginine vasopressin. The human kidney manages fluid excess more efficiently than fluid deficit. As a result, no overhydration is observed in healthy individuals drinking a large amount of fluid, whereas a mild hydration deficit is not uncommon in small-fluid-volume (SFV) drinkers. Small-fluid-volume intake does not alter renal function but is associated with an increased risk of renal lithiasis and urinary tract infection; in those cases, increasing fluid intake prevents recurrence. The benefit of increasing fluid intake in healthy SFV drinkers had never been studied until now. Two recent studies from Danone Research indicate that increasing water intake in such people leads to a significant decrease in the risk of renal stone disease (assessed by measuring Tiselius’ crystallization risk index). Because the prevalence of renal lithiasis and urinary tract infection is quite high in western countries, this preliminary observation supports a primary-prevention approach based on a voluntary increase in water-based fluid consumption among SFV drinkers. Complementary studies are required to determine other clinical impacts of SFV intake and to evaluate the benefits of increasing fluid intake.

 

Bromate reduction in simulated gastric juice

Cotruvo, J.A., et al., e-Journal AWWA, November 2010

This article advocates for a revised risk assessment for bromate to reflect presystemic chemistry not usually considered when low-dose risks are calculated from high-dose toxicology data. Because of high acidity and the presence of reducing agents, presystemic decomposition of bromate can begin in the stomach, which should contribute to lower-than-expected doses to target organs. In this research, bromate decomposition kinetics in simulated stomach/gastric juice were studied to determine the risk of environmentally relevant exposure to bromate. The current work is the first step in a series of studies that the authors are conducting to better estimate the hypothetical low-dose risks to humans from drinking water ingestion and thus arrive at more appropriate maximum contaminant levels (MCLs). It is the authors’ belief that additional kinetics and metabolism research will demonstrate that the human risk from ingestion of compounds in drinking water is less than originally believed and will lead to MCLs and MCL goals that are more scientifically based.

Drinking Water and Risk of Stroke

Saposnik, G., MD, MSc, FAHA, Stroke, October 2010

In the present issue of Stroke, the authors investigate the association between low-level arsenic exposure in drinking water and ischemic stroke admissions in Michigan. They found that even low exposure to arsenic is associated with an increased incident risk of stroke (relative risk, 1.03; 95% CI, 1.01 to 1.05 per µg/L increase in arsenic concentration). The authors also examined whether that exposure was associated with other nonvascular conditions (hernia, duodenal ulcer) not expected to increase in risk. Comparing zip codes in Genesee County at the 90th percentile of arsenic levels (21.6 µg/L) with those at the 10th percentile (0.30 µg/L), there was a 91% increase in risk of stroke admission (relative risk, 1.91; 95% CI, 1.27 to 2.88). The results were consistent in showing an increased risk for stroke, but not for the control medical conditions (hernia and duodenal ulcer). Moreover, they found a graded effect: a higher incident risk among those individuals exposed to higher water concentrations of arsenic.
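
The two reported figures are roughly mutually consistent: compounding the per-unit relative risk over the concentration difference between the 90th- and 10th-percentile zip codes gives a similar estimate (a back-of-the-envelope check, not a calculation from the paper):

    # Reported: RR = 1.03 per ug/L of arsenic.
    rr_per_ug = 1.03
    high, low = 21.6, 0.30  # 90th- and 10th-percentile arsenic levels, ug/L

    rr_span = rr_per_ug ** (high - low)
    print(f"Implied RR across the span: {rr_span:.2f}")  # ~1.88, near the reported 1.91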

Association between children’s blood lead levels, lead service lines, and water disinfection

Brown, M.J., Raymond, J., Homa, D., Kennedy, C., Sinks, T., Environmental Research, October 2010

Evaluate the effect of changes in the water disinfection process, and the presence of lead service lines (LSLs), on children’s blood lead levels (BLLs) in Washington, DC. Three cross-sectional analyses examined the relationship of LSLs and changes in water disinfectant with BLLs in children <6 years of age. The study population was derived from the DC Childhood Lead Poisoning Prevention Program blood lead surveillance system of children who were tested and whose blood lead test results were reported to the DC Health Department. The Washington, DC Water and Sewer Authority (WASA) provided information on LSLs. The final study population consisted of 63,854 children with validated addresses. Controlling for age of housing, LSL was an independent risk factor for BLLs ≥10 µg/dL and ≥5 µg/dL, even during time periods when water levels met the US Environmental Protection Agency (EPA) action level of 15 parts per billion (ppb). When chloramine alone was used to disinfect water, the risk of a BLL in the highest quartile among children in homes with LSLs was greater than when either chlorine or chloramine with orthophosphate was used. For children tested after the LSLs in their houses were replaced, those with partially replaced LSLs were >3 times as likely to have BLLs ≥10 µg/dL versus children who never had LSLs. LSLs were a risk factor for elevated BLLs even when WASA met the EPA water action level. Changes in water disinfection can enhance the effect of LSLs and increase lead exposure. Partially replacing LSLs may not decrease the risk of elevated BLLs associated with LSL exposure.

When is the Next Boil Water Alert?

Water Technology, August 2010

A common theme we see on a daily basis relates to drinking water infrastructure. We track news throughout the world that impacts the drinking water industry, and one of the most frequent things we see are notices from agencies and organizations about the need for communities to boil water in order to combat possible contamination. In some parts of the world, boiling water is the norm due to water supply issues. Often, these areas may be limited in their ability to develop economically, as clean water is such an integral part of daily life. It is in the developed world, however, where we have been seeing a large increase in the number of such notices.

Climate Change, Water, and Risk: Current Water Demands Are Not Sustainable

www.nrdc.org, July 2010

Climate change will have a significant impact on the sustainability of water supplies in the coming decades. A new analysis, performed by consulting firm Tetra Tech for the Natural Resources Defense Council (NRDC), examined the effects of global warming on water supply and demand in the contiguous United States. The study found that more than 1,100 counties— one-third of all counties in the lower 48—will face higher risks of water shortages by mid-century as the result of global warming. More than 400 of these counties will face extremely high risks of water shortages.

Scientific Opinion on Dietary Reference Values for water

European Food Safety Authority (EFSA), EFSA Journal, March 2010

This Opinion of the EFSA Panel on Dietetic Products, Nutrition, and Allergies (NDA) deals with the setting of dietary reference values for water for specific age groups. Adequate Intakes (AIs) have been derived from a combination of observed intakes in population groups with desirable urine osmolarity values and desirable water volumes per energy unit consumed. The reference values for total water intake include water from drinking water, beverages of all kinds, and food moisture, and apply only to conditions of moderate environmental temperature and moderate physical activity levels (PAL 1.6). AIs for infants in the first half of the first year of life are estimated to be 100-190 mL/kg per day. For infants 6-12 months of age, a total water intake of 800-1000 mL/day is considered adequate. For the second year of life, an adequate total water intake of 1100-1200 mL/day is defined by interpolation, as intake data are not available. AIs of water for children are estimated to be 1300 mL/day for boys and girls 2-3 years of age; 1600 mL/day for boys and girls 4-8 years of age; 2100 mL/day for boys 9-13 years of age; and 1900 mL/day for girls 9-13 years of age. Adolescents of 14 years and older are considered as adults with respect to adequate water intake. Available data for adults permit the definition of AIs as 2.0 L/day (95th percentile 3.1 L) for females and 2.5 L/day (95th percentile 4.0 L) for males. The same AIs as for adults are defined for the elderly. For pregnant women, the same water intake as for non-pregnant women plus an increase in proportion to the increase in energy intake (300 mL/day) is proposed. For lactating women, adequate water intakes of about 700 mL/day above the AIs of non-lactating women of the same age are derived.
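
Collected into a simple lookup table (values transcribed from the Opinion as summarized above; the helper function itself is purely illustrative):

    # EFSA Adequate Intakes for total water, in mL/day, as listed above.
    # (Infants 0-6 months are omitted: their AI is expressed per kg, 100-190 mL/kg/day.)
    EFSA_AI_ML_PER_DAY = {
        "infants 6-12 mo": (800, 1000),
        "children 1-2 y": (1100, 1200),
        "children 2-3 y": (1300, 1300),
        "children 4-8 y": (1600, 1600),
        "boys 9-13 y": (2100, 2100),
        "girls 9-13 y": (1900, 1900),
        "females 14+ y": (2000, 2000),
        "males 14+ y": (2500, 2500),
    }

    def adequate_intake(group: str) -> str:
        low, high = EFSA_AI_ML_PER_DAY[group]
        return f"{low} mL/day" if low == high else f"{low}-{high} mL/day"

    print(adequate_intake("boys 9-13 y"))  # 2100 mL/day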

Water as an essential nutrient: the physiological basis of hydration

Jéquier, E. and Constant, F., EJCN – European Journal of Clinical Nutrition, September 2009

How much water we really need depends on water functions and the mechanisms of daily water balance regulation. The aim of this review is to describe the physiology of water balance and consequently to highlight the new recommendations with regard to water requirements. Water has numerous roles in the human body. It acts as a building material; as a solvent, reaction medium and reactant; as a carrier for nutrients and waste products; in thermoregulation; and as a lubricant and shock absorber. The regulation of water balance is very precise, as a loss of 1% of body water is usually compensated within 24 h. Both water intake and water losses are controlled to reach water balance. Minute changes in plasma osmolarity are the main factors that trigger these homeostatic mechanisms. Healthy adults regulate water balance with precision, but young infants and elderly people are at greater risk of dehydration. Dehydration can affect consciousness and can induce speech incoherence, extremity weakness, hypotonia of the ocular globes, orthostatic hypotension and tachycardia. Human water requirements are not based on a minimal intake, because that might lead to a water deficit due to the numerous factors that modify water needs (climate, physical activity, diet and so on). Water needs are based on experimentally derived intake levels that are expected to meet the nutritional adequacy of a healthy population. The regulation of water balance is essential for the maintenance of health and life. On average, a sedentary adult should drink 1.5 L of water per day, as water is the only liquid nutrient that is really essential for body hydration.
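
To put the 1% figure in perspective, assuming a 70 kg adult who is roughly 60% water by mass (typical round numbers, not specific to this review):

    # What a loss of 1% of body water means for a typical adult.
    body_mass_kg = 70.0    # assumed adult body mass
    water_fraction = 0.60  # approximate adult body water fraction

    total_body_water_l = body_mass_kg * water_fraction  # ~42 L
    one_percent_l = 0.01 * total_body_water_l
    print(f"1% of body water = {one_percent_l:.2f} L")  # ~0.42 L, compensated within 24 h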

Water Disinfection By-Products and the Risk of Specific Birth Defects: A Population-Based Cross-Sectional Study in Taiwan

Hwang, B.-F., Jaakkola, J., Guo, H.-R., Environmental Health, June 2008

Recent findings suggest that exposure to disinfection by-products may increase the risk of birth defects. Previous studies have focused mainly on birth defects in general or on groups of defects. The objective of the present study was to assess the effect of water disinfection by-products on the risk of the most common specific birth defects. We conducted a population-based cross-sectional study of 396,049 Taiwanese births in 2001-2003 using information from the Birth Registry and Waterworks Registry. We compared the risk of the eleven most common specific defects across four disinfection by-product exposure categories based on levels of total trihalomethanes (TTHMs): high (20+ µg/L), medium (10-19 µg/L), low (5-9 µg/L), and 0-4 µg/L as the reference category. In addition, we conducted a meta-analysis of the results from the present and previous studies focusing on the same birth defects.

Maternal Exposure to Water Disinfection By-products During Gestation and Risk of Hypospadias

Luben, T.J., Nuckols, J.R., Mosley, B.S., Hobbs, C., Reif, J.S., Occupational and Environmental Medicine, June 2008

The use of chlorine for water disinfection results in the formation of numerous contaminants called disinfection by-products (DBPs), which may be associated with birth defects, including urinary tract defects. We used Arkansas birth records (1998-2002) to conduct a population-based case-control study investigating the relationship between hypospadias and two classes of DBPs, trihalomethanes (THM) and haloacetic acids (HAA). We utilised monitoring data, spline regression and geographical information systems (GIS) to link daily concentrations of these DBPs from 263 water utilities to 320 cases and 614 controls. We calculated ORs for hypospadias and exposure to DBPs between 6 and 16 weeks’ gestation, and conducted subset analyses for exposure from ingestion, and metrics incorporating consumption, showering and bathing. We found no increase in risk when women in the highest tertiles of exposure were compared to those in the lowest for any DBP. When ingestion alone was used to assess exposure among a subset of 40 cases and 243 controls, the intermediate tertiles of exposure to total THM and the five most common HAA had ORs of 2.11 (95% CI 0.89 to 5.00) and 2.45 (95% CI 1.06 to 5.67), respectively, compared to women with no exposure. When exposure to total THM from consumption, showering and bathing exposures was evaluated, we found an OR of 1.96 (95% CI 0.65 to 6.42) for the highest tertile of exposure and weak evidence of a dose-response relationship. Our results provide little evidence for a positive relationship between DBP exposure during gestation and an increased risk of hypospadias but emphasize the necessity of including individual-level data when assessing exposure to DBPs.

Formation of N-Nitrosamines from Eleven Disinfection Treatments of Seven Different Surface Waters

Zhao, Y.-Y., et al., Environmental Science & Technology, May 2008

Formation of nine N-nitrosamines was investigated when seven different source waters representing various qualities were each treated with eleven bench-scale disinfection processes, without addition of nitrosamine precursors. These disinfection treatments included chlorine (OCl-), chloramine (NH2Cl), chlorine dioxide (ClO2), ozone (O3), ultraviolet (UV), advanced oxidation processes (AOP), and combinations. The total organic carbon (TOC) of the seven source waters ranged from 2 to 24 mg/L. The disinfected water samples and the untreated source waters were analyzed for nine nitrosamines using a solid-phase extraction and liquid chromatography-tandem mass spectrometry method. Prior to any treatment, N-nitrosodimethylamine (NDMA) was detected at 0 to 53 ng/L in six of the seven source waters, and its concentrations increased in the disinfected water samples (0-118 ng/L). N-nitrosodiethylamine (NDEA), N-nitrosomorpholine (NMor), and N-nitrosodiphenylamine (NDPhA) were also identified in some of the disinfected water samples. NDPhA (0.2-0.6 ng/L) was formed after disinfection with OCl-, NH2Cl, O3, and MPUV/OCl-. N-nitrosomethylethylamine (NMEA) was produced with OCl- and MPUV/OCl-, and NMor formation was associated with O3. In addition, UV treatment alone degraded NDMA; however, UV/OCl- and AOP/OCl- treatments produced higher amounts of NDMA compared to UV and AOP alone, respectively. These results suggest that UV degradation or AOP oxidation treatment may provide a source of NDMA precursors. This study demonstrates that environmental concentrations and mixtures of unknown nitrosamine precursors in source waters can form NDMA and other nitrosamines.

N,N-Dimethylsulfamide as Precursor for N-Nitrosodimethylamine (NDMA) Formation upon Ozonation and its Fate During Drinking Water Treatment

Schmidt, C.K., Brauch, H.-J., Environmental Science & Technology, April 2008

Application and microbial degradation of the fungicide tolylfluanid gives rise to a decomposition product named N,N-dimethylsulfamide (DMS). In Germany, DMS was found in groundwaters and surface waters at typical concentrations in the range of 100-1000 ng/L and 50-90 ng/L, respectively. Laboratory-scale and field investigations concerning its fate during drinking water treatment showed that DMS cannot be removed by riverbank filtration, activated carbon filtration, flocculation, or oxidation and disinfection procedures based on hydrogen peroxide, potassium permanganate, chlorine dioxide, or UV irradiation. Even nanofiltration does not provide sufficient removal efficiency. During ozonation, about 30-50% of the DMS is converted to the carcinogenic N-nitrosodimethylamine (NDMA). The NDMA formed is biodegradable and can be at least partially removed by subsequent biologically active drinking water treatment steps, including sand or activated carbon filtration. Disinfection with hypochlorous acid converts DMS to as-yet unknown degradation products, but not to NDMA or 1,1-dimethylhydrazine (UDMH).

 

Risk of Birth Defects in Australian Communities with High Brominated Disinfection By-product Levels

Chisholm, K., et al., Environmental Health Perspectives, April 2008

By international standards, water supplies in Perth, Western Australia, contain high trihalomethane (THM) levels, particularly the brominated forms. Geographic variability in these levels provided an opportunity to examine cross-city spatial relationships between THM exposure and rates of birth defects (BDs). Our goal was to examine BD rates by exposure to THMs with a highly brominated fraction in metropolitan locations in Perth, Western Australia. We collected water samples from 47 separate locations and analyzed them for total and individual THM concentrations (micrograms per liter), including separation into brominated forms. We classified collection areas by total THM (TTHM) concentration: low (<60 µg/L), medium (>60 to <130 µg/L), and high (≥130 µg/L). We also obtained deidentified registry-based data on total births and BDs (2000-2004 inclusive) from post codes corresponding to water sample collection sites and used binomial logistic regression to compare the frequency of BDs aggregately and separately across the TTHM exposure groups, adjusting for maternal age and socioeconomic status. Total THMs ranged from 36 to 190 µg/L. A high proportion of the THMs were brominated (on average, 92%). Women living in high-TTHM areas showed an increased risk of any BD (odds ratio [OR] = 1.22; 95% confidence interval [CI], 1.01-1.48) and of the major category of any cardiovascular BD (OR = 1.62; 95% CI, 1.04-2.51), compared with women living in low-TTHM areas. Brominated forms constituted the significant fraction of THMs in all areas. Small but statistically significant increases in the risk of BDs were associated with residence in areas with high THMs.

EPA – FACTOIDS: Drinking Water and Ground Water Statistics for 2007

U.S. Environmental Protection Agency, March 2008

There are approximately 156,000 public drinking water systems in the United States. Each of these systems regularly supplies drinking water to at least 25 people or 15 service connections. Beyond their common purpose, the 156,000 systems vary widely. The following tables group water systems into categories that show their similarities and differences. For example, the first table shows that most people in the US (286 million) get their water from a community water system. There are approximately 52,000 community water systems, but just eight percent of those systems (4,048) serve 82 percent of the people. The second table shows that more water systems have groundwater than surface water as a source–but more people drink from a surface water system. Other tables break down these national numbers by state, territory, and EPA region.

This package also contains figures on the types and locations of underground injection control wells. EPA and states regulate the placement and operation of these wells to ensure that they do not threaten underground sources of drinking water. The underground injection control program statistics are based on separate reporting from the states to EPA. The drinking water system statistics on the following pages are taken from the Safe Drinking Water Information System/Federal version (SDWIS/Fed). SDWIS/Fed is the U.S. Environmental Protection Agency’s official record of public drinking water systems, their violations of state and EPA regulations, and enforcement actions taken by EPA or states as a result of those violations. EPA maintains the database using information collected and submitted by the states. Notice: Compliance statistics are based on violations reported by states to the EPA Safe Drinking Water Information System. EPA is aware of inaccuracies and underreporting of some data in this system. We are working with the states to improve the quality of the data.
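
A quick arithmetic check of the headline system counts quoted above (figures as given in the factoid tables):

    # Community water system counts from the EPA factoid tables.
    community_systems = 52_000
    large_systems = 4_048  # the systems serving 82% of the population

    share = 100 * large_systems / community_systems
    print(f"{share:.1f}% of community systems serve 82% of the people")  # ~7.8%, i.e. "just eight percent"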

Human Health Risk Assessment of Chlorinated Disinfection By-products in Drinking Water Using a Probabilistic Approach

Hamidin, N., Yu, Q.J., Connell, D.W., Water Research, March 2008

The presence of chlorinated disinfection by-products (DBPs) in drinking water is a public health issue, due to their possible adverse health effects on humans. To gauge the risk of chlorinated DBPs on human health, a risk assessment of chloroform (trichloromethane (TCM)), bromodichloromethane (BDCM), dibromochloromethane (DBCM), bromoform (tribromomethane (TBM)), dichloroacetic acid (DCAA) and trichloroacetic acid (TCAA) in drinking water was carried out using probabilistic techniques. Literature data on exposure concentrations from more than 15 different countries and adverse health effects on test animals as well as human epidemiological studies were used. The risk assessment showed no overlap between the highest human exposure dose (EXP_D) and the lowest human equivalent dose (HED) from animal test data for TCM, BDCM, DBCM, TBM, DCAA and TCAA. All the HED values were approximately 10^4-10^5 times higher than the 95th percentiles of EXP_D. However, from the human epidemiology data, there was a positive overlap between the highest EXP_D and the lifetime average daily doses (LADD_H) for TCM, BDCM, DCAA and TCAA. This suggests that there are possible adverse health risks, such as a small increased incidence of cancers in males and developmental effects on infants. However, the epidemiological data comprised several risk factors and exposure classification levels, which may affect the overall results.
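
The comparison at the heart of this kind of assessment is a margin of exposure; a minimal sketch with hypothetical placeholder values (not the paper’s data):

    # Margin of exposure: lowest human-equivalent dose (HED) from animal data
    # vs. the 95th percentile of the human exposure dose (EXP_D).
    hed_lowest = 10.0   # hypothetical HED, mg/kg/day
    exp_d_p95 = 1.0e-3  # hypothetical 95th-percentile EXP_D, mg/kg/day

    margin = hed_lowest / exp_d_p95
    print(f"Margin of exposure: {margin:.0e}")  # ~1e4, the order reported for these DBPs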

Drinking Water Disinfection By-Products and Time to Pregnancy

Maclehose, R.F., Savitz, D.A., Herring, A.H., Hartmann, K.E., Singer, P.C., Weinberg, H.S., Epidemiology, March 2008

Laboratory evidence suggests tap water disinfection by-products (DBPs) could have an effect very early in pregnancy, typically before clinical detectability. Undetected early losses would be expected to increase the reported number of cycles to clinical pregnancy. We investigated the association between specific DBPs (trihalomethanes, haloacetic acids, brominated-trihalomethanes, brominated-haloacetic acids, total organic halides, and bromodichloromethane) and time to pregnancy among women who enrolled in a study of drinking water and reproductive outcomes. We quantified exposure to DBPs through concentrations in tap water, quantity ingested through drinking, quantity inhaled or absorbed while showering or bathing, and total integrated exposure. The effect of DBPs on time to pregnancy was estimated using a discrete time hazard model. Overall, we found no evidence of an increased time to pregnancy among women who were exposed to higher levels of DBPs. A modestly decreased time to pregnancy (ie, increased fecundability) was seen among those exposed to the highest level of ingested DBPs, but not for tap water concentration, the amount absorbed while showering or bathing, or the integrated exposure. Our findings extend those of a recently published study suggesting a lack of association between DBPs and pregnancy loss.
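
For context, a discrete-time hazard model of time to pregnancy is usually fit as a logistic regression on one record per woman per cycle; here is a minimal sketch on hypothetical data (statsmodels; none of these variables come from the study):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical person-cycle records: pregnant=1 in the conception cycle.
    rng = np.random.default_rng(1)
    rows = []
    for woman in range(2000):
        exposed = int(rng.integers(0, 2))  # hypothetical high/low DBP exposure
        for cycle in range(1, 13):
            pregnant = int(rng.binomial(1, 0.2))  # ~20% per-cycle fecundability
            rows.append({"cycle": cycle, "exposed": exposed, "pregnant": pregnant})
            if pregnant:
                break  # follow-up ends at conception
    df = pd.DataFrame(rows)

    # Discrete-time hazard: cycle-specific intercepts plus the exposure term.
    fit = smf.logit("pregnant ~ C(cycle) + exposed", data=df).fit(disp=0)
    print(np.exp(fit.params["exposed"]))  # fecundability odds ratio (~1 here by construction)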

Risk of waterborne illness via drinking water in the United States

Reynolds, K.A., Mena, K.D., Gerba, C.P., Reviews of Environmental Contamination & Toxicology, January 2008

Outbreaks of disease attributable to drinking water are not common in the U.S., but they do still occur and can lead to serious acute, chronic, or sometimes fatal health consequences, particularly in sensitive and immunocompromised populations. From 1971 to 2002, there were 764 documented waterborne outbreaks associated with drinking water, resulting in 575,457 cases of illness and 79 deaths (Blackburn et al. 2004; Calderon 2004); however, the true impact of disease is estimated to be much higher. If properly applied, current protocols in municipal water treatment are effective at eliminating pathogens from water. However, inadequate, interrupted, or intermittent treatment has repeatedly been associated with waterborne disease outbreaks. Contamination is not evenly distributed but rather affected by the number of pathogens in the source water, the age of the distribution system, the quality of the delivered water, and climatic events that can tax treatment plant operations. Private water supplies are not regulated by the USEPA and are generally not treated or monitored, although very few of the municipal systems involved in documented outbreaks exceeded the USEPA’s total coliform standard in the preceding 12 months (Craun et al. 2002). We provide here estimates of waterborne infection and illness risks in the U.S. based on the total number of water systems, source water type, and total populations exposed. Furthermore, we evaluated all possible illnesses associated with the microbial infection, not just gastroenteritis. Our results indicate that 10.7 M infections/yr and 5.4 M illnesses/yr occur in populations served by community groundwater systems; 2.2 M infections/yr and 1.1 M illnesses/yr occur in noncommunity groundwater systems; and 26.0 M infections/yr and 13.0 M illnesses/yr occur in municipal surface water systems. The total number of waterborne illnesses in the U.S. is therefore estimated to be 19.5 M/yr. Others have recently estimated waterborne illness rates of 12 M cases/yr (Colford et al. 2006) and 16 M cases/yr (Messner et al. 2006), yet our estimate considers all health outcomes associated with exposure to pathogens in drinking water rather than only gastrointestinal illness. Drinking water outbreaks exemplify known breaches in municipal water treatment and distribution processes and the failure of regulatory requirements to ensure water that is free of human pathogens. Water purification technologies applied at the point of use (POU) can be effective for limiting the effects of source water contamination, treatment plant inadequacies, minor intrusions in the distribution system, or deliberate posttreatment acts (i.e., bioterrorism). Epidemiological studies are conflicting on the benefits of POU water treatment. One prospective intervention study found that consumers of reverse-osmosis (POU) filtered water had 20%-35% fewer gastrointestinal illnesses than those consuming regular tap water, with an excess of 14% of illness attributed to contaminants introduced in the distribution system (Payment 1991, 1997). Two other studies using randomized, blinded, controlled trials determined that the risks were equal among groups supplied with POU-treated water compared to untreated tap water (Hellard et al. 2001; Colford et al. 2003). For immunocompromised populations, POU water treatment devices are recommended by the CDC and USEPA as one treatment option for reducing the risks of Cryptosporidium and other types of infectious agents transmitted by drinking water.
Other populations, including those experiencing “normal” life stages such as pregnancy, or those very young or very old, might also benefit from additional water treatment options beyond the current multibarrier approach of municipal water treatment.
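
The component estimates above sum as stated (a simple arithmetic check of the figures in this abstract):

    # Estimated annual waterborne infections/illnesses by system type (from the abstract).
    estimates = {
        "community groundwater": (10.7e6, 5.4e6),
        "noncommunity groundwater": (2.2e6, 1.1e6),
        "municipal surface water": (26.0e6, 13.0e6),
    }

    total_illnesses = sum(ill for _, ill in estimates.values())
    print(f"Total illnesses/yr: {total_illnesses/1e6:.1f} M")  # 19.5 M, as stated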

Massive Microbiological Groundwater Contamination Associated with a Waterborne Outbreak in Lake Erie, South Bass Island, Ohio

Fong, T.-T., et al., Environmental Health Perspectives, June 2007

A groundwater-associated outbreak affected approximately 1,450 residents and visitors of South Bass Island, Ohio, between July and September 2004. To examine the microbiological quality of groundwater wells located on South Bass Island, we sampled 16 wells that provide potable water to public water systems during 15–21 September 2004. We tested groundwater wells for fecal indicators, enteric viruses and bacteria, and protozoa (Cryptosporidium and Giardia). The hydrodynamics of Lake Erie were examined to explore the possible surface water–groundwater interactions. All wells were positive for both total coliform and Escherichia coli. Seven wells tested positive for enterococci and Arcobacter (an emerging bacterial pathogen), and F+-specific coliphage was present in four wells. Three wells were positive for all three bacterial indicators, coliphages, and Arcobacter; adenovirus DNA was recovered from two of these wells. We found a cluster of the most contaminated wells at the southeast side of the island. Conclusions: Massive groundwater contamination on the island was likely caused by transport of microbiological contaminants from wastewater treatment facilities and septic tanks to the lake and the subsurface, after extreme precipitation events in May–July 2004. This likely raised the water table, saturated the subsurface, and, along with very strong Lake Erie currents on 24 July, forced a surge in water levels and rapid surface water–groundwater interchange throughout the island. Landsat images showed a massive influx of organic material and turbidity surrounding the island before the peak of the outbreak. These combinations of factors and information can be used to examine vulnerabilities in other coastal systems. Both wastewater and drinking water issues are now being addressed by the Ohio Environmental Protection Agency and the Ohio Department of Health.

Analysis of Compliance and Characterization of Violations of the Total Coliform Rule

U.S. Environmental Protection Agency, April 2007

Total coliforms have long been used in drinking water regulations as an indicator of the adequacy of water treatment and the integrity of the distribution system. Total coliforms are a group of closely related bacteria that are generally harmless. In drinking water systems, total coliforms react to treatment in a manner similar to most bacterial pathogens and many viral pathogens. Thus, the presence of total coliforms in the distribution system can indicate that the system is also vulnerable to the presence of pathogens. (EPA, June 2001, page 7) Total coliforms are the indicators used in the existing Total Coliform Rule (TCR). EPA is undertaking “a rulemaking process to initiate possible revisions to the TCR. As part of this process, EPA believes it may be appropriate to include this rulemaking in a wider effort to review and address broader issues associated with drinking water distribution systems.” (see Federal Register 68 FR 19030 and 68 FR 42907). Since the promulgation of the TCR, EPA has received stakeholder feedback suggesting modifications to the TCR to reduce the implementation burden. The purpose of this paper is to provide information on the number and frequency of violations of the TCR and to further characterize the frequency with which different types and sizes of systems incur violations. Although EPA explores some statistical testing in this paper, the paper concentrates on presenting the data, as it is, in SDWIS/FED. Information on these frequencies will be useful in supporting several EPA initiatives, particularly the effort to review and possibly revise the TCR. This paper has been undertaken as part of the review of the TCR.

 

Water Quality Control in Premise Plumbing

Reynolds, K.A., Water Conditioning and Purification, February 2007

The quality of water at the end use is impacted by numerous and varied factors including source water type and quality, age of the distribution system, climatic events and even consumer use patterns. Therefore, providing high-quality drinking water at the tap requires a multi-barrier approach aimed at source water protection, source water treatment and reliable distribution. Each of these steps is monitored and controlled by municipal water treatment standards and guidelines; however, what happens to the water quality beyond the service connection at individual sites is not as well understood. New reports of water quality deterioration in the plumbing of residential or commercial buildings, known as premise plumbing, pose a question: Just what is present in our pipes?

 

Drowning in Disinfection Byproducts? Assessing Swimming Pool Water

DeMarini, D.M., et al., Environmental Science & Technology, January 2007

Disinfection is mandatory for swimming pools: public pools are usually disinfected by gaseous chlorine or sodium hypochlorite and cartridge filters; home pools typically use stabilized chlorine. These methods produce a variety of disinfection byproducts (DBPs), such as trihalomethanes (THMs), which are regulated carcinogenic DBPs in drinking water that have been detected in the blood and breath of swimmers and of nonswimmers at indoor pools. Also produced are halogenated acetic acids (HAAs) and haloketones, which irritate the eyes, skin, and mucous membranes; trichloramine, which is linked with swimming-pool-associated asthma; and halogenated derivatives of UV sunscreens, some of which show endocrine effects. Precursors of DBPs include human body substances, chemicals used in cosmetics and sunscreens, and natural organic matter. Analytical research has also focused on the identification of an additional portion of unknown DBPs using gas chromatography (GC)/mass spectrometry (MS) and liquid chromatography (LC)/MS/MS with derivatization. Child swimmers have an increased risk of developing asthma and infections of the respiratory tract and ear. A 1.6-2.0-fold increased risk for bladder cancer has been associated with swimming or showering/bathing with chlorinated water. Bladder cancer risk from THM exposure (all routes combined) was greatest among those with the GSTT1-1 gene. This suggests a mechanism involving distribution of THMs to the bladder by dermal/inhalation exposure and activation there by GSTT1-1 to mutagens. DBPs may be reduced by engineering and behavioral means, such as applying new oxidation and filtration methods, reducing bromide and iodide in the source water, increasing air circulation in indoor pools, and assuring the cleanliness of swimmers. The positive health effects gained by swimming can be increased by reducing the potential adverse health risks.

An approach for developing a national estimate of waterborne disease due to drinking water and a national estimate model application

Messner, M., et al., Journal of Water and Health, 04.suppl 2, July 2006

In this paper, the US Environmental Protection Agency (EPA) presents an approach and a national estimate of drinking water related endemic acute gastrointestinal illness (AGI) that uses information from epidemiologic studies. There have been a limited number of epidemiologic studies that have measured waterborne disease occurrence in the United States. For this analysis, we assume that a certain unknown incidence of AGI in each public drinking water system is due to drinking water and that a statistical distribution of the different incidence rates for the population served by each system can be estimated to inform a mean national estimate of AGI illness due to drinking water. Data from public water systems suggest that the incidence rate of AGI due to drinking water may vary by several orders of magnitude. In addition, data from epidemiologic studies show AGI incidence due to drinking water ranging from essentially none (or less than the study detection level) to a rate of 0.26 cases per person-year. Considering these two perspectives collectively, and associated uncertainties, EPA has developed an analytical approach and model for generating a national estimate of annual AGI illness due to drinking water. EPA developed a national estimate of waterborne disease to address, in part, the 1996 Safe Drinking Water Act Amendments. The national estimate uses best available science, but also recognizes gaps in the data to support some of the model assumptions and uncertainties in the estimate. Based on the model presented, EPA estimates a mean incidence of AGI attributable to drinking water of 0.06 cases per person-year (with a 95% credible interval of 0.02–0.12). The mean estimate represents approximately 8.5% of cases of AGI illness due to all causes among the population served by community water systems. The estimated incidence translates to 16.4 million cases/year among the same population. The estimate illustrates the potential usefulness and challenges of the approach, and provides a focus for discussions of data needs and future study designs. Areas of major uncertainty that currently limit the usefulness of the approach are discussed in the context of the estimate analysis.
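
The per-person rate and the population-wide total are mutually consistent; the ~273 million population figure below is implied by the two numbers, not stated in the abstract:

    # EPA estimate: mean AGI incidence attributable to drinking water.
    rate_per_person_year = 0.06
    total_cases_per_year = 16.4e6  # among people served by community water systems

    implied_population = total_cases_per_year / rate_per_person_year
    print(f"Implied population served: {implied_population/1e6:.0f} M")  # ~273 M people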

Tap Water Linked to Increase in Bladder Cancer

Reynolds, K.A., Water Conditioning & Purification, July 2006

As water treatment professionals, maybe you’ve been alerted to news stories suggesting a connection between tap water consumption and bladder cancer, but are these headlines true or just media hype? Although the most recently reported association of tap water consumption with bladder cancer is indeed based on numerous epidemiological studies with an international scope, all scientific research must be carefully evaluated, not just in terms of the data found but also for the information possibly missed. The study that has everyone talking again about tap water consumption and its relationship to bladder cancer was published in the International Journal of Cancer (April 2006). Looking at data from six epidemiological studies conducted in five countries (Canada, Finland, France, Italy and two in the United States), a significant association was found between tap water consumption and bladder cancer among men. The risk increased with consumption of greater volumes, suggesting that carcinogenic chemicals in tap water were responsible for the increased risk. While the information presented appears to be sound, it is important to understand the limitations of the study approach so that the data can be appropriately analyzed with respect to public health significance.

Despite a gender bias and inconsistent reports in the historical literature, this study seems to have sturdy legs to stand on, or at least to justify continued research. As mentioned earlier, epidemiology is not a very sensitive science and is complicated by unknown confounders. In addition, this study provides no evidence as to what specific factors related to tap water are causing an increase in cancer, whereas other drinking water sources (i.e., bottled water) showed no association. Water is clearly a heterogeneous mix of contaminants, with vast geographical and temporal fluctuations. Little is known about the combined effects of multiple contaminants found in drinking water; thus, a study of single contaminants and their association with cancer risks would not provide a complete picture of overall exposures.

Volatile Organic Compounds in the Nation’s Drinking-Water Supply Wells – What Findings May Mean to Human Health

U.S. Geological Survey, June 2006

When volatile organic compounds (VOCs) are detected in samples from drinking-water supply wells, it is important to understand what these results may mean to human health. As a first step toward understanding VOC occurrence in the context of human health, a screening-level assessment was conducted by comparing VOC concentrations to human-health benchmarks. One sample from each of 3,497 domestic and public wells was analyzed for 55 VOCs; samples were collected prior to treatment or blending. At least one VOC was detected in 623 well samples (about 18 percent of all well samples) at a threshold of 0.2 part per billion. Eight of the 55 VOCs had concentrations greater than human-health benchmarks in 45 well samples (about 1 percent of all well samples); these concentrations may be of potential human-health concern if the water were to be ingested without treatment for many years. VOC concentrations were less than human-health benchmarks in most well samples with VOC detections, indicating that adverse effects are unlikely to occur, even if water with such concentrations were to be ingested over a lifetime. Seventeen VOCs may warrant further investigation because their concentrations were greater than, or approached, human-health benchmarks.
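
The screening logic reduces to a benchmark quotient per compound; a minimal sketch (the compound names, concentrations, and benchmark values are hypothetical placeholders, not USGS data):

    # Screening-level comparison of measured VOC concentrations to
    # human-health benchmarks (all values in ppb; hypothetical placeholders).
    benchmarks = {"VOC-A": 5.0, "VOC-B": 5.0, "VOC-C": 20.0}
    sample = {"VOC-A": 0.4, "VOC-B": 6.2, "VOC-C": 0.1}

    for voc, conc in sample.items():
        if conc < 0.2:
            continue  # below the 0.2 ppb detection threshold used in the assessment
        status = "exceeds" if conc > benchmarks[voc] else "below"
        print(f"{voc}: {conc} ppb ({status} benchmark)")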

An Approach for Developing a National Estimate Of Waterborne Disease Due to Drinking Water and a National Estimate Model Application

Michael Messner, Susan Shaw, Stig Regli, Ken Rotert, Valerie Blank and Jeff Soller, Journal of Water and Health, 2006;4(Suppl 2):201–240

In this paper, the US Environmental Protection Agency (EPA) presents an approach and a national estimate of drinking-water-related endemic acute gastrointestinal illness (AGI) that uses information from epidemiologic studies. Only a limited number of epidemiologic studies have measured waterborne disease occurrence in the United States. For this analysis, we assume that some unknown incidence of AGI in each public drinking water system is due to drinking water, and that a statistical distribution of the incidence rates across the populations served by each system can be estimated to inform a mean national estimate of AGI due to drinking water. Data from public water systems suggest that the incidence rate of AGI due to drinking water may vary by several orders of magnitude. In addition, data from epidemiologic studies show AGI incidence due to drinking water ranging from essentially none (or less than the study detection level) to 0.26 cases per person-year. Considering these two perspectives collectively, along with the associated uncertainties, EPA has developed an analytical approach and model for generating a national estimate of annual AGI due to drinking water, in part to address the 1996 Safe Drinking Water Act Amendments. The national estimate uses the best available science, but also recognizes gaps in the data supporting some of the model assumptions, as well as uncertainties in the estimate. Based on the model presented, EPA estimates a mean incidence of AGI attributable to drinking water of 0.06 cases per person-year (with a 95% credible interval of 0.02–0.12). The mean estimate represents approximately 8.5% of all-cause AGI among the population served by community water systems, and translates to 16.4 million cases per year among that population. The estimate illustrates the potential usefulness and the challenges of the approach, and provides a focus for discussions of data needs and future study designs. Areas of major uncertainty that currently limit the usefulness of the approach are discussed in the context of the estimate analysis.
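To make the quoted figures concrete, here is a quick consistency check. The served population is not stated in the abstract; roughly 273 million is the value implied by dividing 16.4 million cases/year by 0.06 cases per person-year.

    # Quick consistency check of the EPA estimate quoted above.

    rate = 0.06            # mean AGI cases per person-year due to drinking water
    annual_cases = 16.4e6  # estimated cases/year on community water systems

    implied_population = annual_cases / rate
    print(f"Implied served population: {implied_population / 1e6:.0f} million")  # ~273

    share_of_all_agi = 0.085  # drinking water's share of all-cause AGI (8.5%)
    all_cause_rate = rate / share_of_all_agi
    print(f"Implied all-cause AGI rate: {all_cause_rate:.2f} cases per person-year")  # ~0.71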

Analysis of Bromate and Bromide in Blood

Quinones, O., Snyder, S.A., Cotruvo, J.A., Fisher, J.W., Toxicology, April 2006

Bromate is a regulated disinfection byproduct primarily associated with the ozonation of water containing bromide, but it is also a byproduct of the hypochlorite used to disinfect water. To study the pharmacokinetics of bromate, it is necessary to develop a robust and sensitive analytical method for the identification and quantitation of bromate in blood. A critical issue is the extent to which bromate is degraded presystemically and in blood at low (environmentally relevant) doses of ingested bromate as it is delivered to target tissue. A simple isolation procedure was developed using blood plasma spiked with various levels of bromate and bromide. Blood proteins and lipids were precipitated from plasma using acetonitrile. The resulting extracts were analyzed by ion chromatography with inductively coupled plasma mass spectrometry (IC-ICP/MS), with a method reporting limit of 5 ng/mL plasma for both bromate and bromide. Commercially purchased plasma samples were spiked with bromate and stored for up to 7 days; over that storage period, bromate decay remained under 20% for two spike doses. Decay studies in plasma from spiked blood drawn from live rats showed significant bromate decay in the short period preceding sample freezing, although samples that were spiked, centrifuged, and frozen immediately after drawing yielded excellent analytical recoveries.
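As a rough illustration of the storage-stability claim, if one assumes first-order decay kinetics (an assumption; the abstract does not state the decay law), the "under 20% over 7 days" figure bounds the rate constant and half-life as follows:

    import math

    # First-order bound implied by "decay under 20% over 7 days" (assumed kinetics).
    fraction_remaining = 0.80  # at least 80% of the spike left after storage
    days = 7.0

    k = -math.log(fraction_remaining) / days  # upper-bound rate constant, per day
    half_life = math.log(2) / k               # implied minimum half-life

    print(f"Decay rate constant: <= {k:.3f} per day")  # ~0.032/day
    print(f"Half-life: >= {half_life:.0f} days")       # ~22 days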

Research Strategy for Developing Key Information on Bromate’s Mode of Action

Bull, R.J. and Cotruvo, J.A., Toxicology, April 2006

Bromate is produced when ozone is used to treat waters that contain trace amounts of bromide ion. It is also a contaminant of hypochlorite solutions produced by electrolysis of salt that contains bromide. Both ozone and hypochlorite are extensively used to disinfect drinking water, a process credited with reducing the incidence of waterborne infectious diseases around the world. In studies on experimental animals, bromate has consistently been demonstrated to induce cancer, although there is evidence of substantial species differences in sensitivity (rat > mouse > hamster). There are no data to indicate that bromate is carcinogenic in humans. The continued use of ozone as a disinfectant for bromide-containing drinking waters depends heavily on whether current predictions of carcinogenic risk, which are based on carcinogenic responses in male rats treated with bromate, are accurate at the much lower exposure levels experienced by humans. Thiol-dependent oxidative damage to guanine in DNA is a plausible mode of action for bromate-induced cancer. However, other mechanisms may contribute to the response, including the accumulation of α2u-globulin in the kidney of the male rat. To provide direction to institutions with an interest in clarifying the toxicological risks that bromate in drinking water might pose, a workshop funded by the Awwa Research Foundation was convened to lay out a research strategy that, if implemented, could clarify this important public health issue. The technical issues underlying the deliberations of the workshop are provided in a series of technical papers. The present manuscript summarizes the conclusions of the workgroup with respect to the type and timing of research that should be conducted. The research approach is outlined in four distinct phases that lay out alternative directions as the research plan is implemented. Phase I is designed to quantify presystemic degradation, absorption, distribution, and metabolism of bromate, to associate these with key events for the induction of cancer, and to develop an initial pharmacokinetic (PK) model based on preliminary studies. Phase II will be implemented if there appears to be a linear relationship between external dose and key-event responses; it is designed to gather carcinogenesis data in female rats in the absence of α2u-globulin-induced nephropathy, which the workgroup concluded was a probable contributor to the responses observed in the male rats for which detailed dose–response data were collected. If the key events and external dosimetry are found not to be linear in Phase I, Phase III is initiated with a screening study of the auditory toxicity of bromate to determine whether it is likely to be exacerbated by chronic exposure; if so, auditory toxicity will be further evaluated in Phase IV. If auditory toxicity is determined unlikely to occur, an alternative to the Phase II chronic study in female rats will be implemented that includes exposure in utero, a recommendation made to address the possibility that the fetus may be more susceptible. One of these three options is to be implemented in Phase IV, depending on whether the preliminary data indicate that chronic auditory toxicity, reproductive and/or developmental toxicities, or a combination of these outcomes is necessary to characterize the toxicology of low-dose exposures to bromate.
Each phase of the research will be accompanied by further development of pharmacokinetic models to guide the collection of data appropriate to the needs of the more sophisticated studies. It is suggested that a Bayesian approach be utilized to develop a final risk model, treating the observations from the Phase I studies as prior information to be updated with the data obtained from whichever chronic study is conducted. (A toy numerical illustration of such an update follows the keyword list below.)

  • Bromate;
  • Research to improve risk assessment;
  • Drinking water
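The Bayesian scheme suggested by the workgroup can be illustrated with a toy conjugate (beta-binomial) update, in which Phase I information sets the prior and the chosen chronic study updates it. Every number below is invented for illustration; this is not the workgroup's actual model.

    # Toy conjugate update: Beta prior on tumor incidence, binomial study data.
    a, b = 2.0, 38.0  # hypothetical Beta prior (mean 0.05), standing in for Phase I

    tumors, animals = 4, 50  # hypothetical chronic-study outcome

    a_post = a + tumors
    b_post = b + (animals - tumors)

    print(f"Prior mean incidence:     {a / (a + b):.3f}")                  # 0.050
    print(f"Posterior mean incidence: {a_post / (a_post + b_post):.3f}")   # ~0.067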

Experimental Results from the Reaction of Bromate Ion with Synthetic and Real Gastric Juices

Keith, J.D., Pacey, G.E., Cotruvo, J.A., Gordon,G., Toxicology, February 2006

This study was designed to identify and quantify the effects of reducing agents on the rate of bromate ion reduction in real and synthetic gastric juice. This could be the first element of a pharmacokinetic description of the fate of ingested bromate ion as it is metabolized and subsequently tracked through the system to the target cell or eliminated. Synthetic gastric juice containing H+ and Cl− did exhibit reduced bromate ion levels, but at a rate too slow for a significant amount of bromate to be reduced under typical stomach retention times. The reaction orders for Cl− and H+ were 1.50 and 2.0, respectively. Addition of the reducing agents hydrogen sulfide (which was shown to be present, and was quantified, in real gastric juice), glutathione, and/or cysteine increased the rate of bromate ion loss. All of the reactions showed significant pH effects. Half-lives as short as 2 min were measured for bromate ion reduction in 0.17 M H+ and Cl− with 10⁻⁴ M H2S. Therefore, the lifetime of bromate ion in solutions containing typical gastric juice concentrations of H+, Cl−, and H2S is 20–30 min; this rate should result in as much as a 99% reduction of bromate ion during its residence in the stomach. Bromate ion reduction in real gastric juice occurred at a rapid rate. A comparison of real and synthetic gastric juice containing H+, Cl−, cysteine, glutathione, and hydrogen sulfide showed that the component most responsible for the considerable decrease in bromate ion concentration in the stomach is hydrogen sulfide. (A worked half-life calculation follows the keyword list below.)

  • Bromate;
  • Gastric juice;
  • Ion chromatography;
  • Hydrogen sulfide
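The half-life arithmetic works out as follows under a first-order assumption (consistent with the half-lives the authors report): a 99% reduction corresponds to about 6.6 half-lives, so a 20–30 min stomach residence achieves it whenever the effective half-life is a few minutes or less.

    import math

    # First-order check: the fraction of bromate ion surviving a residence
    # time t is 0.5 ** (t / half_life).

    def fraction_remaining(t_min, half_life_min):
        return 0.5 ** (t_min / half_life_min)

    # Half-lives needed for a 99% reduction:
    print(f"Half-lives for 99% reduction: {math.log2(100):.1f}")  # ~6.6

    # At the fastest measured half-life (2 min), a 20-min residence leaves ~0.1%:
    reduced = 1 - fraction_remaining(20, 2.0)
    print(f"Reduction after 20 min at a 2-min half-life: {100 * reduced:.1f}%")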

Bottled Water Production in the United States: How Much Ground Water is Actually Being Used?

Keith N. Eshleman, Ph.D., Associate Professor, University of Maryland Center for Environmental Science, May 2005

A comprehensive, quantitative survey of bottled water producers in the U.S., presenting data collected on bottled water production, and specifically on production from ground water, the primary source of bottled water. Relative to other uses of ground water, bottled water production was found to be a de minimis user of ground water.

Analysis of the February 1999 Natural Resources Defense Council Report on Bottled Water

Drinking Water Research Foundation, 1999

In February 1999, the Natural Resources Defense Council (NRDC) issued a report entitled “Bottled Water: Pure Drink or Pure Hype?” raising numerous allegations against bottled water. This document provides an extensive analysis and rebuttal of NRDC’s conclusions, highlighting the report’s various factual errors and unsupported allegations.