Exploring the Efficacy of Nile Red in Microplastic Quantification: A Costaining Approach.

Stanton, T., Johnson, M., Nathanail, P., Gomes, R.L., Needham, T., Burson, A., 2019.

ABSTRACT: The presence of microplastic particles (<5 mm) in the environment has generated considerable concern across public, political, and scientific platforms. However, the diversity of microplastics that persist in the environment poses complex analytical challenges for our understanding of their prevalence. The use of the dye Nile red to quantify microplastics is increasingly common. However, its use in microplastic analysis rarely accounts for its affinity with the breadth of particles that occur in environmental samples. Here, we examine Nile red’s ability to stain a variety of microplastic particles and common natural and anthropogenic particles found in environmental samples. To better constrain microplastic estimates using Nile red, we test the coapplication of a second stain that binds to biological material, 4′,6-diamidino-2-phenylindole (DAPI). We test the potential inflation of microplastic estimates using Nile red alone by applying this costaining approach to samples of drinking water and freshwater. The use of Nile red dye alone resulted in a maximum 100% overestimation of microplastic particles. These findings are of particular significance for the public dissemination of findings from an emotive field of study.

https://doi.org/10.1021/acs.estlett.9b00499

Finished Water Storage and Quality Concerns

By Kelly A. Reynolds, MSPH, PhD

The municipal drinking-water distribution system is a complex delivery network designed to meet the potable-water needs of entire communities. Much has been published on concerns about distribution-system integrity and the ability to provide safe, consistent water to consumers. The need for infrastructure improvements, rapid response to main breaks and leaks, biofilm control, prevention of intrusion events, and management of dead legs and pressure losses are just some of the prevalent water-quality delivery issues. Less common are discussions of safe water storage prior to delivery. Although industry standards and guidelines exist, maintaining water quality over prolonged storage presents additional challenges and uncertainties for end users.

http://www.wcponline.com/2018/09/15/finished-water-storage-quality-concerns/

Hydration for Health Conference Emphasizes Vasopressin and Kidney Diseases

Armstrong L.E.

Ann Nutr Metab 2018;72(suppl 2):1–2
The Hydration for Health Scientific Conference remains unique as the world’s only annual gathering that focuses solely on the health benefits of water consumption and creates dialogues among clinicians, scientists, physiologists, dieticians, and global healthcare organizations. The July 4–5, 2017 program included speakers from Australia, Canada, France, Italy, Sweden, the United Kingdom, and the United States. Their presentations considered (a) the positive influences of water consumption on kidney diseases and urinary tract infection (UTI), (b) human neuroendocrine regulation of water and electrolytes, and (c) low daily water consumption as an epidemiologic risk factor for chronic diseases. Three speakers focused on the essential roles of vasopressin (i.e., the antidiuretic hormone) and its surrogate (copeptin) in the sensation of thirst and as a biomarker of renal diseases.

Applicability of the direct injection liquid chromatographic tandem mass spectrometric analytical approach to the sub-ng L-1 determination of perfluoro-alkyl acids in waste, surface, ground and drinking water samples

Author Full Names: Ciofi, Lorenzo; Renai, Lapo; Rossini, Daniele; Ancillotti, Claudia; Falai, Alida; Fibbi, Donatella; Bruzzoniti, Maria Concetta; Juan Santana-Rodriguez, Jose; Orlandini, Serena; Del Bubba, Massimo
Source: TALANTA, 176 412-421; DOI: 10.1016/j.talanta.2017.08.052; JAN 1 2018
Language: English

Abstract: The applicability of a direct-injection UHPLC-MS/MS method for the analysis of several perfluoroalkyl acids (PFAAs) in a wide range of water matrices was investigated. The method is based on the direct injection of 100 µL of centrifuged water sample, without any other sample treatment. Very good method detection limits (0.014-0.44 µg L⁻¹) and excellent intra- and inter-day precision (RSD% values in the ranges 1.8-4.4% and 2.7-5.7%, respectively) were achieved, with a total analysis time of 20 min per sample. A high number of samples, i.e., 8 drinking waters (DW), 12 ground waters (GW), 13 surface waters (SW), and 8 influents and 11 effluents of wastewater treatment plants (WWTPIN and WWTPOUT), were processed and the extent of matrix effect (ME) was calculated, highlighting the strong prevalence of |ME| < 20%. The occurrence of |ME| > 50% was observed only occasionally, for perfluorooctanesulphonic and perfluorodecanoic acids. Linear discriminant analysis highlighted the great contribution of the sample origin (i.e., DW, GW, SW, WWTPIN and WWTPOUT) to the ME. Partial least squares regression (PLS) and leave-one-out cross-validation were performed in order to interpret and predict the signal suppression or enhancement phenomena as a function of the physicochemical parameters of the water samples (i.e., conductivity, hardness and chemical oxygen demand) and the background chromatographic area. The PLS approach provided only an approximate screening, due to the low prediction power of the PLS models. However, for most analytes in most samples, the fitted and cross-validated values were such as to correctly distinguish |ME| higher than 20% from |ME| below this limit. PFAAs in the aforementioned water samples were quantified by means of the standard addition method, highlighting their occurrence mainly in WWTP influents and effluents, at concentrations as high as one hundred µg L⁻¹.
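
The standard-addition quantification and the 20% matrix-effect threshold mentioned in the abstract can be sketched numerically. The helper functions and all numbers below are illustrative assumptions, not the authors' code or data:

```python
# Hedged sketch: standard-addition quantification and matrix-effect (ME) check.
# All values are hypothetical illustrations, not data from the study.

def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def standard_addition_conc(added, signals):
    """Unknown concentration = x-intercept magnitude of the addition curve."""
    slope, intercept = linfit(added, signals)
    return intercept / slope

def matrix_effect(slope_matrix, slope_solvent):
    """%ME: signal suppression (<0) or enhancement (>0) vs. a clean standard."""
    return 100.0 * (slope_matrix / slope_solvent - 1.0)

# Spiked additions (ng/L) and instrument responses in one wastewater sample:
added = [0.0, 1.0, 2.0, 4.0]
signal = [50.0, 150.0, 250.0, 450.0]  # perfectly linear for the example
print(standard_addition_conc(added, signal))  # 0.5 ng/L in the unspiked sample
print(matrix_effect(100.0, 125.0))            # -20.0, i.e. |ME| at the 20% threshold
```

Computing %ME from the ratio of calibration slopes in matrix versus clean solvent is a common convention; a value of |ME| below 20% would be counted here as a mild matrix effect, in the spirit of the screening described above.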

Climate change-induced increases in precipitation are reducing the potential for solar ultraviolet radiation to inactivate pathogens in surface waters

Author Full Names: Williamson, Craig E.; Madronich, Sasha; Lal, Aparna; Zepp, Richard G.; Lucas, Robyn M.; Overholt, Erin P.; Rose, Kevin C.; Schladow, S. Geoffrey; Lee-Taylor, Julia
Source: SCIENTIFIC REPORTS, 7; DOI: 10.1038/s41598-017-13392-2; OCT 12 2017
Language: English

Abstract: Climate change is accelerating the release of dissolved organic matter (DOM) to inland and coastal waters through increases in precipitation, thawing of permafrost, and changes in vegetation. Our modeling approach suggests that the selective absorption of ultraviolet radiation (UV) by DOM decreases the valuable ecosystem service wherein sunlight inactivates waterborne pathogens. Here we highlight the sensitivity of waterborne pathogens of humans and wildlife to solar UV, and use the DNA action spectrum to model how differences in water transparency and incident sunlight alter the ability of UV to inactivate waterborne pathogens. A case study demonstrates how heavy precipitation events can reduce the solar inactivation potential in Lake Michigan, which provides drinking water to over 10 million people. These data suggest that widespread increases in DOM and consequent browning of surface waters reduce the potential for solar UV inactivation of pathogens, and increase exposure to infectious diseases in humans and wildlife.
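
The core physical argument above — more dissolved organic matter means stronger UV attenuation and less solar inactivation at depth — follows Beer-Lambert reasoning, sketched here with invented coefficients (not the paper's model or data):

```python
import math

# Hedged sketch: dissolved organic matter raises the diffuse attenuation
# coefficient Kd (1/m) for UV, lowering the depth-averaged UV irradiance and
# hence the pathogen-inactivation potential. All numbers are illustrative.

def mean_uv_fraction(kd, depth):
    """Depth-averaged fraction of surface UV over a mixed layer of depth z (m):
    (1/z) * integral_0^z exp(-Kd*x) dx = (1 - exp(-Kd*z)) / (Kd*z)."""
    return (1.0 - math.exp(-kd * depth)) / (kd * depth)

clear = mean_uv_fraction(kd=0.5, depth=5.0)  # low-DOM, transparent water
brown = mean_uv_fraction(kd=2.0, depth=5.0)  # after a DOM-loading event
print(clear, brown)  # browning sharply reduces average UV exposure
```

Under these assumed values, quadrupling Kd cuts the depth-averaged UV fraction from roughly 37% to roughly 10% of the surface value, which is the qualitative effect the abstract describes for heavy-precipitation DOM pulses.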

Occurrence of illicit drugs in water and wastewater and their removal during wastewater treatment.

Author(s): Meena K. Yadav; Short, M. D.; Rupak Aryal; Gerber, C.; Akker, B. van der; Saint, C. P.
Source: Water Research, 124 713-727; DOI: 10.1016/j.watres.2017.07.068; 2017

Abstract: This review critically evaluates the types and concentrations of key illicit drugs (cocaine, amphetamines, cannabinoids, opioids and their metabolites) found in wastewater, surface water and drinking water sources worldwide and what is known on the effectiveness of wastewater treatment in removing such compounds. It is also important to amass information on the trends in specific drug use as well as the sources of such compounds that enter the environment and we review current international knowledge on this. There are regional differences in the types and quantities of illicit drug consumption and this is reflected in the quantities detected in water. Generally, the levels of illicit drugs in wastewater effluents are lower than in raw influent, indicating that the majority of compounds can be at least partially removed by conventional treatment processes such as activated sludge or trickling filters. However, the literature also indicates that it is too simplistic to assume non-detection equates to drug removal and/or mitigation of associated risks, as there is evidence that some compounds may avoid detection via inadequate sampling and/or analysis protocols, or through conversion to transformation products. Partitioning of drugs from the water to the solids fraction (sludge/biosolids) may also simply shift the potential risk burden to a different environmental compartment and the review found no information on drug stability and persistence in biosolids. Generally speaking, activated sludge-type processes appear to offer better removal efficacy across a range of substances, but the lack of detail in many studies makes it difficult to comment on the most effective process configurations and operations. There is also a paucity of information on the removal effectiveness of alternative treatment processes. 
Research is also required on natural removal processes in both water and sediments that may over time facilitate further removal of these compounds in receiving environments.

Models for estimation of the presence of non-regulated disinfection by-products in small drinking water systems

Author Full Names: Guilherme, Stephanie; Rodriguez, Manuel J.
Source: ENVIRONMENTAL MONITORING AND ASSESSMENT, 189 (11); DOI: 10.1007/s10661-017-6296-5; NOV 2017
Language: English

Abstract: Among all the organic disinfection by-products (DBPs), only trihalomethanes (THMs) and haloacetic acids (HAAs) are regulated in drinking water, while most DBPs are not. Very little information exists on the occurrence of non-regulated DBPs, particularly in small water systems (SWS). Paradoxically, SWS are more vulnerable to DBPs because of a low capacity to implement adequate treatment technologies to remove DBP precursors. Since DBP analyses are expensive, SWS usually have difficulty implementing a rigorous characterization of these contaminants. The purpose of this study was to estimate non-regulated DBP levels in SWS from easy measurements of relevant, regularly monitored parameters. Since no information on non-regulated DBPs in SWS was available, a sampling program was carried out in 25 SWS in two provinces of Canada. Five DBP families were investigated: THMs, HAAs, haloacetonitriles (HANs), halonitromethanes (HNMs), and haloketones (HKs). Multivariate linear mixed regression models were developed to estimate HAN, HK, and HNM levels from water quality characteristics in the water treatment plant, concentrations of regulated DBPs, and residual disinfectant levels. The models obtained have good explanatory capacity, with R² varying from 0.77 to 0.91 depending on the compound and the conditions of application (season and type of treatment). Model validation with an independent database suggested their ability to generalize to similar SWS in North America.

Children’s Lead Exposure: A Multimedia Modeling Analysis to Guide Public Health Decision-Making

Author Full Names: Zartarian, Valerie; Xue, Jianping; Tornero-Velez, Rogelio; Brown, James
Source: ENVIRONMENTAL HEALTH PERSPECTIVES, 125 (9); DOI: 10.1289/EHP1605; SEP 2017
Language: English

Abstract: BACKGROUND: Drinking water and other sources for lead are the subject of public health concerns around the Flint, Michigan, drinking water and East Chicago, Indiana, lead in soil crises. In 2015, the U.S. Environmental Protection Agency (EPA)’s National Drinking Water Advisory Council (NDWAC) recommended establishment of a “health-based, household action level” for lead in drinking water based on children’s exposure.

OBJECTIVES: The primary objective was to develop a coupled exposure-dose modeling approach that can be used to determine what drinking water lead concentrations keep children’s blood lead levels (BLLs) below specified values, considering exposures from water, soil, dust, food, and air. Related objectives were to evaluate the coupled model estimates using real-world blood lead data, to quantify relative contributions by the various media, and to identify key model inputs.

METHODS: A modeling approach using the EPA’s Stochastic Human Exposure and Dose Simulation (SHEDS)-Multimedia and Integrated Exposure Uptake and Biokinetic (IEUBK) models was developed using available data. This analysis for the U.S. population of young children probabilistically simulated multimedia exposures and estimated relative contributions of media to BLLs across all population percentiles for several age groups.

RESULTS: Modeled BLLs compared well with nationally representative BLLs (0-23% relative error). Analyses revealed relative importance of soil and dust ingestion exposure pathways and associated Pb intake rates; water ingestion was also a main pathway, especially for infants.

CONCLUSIONS: This methodology advances scientific understanding of the relationship between lead concentrations in drinking water and BLLs in children. It can guide national health-based benchmarks for lead and related community public health decisions. https://doi.org/10.1289/EHP1605.

Strategies to Improve Private-Well Water Quality: A North Carolina Perspective

Author Full Names: Gibson, Jacqueline MacDonald; Pieper, Kelsey J.
Source: ENVIRONMENTAL HEALTH PERSPECTIVES, 125 (7); DOI: 10.1289/EHP890; JUL 2017
Language: English

Abstract: BACKGROUND: Evidence suggests that the 44.5 million U.S. residents drawing their drinking water from private wells face higher risks of waterborne contaminant exposure than those served by regulated community water supplies. Among U.S. states, North Carolina (N.C.) has the second-largest population relying on private wells, making it a useful microcosm to study challenges to maintaining private-well water quality.

OBJECTIVES: This paper summarizes recommendations from a two-day summit to identify options to improve drinking-water quality for N.C. residents served by private wells.

METHODS: The Research Triangle Environmental Health Collaborative invited 111 participants with knowledge of private-well water challenges to attend the Summit. Participants worked in small groups that focused on specific aspects and reconvened in plenary sessions to formulate consensus recommendations.

DISCUSSION: Summit participants highlighted four main barriers to ensuring safe water for residents currently relying on private wells: (1) a database of private well locations is unavailable; (2) racial disparities have perpetuated reliance on private wells in some urbanized areas; (3) many private well users lack information or resources to monitor and maintain their wells; and (4) private-well support programs are fragmented and lack sufficient resources. The Summit produced 10 consensus recommendations for ways to overcome these barriers.

CONCLUSIONS: The Summit recommendations, if undertaken, could improve the health of North Carolinians facing elevated risks of exposure to waterborne contaminants because of their reliance on inadequately monitored and maintained private wells. Because many of the challenges in N.C. are common nationwide, these recommendations could serve as models for other states.

Still Treating Lead Poisoning After All These Years

Twenty-five years ago, in a commentary published in Pediatrics, Drs. Needleman and Jackson [1] asked whether we would still be treating lead poisoning in the 21st century. Unfortunately, despite considerable progress, our public health system is still failing to prevent children from being lead poisoned, and the specter of lead poisoning continues to cast a shadow over the country: over 500 000 American children have a blood lead level of >5 μg/dL (>50 ppb); 23 million homes have 1 or more lead hazards; an unknown number of Americans drink water from lead service lines; and federal standards for lead in house dust, soil, and water fail to protect children. We have understandably focused on the plight of children in Flint, Michigan, but children in hundreds of other cities have blood lead levels higher than the children of Flint.


http://pediatrics.aappublications.org/content/early/2017/07/14/peds.2017-1400

Surveillance for Waterborne Disease Outbreaks Associated with Drinking Water — United States, 2013–2014

Benedict KM, Reses H, Vigar M, et al. Surveillance for Waterborne Disease Outbreaks Associated with Drinking Water — United States, 2013–2014. MMWR Morb Mortal Wkly Rep 2017;66:1216–1221. DOI: http://dx.doi.org/10.15585/mmwr.mm6644a3

Provision of safe water in the United States is vital to protecting public health (1). Public health agencies in the U.S. states and territories* report information on waterborne disease outbreaks to CDC through the National Outbreak Reporting System (NORS) (https://www.cdc.gov/healthywater/surveillance/index.html). During 2013–2014, 42 drinking water–associated outbreaks were reported, accounting for at least 1,006 cases of illness, 124 hospitalizations, and 13 deaths. Legionella was associated with 57% of these outbreaks and all of the deaths. Sixty-nine percent of the reported illnesses occurred in four outbreaks in which the etiology was determined to be either a chemical or toxin or the parasite Cryptosporidium. Drinking water contamination events can cause disruptions in water service, large impacts on public health, and persistent community concern about drinking water quality. Effective water treatment and regulations can protect public drinking water supplies in the United States, and rapid detection, identification of the cause, and response to illness reports can reduce the transmission of infectious pathogens and harmful chemicals and toxins.


Energy and sports drinks in children and adolescents

Principal author(s)

Catherine M Pound, Becky Blair; Canadian Paediatric Society, Nutrition and Gastroenterology Committee

Abstract

Sports drinks and caffeinated energy drinks (CEDs) are commonly consumed by youth. Both sports drinks and CEDs pose potential risks for the health of children and adolescents and may contribute to obesity. Sports drinks are generally unnecessary for children engaged in routine or play-based physical activity. CEDs may affect children and adolescents more than adults because they weigh less and thus experience greater exposure to stimulant ingredients per kilogram of body weight. Paediatricians need to recognize and educate patients and families on the differences between sport drinks and CEDs. Screening for the consumption of CEDs, especially when mixed with alcohol, should be done routinely. The combination of CEDs and alcohol may be a marker for higher risk of substance use or abuse and for other health-compromising behaviours.

Keywords: Alcohol; Caffeine; CEDs; Energy drinks; Sports drinks

Hydration status of community-dwelling seniors.

Abstract: Dehydration is the most common fluid or electrolyte disorder among older persons. This study was designed to examine the hydration status of community-dwelling seniors.

Hydration in the Aging

Conclusion: With aging, body water stores decrease, thirst sensation is disturbed and kidneys are less able to concentrate urine, putting the elderly at increased risk of dehydration.

Evidence-based nutrition education: Elderly hydration issues

Dehydration is a serious, yet modifiable clinical problem for older adults. The purpose of this study was to evaluate and compare the impact of two hydration education methods, brochure and individual counseling, on clinical, cognitive, behavioral,…

Oral Hydration in Older Adults: Greater awareness is needed in preventing, recognizing, and treating dehydration.

OVERVIEW: Maintaining adequate fluid balance is an essential component of health at every stage of life. Age-related changes make older adults more vulnerable to shifts in water balance that can result in over-hydration or, more frequently, dehydration. This article reviews age-related changes, risk factors, assessment measures, and nursing interventions for dehydration.


Significant racial, ethnic, income disparities in hydration found among U.S. adults

Nearly a third of U.S. adults are not hydrated enough, and poorer adults as well as Black and Hispanic adults are at higher risk for poor hydration than wealthier and white adults, according to a new study from Harvard T.H. Chan School of Public Health.

Lack of access to clean, safe drinking water—as highlighted by recent water crises in communities such as Flint, Michigan—may be one of the main reasons for the disparities, the authors suggested.

The study appeared online July 20, 2017 in the American Journal of Public Health.

Effect of increased water intake on plasma copeptin in healthy adults

Published online: June 3, 2017, European Journal of Nutrition (IF 3.239). Guillaume Lemetais · Olle Melander · Mariacristina Vecchio · Jeanne H. Bottin · Sofia Enhörning · Erica T. Perrier

Abstract

PURPOSE:

Inter-individual variation in median plasma copeptin is associated with incident type 2 diabetes mellitus, progression of chronic kidney disease, and cardiovascular events. In this study, we examined whether 24-h urine osmolality was associated with plasma copeptin and whether increasing daily water intake could impact circulating plasma copeptin.

METHODS:

This trial was a prospective study conducted at a single investigating center. Eighty-two healthy adults (age 23.6 ± 2.9 years, BMI 22.2 ± 1.5 kg/m2, 50% female) were stratified based upon habitual daily fluid intake volumes: arm A (50-80% of EFSA dietary reference values), arm B (81-120%), and arm C (121-200%). Following a baseline visit, arms A and B increased their drinking water intake to match arm C for a period of 6 consecutive weeks.

RESULTS:

At baseline, plasma copeptin was positively and significantly associated with 24-h urine osmolality (p = 0.002) and 24-h urine specific gravity (p = 0.003) but not with plasma osmolality (p = 0.18), 24-h urine creatinine (p = 0.09), and total fluid intake (p = 0.52). Over the 6-week follow-up, copeptin decreased significantly from 5.18 (3.3;7.4) to 3.90 (2.7;5.7) pmol/L (p = 0.012), while urine osmolality and urine specific gravity decreased from 591 ± 206 to 364 ± 117 mOsm/kg (p < 0.001) and from 1.016 ± 0.005 to 1.010 ± 0.004 (p < 0.001), respectively.

CONCLUSIONS:

At baseline, circulating levels of copeptin were positively associated with 24-h urine concentration in healthy young subjects with various fluid intakes. Moreover, this study shows, for the first time, that increased water intake over 6 weeks results in an attenuation of circulating copeptin.

CLINICAL TRIAL REGISTRATION NUMBER:

NCT02044679.

KEYWORDS:

Copeptin; Fluid intake; Hydration; Urine osmolality; Water intake; drinking water


Effective Immediately: Healthcare Facilities Required to Reduce Legionellosis Risks from Tap Water

Published July 2017

By Kelly A. Reynolds, MSPH, PhD

If you follow On Tap frequently, you know that the bacterium Legionella has been a repeated topic in recent years. Once again, Legionella is at the forefront of discussions due to continuing waterborne outbreaks and new directives for prevention in healthcare facilities. On June 2, the Department of Health and Human Services, Centers for Medicare and Medicaid Services (CMS) issued a memo that will undoubtedly expand awareness of Legionella risks and further drive the implementation of preventative approaches.

The Unintended Consequences of Changes in Beverage Options and the Removal of Bottled Water on a University Campus

Elizabeth R. Berman, BS, and Rachel K. Johnson, PhD, MPH, RD

Accepted: January 16, 2015
Published Online: June 05, 2015

Objectives. We investigated how the removal of bottled water along with a minimum healthy beverage requirement affected the purchasing behavior, healthiness of beverage choices, and consumption of calories and added sugars of university campus consumers.

Methods. With shipment data as a proxy, we estimated bottled beverage consumption over 3 consecutive semesters: baseline (spring 2012), when a 30% healthy beverage ratio was enacted (fall 2012), and when bottled water was removed (spring 2013) at the University of Vermont. We assessed changes in number and type of beverages and per capita calories, total sugars, and added sugars shipped.

Results. Per capita shipments of bottles, calories, sugars, and added sugars increased significantly when bottled water was removed. Shipments of healthy beverages declined significantly, whereas shipments of less healthy beverages increased significantly. As bottled water sales dropped to zero, sales of sugar-free beverages and sugar-sweetened beverages increased.

Conclusions. The bottled water ban did not reduce the number of bottles entering the waste stream from the university campus, the ultimate goal of the ban. With the removal of bottled water, consumers increased their consumption of less healthy bottled beverages.

Characterizing pharmaceutical, personal care product, and hormone contamination in a karst aquifer of southwestern Illinois, USA, using water quality and stream flow parameters

Dodgen, L.K., et al., Science of the Total Environment, 578: 281-289, February 2017

Karst aquifers are drinking water sources for 25% of the global population. However, the unique geology of karst areas facilitates rapid transfer of surficial chemicals to groundwater, potentially contaminating drinking water. Contamination of karst aquifers by nitrate, chloride, and bacteria has been previously observed, but little knowledge is available on the presence of contaminants of emerging concern (CECs), such as pharmaceuticals. Over a 17-month period, 58 water samples were collected from 13 sites in the Salem Plateau, a karst region in southwestern Illinois, United States. Water was analyzed for 12 pharmaceutical and personal care products (PPCPs), 7 natural and synthetic hormones, and 49 typical water quality parameters (e.g., nutrients and bacteria). Hormones were detected in only 23% of samples, with concentrations of 2.2–9.1 ng/L. In contrast, PPCPs were quantified in 89% of groundwater samples. The two most commonly detected PPCPs were the antimicrobial triclocarban, in 81% of samples, and the cardiovascular drug gemfibrozil, in 57%. Analytical results were combined with data on local stream flow, weather, and land use to 1) characterize the extent of aquifer contamination by CECs, 2) cluster sites with similar PPCP contamination profiles, and 3) develop models to describe PPCP contamination. Median detection in karst groundwater was 3 PPCPs at a summed concentration of 4.6 ng/L. Sites clustered into 3 subsets with unique contamination models. PPCP contamination in Cluster I sites was related to stream height, manganese, boron, and heterotrophic bacteria. Cluster II sites were characterized by groundwater temperature, specific conductivity, sodium, and calcium. Cluster III sites were characterized by dissolved oxygen and barium. Across all sites, no single or small set of water quality factors was significantly predictive of PPCP contamination, although gemfibrozil concentrations were strongly related to the sum of PPCPs in karst groundwater.

Determination of dimethyl selenide and dimethyl sulphide compounds causing off-flavours in bottled mineral waters

Guadayol, M., et al., Water Research, 92: 149-155; April 2016

Sales of bottled drinking water have grown strongly during the last two decades due to the general belief that this kind of water is healthier, its flavour is better and its consumption risk is lower than that of tap water. Consequently, consumers are more demanding about bottled mineral water, especially regarding its organoleptic properties, like taste and odour. This work studies the compounds that can generate obnoxious smells, which consumers have described as swampy, rotten eggs, sulphurous, cooked vegetable or cabbage. Closed loop stripping analysis (CLSA) has been used as a pre-concentration method for the analysis of off-flavour compounds in water, followed by identification and quantification by means of GC-MS. Several bottled waters with the aforementioned smells showed the presence of volatile dimethyl selenides and dimethyl sulphides, whose concentrations ranged, respectively, from 4 to 20 ng/L and from 1 to 63 ng/L. The low odour threshold concentrations (OTCs) of both organic selenide and sulphide derivatives prove that several objectionable odours in bottled waters arise from them. Microbial loads inherent to water sources, along with some critical conditions in water processing, could contribute to the formation of these compounds. There are few studies of volatile organic compounds in bottled drinking water and, to the best of our knowledge, this is the first study reporting the presence of dimethyl selenides and dimethyl sulphides causing odour problems in bottled waters.

Multimedia exposures to arsenic and lead for children near an inactive mine tailings and smelter site

Loh, M.M., et al., Environmental Research, Volume 146, p. 331–339, April 2016

Children living near contaminated mining waste areas may have high exposures to metals from the environment. This study investigates whether exposure to arsenic and lead is higher in children in a community near a legacy mine and smelter site in Arizona compared to children in other parts of the United States and the relationship of that exposure to the site. Arsenic and lead were measured in residential soil, house dust, tap water, urine, and toenail samples from 70 children in 34 households up to 7 miles from the site. Soil and house dust were sieved, digested, and analyzed via ICP-MS. Tap water and urine were analyzed without digestion, while toenails were washed, digested and analyzed. Blood lead was analyzed by an independent, certified laboratory. Spearman correlation coefficients were calculated between each environmental media and urine and toenails for arsenic and lead. Geometric mean arsenic (standard deviation) concentrations for each matrix were: 22.1 (2.59) ppm and 12.4 (2.27) ppm for soil and house dust.

Contrasting regional and national mechanisms for predicting elevated arsenic in private wells across the United States using classification and regression trees

Frederick, L., et al., Water Research, March 2016

Arsenic contamination in groundwater is a public health and environmental concern in the United States (U.S.), particularly where monitoring is not required under the Safe Drinking Water Act. Previous studies suggest the influence of regional mechanisms for arsenic mobilization into groundwater; however, no study has examined how influencing parameters change at a continental scale spanning multiple regions. We herein examine covariates for groundwater in the western, central and eastern U.S. regions representing mechanisms associated with arsenic concentrations exceeding the U.S. Environmental Protection Agency maximum contaminant level (MCL) of 10 parts per billion (ppb). Statistically significant covariates were identified via classification and regression tree (CART) analysis, and included hydrometeorological and groundwater chemical parameters. The CART analyses were performed at two scales: national and regional; for the latter, three physiographic regions located in the western (Payette Section and the Snake River Plain), central (Osage Plains of the Central Lowlands), and eastern (Embayed Section of the Coastal Plains) U.S. were examined. Validity of each of the three regional CART models was indicated by values >85% for the area under the receiver-operating characteristic curve. Aridity (precipitation minus potential evapotranspiration) was identified as the primary covariate associated with elevated arsenic at the national scale. At the regional scale, aridity and pH were the major covariates in the arid to semi-arid (western) region, whereas dissolved iron (taken to represent chemically reducing conditions) and pH were major covariates in the temperate (eastern) region, although additional important covariates emerged, including elevated phosphate. Analysis in the central U.S. region indicated that elevated arsenic concentrations were driven by a mixture of the mechanisms observed in the western and eastern regions.
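
The CART approach described above repeatedly picks the covariate threshold that best separates wells above and below the MCL. A minimal, self-contained illustration of that split criterion (made-up data, Gini impurity, not the authors' implementation) looks like this:

```python
# Hedged, minimal illustration of the CART split idea: find the covariate
# threshold that best separates wells above vs. below the 10 ppb arsenic MCL
# by minimizing weighted Gini impurity. The data below are invented.

def gini(labels):
    """Gini impurity of a set of 0/1 labels (0 = pure)."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(values, labels):
    """Return (threshold, weighted impurity) of the best binary split."""
    best = (None, float("inf"))
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[1]:
            best = (t, score)
    return best

# Hypothetical aridity index per well, and whether arsenic exceeded 10 ppb:
aridity = [0.2, 0.3, 0.4, 1.1, 1.3, 1.5]
exceeds = [0, 0, 0, 1, 1, 1]
print(best_split(aridity, exceeds))  # a perfect split at aridity <= 0.4
```

A full CART model applies this search recursively over many covariates; tree libraries such as scikit-learn automate exactly this recursion, but the one-split version above is enough to show why aridity would surface as a dominant covariate when it cleanly partitions exceedances.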

Effect of a School-Based Water Intervention on Child Body Mass Index and Obesity

Schwartz, A.E., et al., JAMA Pediatrics, March 2016

Decreasing consumption of caloric beverages while simultaneously increasing water consumption promotes child health and may decrease the prevalence of childhood obesity. The aim of this study was to estimate the impact of water jets (electrically cooled, large clear jugs with a push lever for fast dispensing) on standardized body mass index, overweight, and obesity in elementary school and middle school students. Milk purchases were explored as a potential mechanism for weight outcomes. This quasi-experimental study used a school-level database of cafeteria equipment deliveries between the 2008-2009 and 2012-2013 school years and included a sample of 1227 New York, New York, public elementary schools and middle schools and the 1 065 562 students within those schools. The intervention was installation of water jets in schools. Individual body mass index (BMI) was calculated for all students in the sample using annual student-level height and weight measurements collected as part of New York’s FITNESSGRAM initiative. Age- and sex-specific growth charts produced by the Centers for Disease Control and Prevention were used to categorize students as overweight and obese. The hypothesis that water jets would be associated with decreased standardized BMI, overweight, and obesity was tested using a difference-in-difference strategy, comparing outcomes for treated and nontreated students before and after the introduction of a water jet. There was a significant effect of water jets on standardized BMI, such that the adoption of water jets was associated with a 0.025 (95% CI, -0.038 to -0.011) reduction of standardized BMI for boys and a 0.022 (95% CI, -0.035 to -0.008) reduction of standardized BMI for girls (P < .01). There was also a significant effect on being overweight. 
Water jets were associated with a 0.9 percentage point reduction (95% CI, 0.015-0.003) in the likelihood of being overweight for boys and a 0.6 percentage point reduction (95% CI, 0.011-0.000) in the likelihood of being overweight for girls (P < .05). We also found a decrease of 12.3 (95% CI, -19.371 to -5.204) in the number of half-pints of milk (all types) purchased per student per year (P < .01). Results from this study show an association between a relatively low-cost water availability intervention and decreased student weight. Additional research is needed to examine potential mechanisms for decreased student weight, including reduced milk taking, as well as to assess impacts on longer-term outcomes.
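The difference-in-difference logic behind these estimates reduces to simple arithmetic: compare the pre-to-post change in treated schools against the same change in untreated schools. A sketch with fabricated group means (the authors' actual model also adjusts for student and school covariates):

```python
# Minimal sketch of the difference-in-differences estimator, not the
# authors' actual regression. All group means below are fabricated.
mean_std_bmi = {
    ("treated", "pre"): 0.640, ("treated", "post"): 0.620,
    ("control", "pre"): 0.650, ("control", "post"): 0.655,
}
change_treated = mean_std_bmi[("treated", "post")] - mean_std_bmi[("treated", "pre")]
change_control = mean_std_bmi[("control", "post")] - mean_std_bmi[("control", "pre")]
# Subtracting the control-group trend removes shared secular changes,
# leaving the effect attributable to the water-jet installation.
did_estimate = change_treated - change_control
```

The control-group change acts as the counterfactual trend, so shared influences such as citywide dietary shifts cancel out of the estimate.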

Use of a Cumulative Exposure Index to Estimate the Impact of Tap Water Lead Concentration on Blood Lead Levels in 1- to 5-Year-Old Children (Montréal, Canada)

Ngueta, G., Environmental Health Perspectives, March 2016

Drinking water is recognized as a source of lead (Pb) exposure. However, questions remain about the impact of chronic exposure to lead-contaminated water on internal dose. Our goal was to estimate the relation between a cumulative water Pb exposure index (CWLEI) and blood Pb levels (BPb) in children 1–5 years of age. Between 10 September 2009 and 27 March 2010, individual characteristics and water consumption data were obtained from 298 children. Venous blood samples were collected (one per child) and a total of five 1-L samples of water per home were drawn from the kitchen tap. A second round of water collection was performed between 22 June 2011 and 6 September 2011 on a subsample of houses. Pb analyses used inductively coupled plasma mass spectrometry. Multiple linear regressions were used to estimate the association between CWLEI and BPb. Each 1-unit increase in CWLEI multiplies the expected value of BPb by 1.10 (95% CI: 1.06, 1.15) after adjustment for confounders. Mean BPb was significantly higher in children in the third and fourth quartiles of CWLEI (0.7–1.9 and ≥ 1.9 μg/kg of body weight) compared with the first (< 0.2 μg/kg) after adjusting for confounders (19%; 95% CI: 0, 42% and 39%; 95% CI: 15, 67%, respectively). The trend analysis yielded a p-value < 0.0001 after adjusting for confounders, suggesting a dose–response relationship between percentiles of CWLEI and BPb. In children 1–5 years of age, BPb was significantly associated with water lead concentration, with an increase starting at a cumulative lead exposure of ≥ 0.7 μg Pb/kg of body weight. In this age group, an increase of 1 μg/L in water lead would result in an increase of 35% of BPb after 150 days of exposure.
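The reported multiplicative effect implies a log-linear model: expected BPb scales by a factor of 1.10 for each CWLEI unit. A brief sketch; the 1.10 ratio comes from the abstract, but the baseline blood lead value below is hypothetical:

```python
# Hedged sketch of the reported multiplicative (log-linear) relationship.
ratio_per_unit = 1.10   # from the abstract (95% CI: 1.06, 1.15)
baseline_bpb = 1.0      # hypothetical BPb (ug/dL) at CWLEI = 0, NOT from the paper

def expected_bpb(cwlei, baseline=baseline_bpb):
    """Expected blood lead under a multiplicative model: baseline * 1.10^CWLEI."""
    return baseline * ratio_per_unit ** cwlei

# Under this model, a 3-unit increase in cumulative exposure raises
# expected BPb by about 33% regardless of the baseline chosen.
increase = expected_bpb(3) / expected_bpb(0) - 1
```

Because the model is multiplicative, the percentage increase for a given CWLEI change is the same at any baseline, which is why the abstract can state a single ratio per unit.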

Elevated Blood Lead Levels in Children Associated With the Flint Drinking Water Crisis: A Spatial Analysis of Risk and Public Health Response

Hanna-Attisha, M., et al., American Journal of Public Health, Vol. 106, No. 2, February 2016

We analyzed differences in pediatric elevated blood lead level incidence before and after Flint, Michigan, introduced a more corrosive water source into an aging water system without adequate corrosion control. We reviewed blood lead levels for children younger than 5 years before (2013) and after (2015) water source change in Greater Flint, Michigan. We assessed the percentage of elevated blood lead levels in both time periods, and identified geographical locations through spatial analysis. Incidence of elevated blood lead levels increased from 2.4% to 4.9% (P < .05) after water source change, and neighborhoods with the highest water lead levels experienced a 6.6% increase. No significant change was seen outside the city. Geospatial analysis identified disadvantaged neighborhoods as having the greatest elevated blood lead level increases and informed response prioritization during the now-declared public health emergency. It was concluded that the percentage of children with elevated blood lead levels increased after water source change, particularly in socioeconomically disadvantaged neighborhoods. Water is a growing source of childhood lead exposure because of aging infrastructure.
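The headline change (2.4% to 4.9%) can be checked with a standard two-proportion z-test. The sketch below uses hypothetical sample sizes; the published analysis used the actual screening counts:

```python
# Back-of-envelope check of the reported change in elevated blood lead
# incidence. Sample sizes are hypothetical, not from the study.
import math

p1, n1 = 0.024, 1000   # pre-switch proportion, hypothetical n
p2, n2 = 0.049, 1000   # post-switch proportion, hypothetical n

# Pooled proportion under the null hypothesis of no change
pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se          # two-proportion z-statistic; |z| > 1.96 => P < .05

relative_increase = p2 / p1  # incidence roughly doubled
```

Even at these modest hypothetical sample sizes the difference clears the conventional significance threshold, consistent with the reported P < .05.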

Plain water consumption in relation to energy intake and diet quality among US adults, 2005–2012

An, R. and McCaffrey, J., Journal of Human Nutrition and Dietetics, February 2016

The present study examined plain water consumption in relation to energy intake and diet quality among US adults. A nationally representative sample of 18 311 adults aged ≥18 years, from the National Health and Nutrition Examination Survey 2005–2012, was analysed. The first-difference estimator approach addressed confounding bias from time-invariant unobservables (e.g. eating habits, taste preferences) by using within-individual variations in diet and plain water consumption between two nonconsecutive 24-h dietary recalls. A one percentage point increase in the proportion of daily plain water in total dietary water consumption was associated with a reduction in mean (95% confidence interval) daily total energy intake of 8.58 (7.87–9.29) kcal, energy intake from sugar-sweetened beverages of 1.43 (1.27–1.59) kcal, energy intake from discretionary foods of 0.88 (0.44–1.32) kcal, total fat intake of 0.21 (0.17–0.25) g, saturated fat intake of 0.07 (0.06–0.09) g, sugar intake of 0.74 (0.67–0.82) g, sodium intake of 9.80 (8.20–11.39) mg and cholesterol intake of 0.88 (0.64–1.13) mg. The effects of plain water intake on diet were similar across race/ethnicity, educational attainment, income level and body weight status, whereas they were larger among males and young/middle-aged adults than among females and older adults, respectively. Daily overall diet quality measured by the Healthy Eating Index-2010 was not found to be associated with the proportion of daily plain water in total dietary water consumption. It was concluded that promoting plain water intake could be a useful public health strategy for reducing energy and targeted nutrient consumption in US adults, which warrants confirmation in future controlled interventions.
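The first-difference estimator amounts to regressing within-person changes in intake on within-person changes in the plain-water share, so stable traits such as taste preferences drop out. A self-contained sketch on fabricated two-recall data (not the NHANES records):

```python
# Sketch of the first-difference estimator idea. All data are fabricated;
# the actual study also adjusts for time-varying covariates.

def first_difference_slope(water_share, energy):
    """OLS slope of (energy day2 - day1) on (water share day2 - day1).
    Each argument is a list of (day1, day2) pairs, one per person."""
    dw = [b - a for a, b in water_share]
    de = [b - a for a, b in energy]
    n = len(dw)
    mw, me = sum(dw) / n, sum(de) / n
    cov = sum((w - mw) * (e - me) for w, e in zip(dw, de))
    var = sum((w - mw) ** 2 for w in dw)
    return cov / var

# Hypothetical pairs: plain-water share of total dietary water (%), kcal/day
water = [(30, 40), (50, 45), (20, 35), (60, 55)]
kcal = [(2200, 2120), (2000, 2040), (2500, 2370), (1900, 1945)]
slope = first_difference_slope(water, kcal)  # kcal change per percentage point
```

Differencing each person against themselves is what removes the time-invariant unobservables; any trait constant across both recalls contributes equally to both terms and cancels.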

Human exposure to thallium through tap water: A study from Valdicastello Carducci and Pietrasanta (northern Tuscany, Italy)

Campanella, B., et al., Science of the Total Environment, January 2016

A geological study evidenced the presence of thallium (Tl) at concentrations of concern in groundwaters near Valdicastello Carducci (Tuscany, Italy). The source of contamination has been identified in the Tl-bearing pyrite ores occurring in the abandoned mining sites of the area. The strongly acidic internal waters flowing in the mining tunnels can reach exceptional Tl concentrations, up to 9000 μg/L. In September 2014, Tl contamination was also found in the tap water distributed in the same area (from 2 to 10 μg/L). On October 3, 2014, the local authorities imposed a Do Not Drink order on the population. Here we report the results of the exposure study carried out from October 2014 to October 2015, aimed at quantifying Tl levels in 150 urine and 318 hair samples from the population of Valdicastello Carducci and Pietrasanta. Thallium was quantified by inductively coupled plasma mass spectrometry (ICP-MS). Urine and hair were chosen as model matrices indicative of different time periods of exposure (short-term and long-term, respectively). Thallium values found in biological samples were correlated with Tl concentrations found in tap water in the living area of each citizen, and with his/her habits. The Tl concentration ranges found in hair and urine were 1–498 ng/g (values in unexposed subjects 0.1–6 ng/g) and 0.046–5.44 μg/L (reference value for the European population 0.006 μg/L), respectively. Results show that Tl levels in biological samples were significantly associated with residency in zones containing elevated water Tl levels. The kinetics of decay of Tl concentration in urine samples was also investigated. To the best of our knowledge, this is the first study on human contamination by Tl through water involving such a high number of samples.

Qualitative analysis of water quality deterioration and infection by Helicobacter pylori in a community with high risk of stomach cancer (Cauca, Colombia)

Acosta, C.P., et al., Salud Colectiva, 11(4):575-590, December 2015

This study looks at aspects of the environmental health of the rural population in Timbio (Cauca, Colombia) in relation to the deterioration of water quality. The information was obtained through participatory research methods exploring the management and use of water, the sources of pollution and the perception of water quality and its relation to Helicobacter pylori infection. The results are part of the qualitative analysis of a first research phase characterizing water and sanitation problems and their relation to emerging infectious diseases as well as possible solutions, which was carried out between November 2013 and August 2014. The results of this research are discussed from an ecosystemic approach to human health, recognizing the complexity of environmental conflicts related to water resources and their impacts on the health of populations. Through the methodology used, it is possible to detect and visualize the most urgent problems as well as frequent causes of contamination of water resources so as to propose solutions within a joint agenda of multiple social actors.

Fountain Autopsy to Determine Lead Occurrence in Drinking Water

McIlwain, B., et al., Journal of Environmental Engineering, November 2015

Exposure to lead in drinking water poses a risk for various adverse health effects, and significant efforts have been made to monitor and eliminate lead exposure in drinking water. This study focused on the localization of lead exposure from 71 drinking water fountains in nonresidential buildings in order to determine the source of elevated lead and understand the effects of fountains associated with lead concentration in drinking water. Drinking water fountains containing lead-lined cooling tanks and brass fittings were found to release lead concentrations in excess of 10 μg/L, and fountains with low or infrequent usage and those with cooling tanks produced the highest concentrations (in excess of 20 μg/L) of lead. One particular fountain model found at several locations throughout the institution was associated with some of the highest lead concentrations measured throughout the study. This fountain was recalled in the United States, but not in Canada. This article adds to existing research demonstrating that drinking water fountains are a potentially significant and underappreciated source of lead exposure in nonresidential buildings.

A Systematic Review of Waterborne Disease Outbreaks Associated with Small Non-Community Drinking Water Systems in Canada and the United States

Pons, W., et al., PLoS ONE, October 2015

Reports of outbreaks in Canada and the United States (U.S.) indicate that approximately 50% of all waterborne diseases occur in small non-community drinking water systems (SDWSs). Summarizing these investigations to identify the factors and conditions contributing to outbreaks is needed in order to help prevent future outbreaks. The objectives of this study were to: 1) identify published reports of waterborne disease outbreaks involving SDWSs in Canada and the U.S. since 1970; 2) summarize reported factors contributing to outbreaks, including water system characteristics and events surrounding the outbreaks; and 3) identify terminology used to describe SDWSs in outbreak reports. Three electronic databases and grey literature sources were searched for outbreak reports involving SDWSs throughout Canada and the U.S. from 1970 to 2014. Two reviewers independently screened and extracted data related to water system characteristics and outbreak events. The data were analyzed descriptively with ‘outbreak’ as the unit of analysis. From a total of 1,995 citations, we identified 50 relevant articles reporting 293 unique outbreaks. Failure of an existing water treatment system (22.7%) and lack of water treatment (20.2%) were the leading causes of waterborne outbreaks in SDWSs. A seasonal trend was observed with 51% of outbreaks occurring in summer months (p<0.001). There was large variation in terminology used to describe SDWSs, and a large number of variables were not reported, including water source and whether water treatment was used (missing in 31% and 66% of reports, respectively). More consistent reporting and descriptions of SDWSs in future outbreak reports are needed to understand the epidemiology of these outbreaks and to inform the development of targeted interventions for SDWSs. Additional monitoring of water systems that are used on a seasonal or infrequent basis would be worthwhile to inform future protection efforts.

Microbial Health Risks of Regulated Drinking Waters in the United States — A Comparative Microbial Safety Assessment of Public Water Supplies and Bottled Water

Edberg, S.C., Topics in Public Health, June 2015

The quality of drinking water in the United States (U.S.) is extensively monitored and regulated by federal, state and local agencies, yet there is increasing public concern and confusion about the safety and quality of drinking water, both from public water systems and from bottled water products. In the U.S., tap water and bottled water are regulated by two different agencies: the Environmental Protection Agency (EPA) regulates public water system water (tap water) and the Food and Drug Administration (FDA) regulates bottled water. Federal law requires that the FDA’s regulations for bottled water must be at least as protective of public health as EPA standards for tap water.

Emerging Trends in Groundwater Pollution and Quality

Kurwadkar, S., Water Environment Research, pp. 1677-1691, October 2014

Groundwater pollution due to anthropogenic activities may impact overall groundwater quality. Organic and inorganic pollutants have been routinely detected at unsafe levels in groundwater, rendering this important drinking water resource practically unusable. The vulnerability of groundwater to pollution and its subsequent impacts have been documented in various studies across the globe. Field studies as well as mathematical models have demonstrated increasing levels of pollutants in both shallow and deep aquifer systems. New emerging pollutants such as organic micro-pollutants have also been detected in some industrialized as well as developing countries. Increased vulnerability coupled with ever-growing demand for groundwater may pose a greater threat of pollution due to induced recharge and lack of environmental safeguards to protect groundwater sources. In this review paper, a comprehensive assessment of groundwater quality impacts due to human activities, such as improper management of organic and inorganic waste, and due to natural sources is documented. A detailed review of published reports and peer-reviewed journal papers across the world clearly demonstrates that groundwater quality is declining over time. A proactive approach is needed to prevent human health and ecological consequences of ingestion of contaminated groundwater.

Prenatal drinking-water exposure to tetrachloroethylene and ischemic placental disease: a retrospective cohort study | Environmental Health | Full Text

Carwile, J.L., et al., Environmental Health, September 2014

Prenatal drinking water exposure to tetrachloroethylene (PCE) has been previously related to intrauterine growth restriction and stillbirth. Pathophysiologic and epidemiologic evidence linking these outcomes to certain other pregnancy complications, including placental abruption, preeclampsia, and small-for-gestational-age (SGA) (i.e., ischemic placental diseases), suggests that PCE exposure may also be associated with these events. We examined whether prenatal exposure to PCE-contaminated drinking water was associated with overall or individual ischemic placental diseases. Using a retrospective cohort design, we compared 1,091 PCE-exposed and 1,019 unexposed pregnancies from 1,766 Cape Cod, Massachusetts women. Exposure between 1969 and 1990 was estimated using water distribution system modeling software. Data on birth weight and gestational age were obtained from birth certificates; mothers self-reported pregnancy complications. Of 2,110 eligible pregnancies, 9% (N = 196) were complicated by ≥1 ischemic placental disease. PCE exposure was not associated with overall ischemic placental disease (for PCE ≥ sample median vs. no exposure, risk ratio (RR): 0.90; 95% confidence interval (CI): 0.65, 1.24), preeclampsia (RR: 0.36; 95% CI: 0.12-1.07), or SGA (RR: 0.98; 95% CI: 0.66-1.45). However, pregnancies with PCE exposure ≥ the sample median had 2.38 times the risk of stillbirth ≥27 weeks gestation (95% CI: 1.01, 5.59), and 1.35 times the risk of placental abruption (95% CI: 0.68, 2.67) relative to unexposed pregnancies. We concluded that prenatal PCE exposure was not associated with overall ischemic placental disease, but may increase risk of stillbirth.

Sugar-Sweetened Beverage Consumption Among Adults — 18 States, 2012

Kumar, G.S., et al., CDC’s Morbidity and Mortality Weekly Report, August 2014

Reducing consumption of calories from added sugars is a recommendation of the 2010 Dietary Guidelines for Americans and an objective of Healthy People 2020. Sugar-sweetened beverages (SSB) are major sources of added sugars in the diets of U.S. residents (1). Daily SSB consumption is associated with obesity and other chronic health conditions, including diabetes and cardiovascular disease (2). U.S. adults consumed an estimated average of 151 kcal/day of SSB during 2009–2010, with regular (i.e., nondiet) soda and fruit drinks representing the leading sources of SSB energy intake (3,4). However, there is limited information on state-specific prevalence of SSB consumption. To assess regular soda and fruit drink consumption among adults in 18 states, CDC analyzed data from the 2012 Behavioral Risk Factor Surveillance System (BRFSS). Among the 18 states surveyed, 26.3% of adults consumed regular soda or fruit drinks or both ≥1 times daily. By state, the prevalence ranged from 20.4% to 41.4%. Overall, consumption of regular soda or fruit drinks was most common among persons aged 18‒34 years (24.5% for regular soda and 16.6% for fruit drinks), men (21.0% and 12.3%), non-Hispanic blacks (20.9% and 21.9%), and Hispanics (22.6% and 18.5%). Persons who want to reduce added sugars in their diets can decrease their consumption of foods high in added sugars such as candy, certain dairy and grain desserts, sweetened cereals, regular soda, fruit drinks, sweetened tea and coffee drinks, and other SSBs. States and health departments can collaborate with worksites and other community venues to increase access to water and other healthful beverages.

Contamination of Groundwater Systems in the US and Canada by Enteric Pathogens, 1990–2013: A Review and Pooled-Analysis

Hynds, P.D., Thomas, M.K., Pintar, K.D.M., PLOS ONE, Vol. 9, Issue 5, e93301, May 2014

A combined review and pooled-analysis approach was used to investigate groundwater contamination in Canada and the US from 1990 to 2013; fifty-five studies met eligibility criteria. Four study types were identified. It was found that study location affects study design, sample rate and studied pathogen category. Approximately 15% (316/2210) of samples from Canadian and US groundwater sources were positive for enteric pathogens, with no difference observed based on system type. Knowledge gaps exist, particularly in exposure assessment for attributing disease to groundwater supplies. Furthermore, there is a lack of consistency in risk factor reporting (local hydrogeology, well type, well use, etc.). The widespread use of fecal indicator organisms in reported studies does not inform the assessment of human health risks associated with groundwater supplies. This review illustrates how groundwater study design and location are critical for subsequent data interpretation and use. Knowledge gaps exist related to data on bacterial, viral and protozoan pathogen prevalence in Canadian and US groundwater systems, as well as a need for standardized approaches for reporting study design and results. Fecal indicators are examined as a surrogate for health risk assessments; caution is advised in their widespread use. Study findings may be useful during suspected waterborne outbreaks linked with a groundwater supply to identify the likely etiological agent and potential transport pathway.

Drinking water biofilm cohesiveness changes under chlorination or hydrodynamic stress

Mathieu, L., et al., Water Research, May 2014

Attempts at removal of drinking water biofilms rely on various preventive and curative strategies such as nutrient reduction in drinking water, disinfection or water flushing, which have demonstrated limited efficiency. The main reason for these failures is the cohesiveness of the biofilm driven by the physico-chemical properties of its exopolymeric matrix (EPS). Effective cleaning procedures should break up the matrix and/or change the elastic properties of bacterial biofilms. The aim of this study was to evaluate the change in the cohesive strength of two-month-old drinking water biofilms under increasing hydrodynamic shear stress τw (from ∼0.2 to ∼10 Pa) and shock chlorination (applied concentration at T0: 10 mg Cl2/L; 60 min contact time). Biofilm erosion (cell loss per unit surface area) and cohesiveness (changes in the detachment shear stress and cluster volumes measured by atomic force microscopy (AFM)) were studied. When rapidly increasing the hydrodynamic constraint, biofilm removal was found to be dependent on a dual process of erosion and coalescence of the biofilm clusters. Indeed, 56% of the biofilm cells were removed with, concomitantly, a decrease in the number of the 50–300 μm³ clusters and an increase in the number of the larger (i.e., >600 μm³) ones. Moreover, AFM evidenced the strengthening of the biofilm structure along with the doubling of the number of contact points, NC, per cluster volume unit following the hydrodynamic disturbance. This suggests that the compactness of the biofilm exopolymers increases with hydrodynamic stress. Shock chlorination removed cells (−75%) from the biofilm while reducing the volume of biofilm clusters. Oxidation stress resulted in a decrease in the cohesive strength profile of the remaining drinking water biofilms linked to a reduction in the number of contact points within the biofilm network structure, in particular for the largest biofilm cluster volumes (>200 μm³). 
Changes in the cohesive strength of drinking water biofilms subsequent to cleaning/disinfection operations call into question the effectiveness of cleaning-in-place procedures. The combined alternating use of oxidation and shear stress sequences needs to be investigated as it could be an important adjunct to improving biofilm removal/reduction procedures.

Large Outbreak of Cryptosporidium hominis Infection Transmitted through the Public Water Supply, Sweden

Widerström, M., et al., Emerging Infectious Diseases, Vol. 20, No. 4, April 2014

In November 2010, ≈27,000 (≈45%) inhabitants of Östersund, Sweden, were affected by a waterborne outbreak of cryptosporidiosis. The outbreak was characterized by a rapid onset and high attack rate, especially among young and middle-aged persons. Young age, number of infected family members, amount of water consumed daily, and gluten intolerance were identified as risk factors for acquiring cryptosporidiosis. Also, chronic intestinal disease and young age were significantly associated with prolonged diarrhea. Identification of Cryptosporidium hominis subtype IbA10G2 in human and environmental samples and consistently low numbers of oocysts in drinking water confirmed insufficient reduction of parasites by the municipal water treatment plant. The current outbreak shows that use of inadequate microbial barriers at water treatment plants can have serious consequences for public health. This risk can be minimized by optimizing control of raw water quality and employing multiple barriers that remove or inactivate all groups of pathogens.

Microbial Contamination Detection in Water Resources: Interest of Current Optical Methods, Trends and Needs in the Context of Climate Change

Jung, A.V., et al., International Journal of Environmental Research and Public Health, 11(4), 4292-4310, April 2014

Microbial pollution in aquatic environments is one of the crucial issues with regard to the sanitary state of water bodies used for drinking water supply, recreational activities and harvesting seafood, due to potential contamination by pathogenic bacteria, protozoa or viruses. To address this risk, microbial contamination monitoring is usually assessed by turbidity measurements performed at drinking water plants. Some recent studies have shown significant correlations of microbial contamination with the risk of endemic gastroenteritis. However, the relevance of turbidimetry may be limited, since the presence of colloids in water creates interferences with the nephelometric response. Thus there is a need for a more relevant, simple and fast indicator for microbial contamination detection in water, especially in the perspective of climate change with the increase in heavy rainfall events. This review focuses, on the one hand, on the sources, fate and behavior of microorganisms in water and the factors influencing pathogens’ presence, transportation and mobilization, and, on the other hand, on the existing optical methods used for monitoring microbiological risks. Finally, this paper proposes new ways of research.

Added Sugar Intake and Cardiovascular Diseases Mortality Among US Adults

Yang, Q., et al., JAMA Internal Medicine, April 2014

Epidemiologic studies have suggested that higher intake of added sugar is associated with cardiovascular disease (CVD) risk factors. Few prospective studies have examined the association of added sugar intake with CVD mortality. Our objective was to examine time trends of added sugar consumption as percentage of daily calories in the United States and investigate the association of this consumption with CVD mortality. We studied the National Health and Nutrition Examination Survey (NHANES, 1988-1994 [III], 1999-2004, and 2005-2010 [n = 31 147]) for the time trend analysis and NHANES III Linked Mortality cohort (1988-2006 [n = 11 733]), a prospective cohort of a nationally representative sample of US adults for the association study. We measured cardiovascular disease mortality. We found that among US adults, the adjusted mean percentage of daily calories from added sugar increased from 15.7% (95% CI, 15.0%-16.4%) in 1988-1994 to 16.8% (16.0%-17.7%; P = .02) in 1999-2004 and decreased to 14.9% (14.2%-15.5%; P < .001) in 2005-2010. Most adults consumed 10% or more of calories from added sugar (71.4%) and approximately 10% consumed 25% or more in 2005-2010. During a median follow-up period of 14.6 years, we documented 831 CVD deaths during 163 039 person-years. Age-, sex-, and race/ethnicity–adjusted hazard ratios (HRs) of CVD mortality across quintiles of the percentage of daily calories consumed from added sugar were 1.00 (reference), 1.09 (95% CI, 1.05-1.13), 1.23 (1.12-1.34), 1.49 (1.24-1.78), and 2.43 (1.63-3.62; P < .001), respectively. After additional adjustment for sociodemographic, behavioral, and clinical characteristics, HRs were 1.00 (reference), 1.07 (1.02-1.12), 1.18 (1.06-1.31), 1.38 (1.11-1.70), and 2.03 (1.26-3.27; P = .004), respectively. 
Adjusted HRs were 1.30 (95% CI, 1.09-1.55) and 2.75 (1.40-5.42; P = .004), respectively, comparing participants who consumed 10.0% to 24.9% or 25.0% or more calories from added sugar with those who consumed less than 10.0% of calories from added sugar. These findings were largely consistent across age group, sex, race/ethnicity (except among non-Hispanic blacks), educational attainment, physical activity, Healthy Eating Index, and body mass index. We concluded that most US adults consume more added sugar than is recommended for a healthy diet. We observed a significant relationship between added sugar consumption and increased risk for CVD mortality.

Opportunistic pathogens in roof-captured rainwater samples, determined using quantitative PCR

Ahmed, W., Water Research, April 2014

In this study, quantitative PCR (qPCR) was used for the detection of four opportunistic bacterial pathogens in water samples collected from 72 rainwater tanks in Southeast Queensland, Australia. Tank water samples were also tested for fecal indicator bacteria (Escherichia coli and Enterococcus spp.) using culture-based methods. Among the 72 tank water samples tested, 74% and 94% samples contained E. coli and Enterococcus spp., respectively, and the numbers of E. coli and Enterococcus spp. in tank water samples ranged from 0.3 to 3.7 log₁₀ colony forming units (CFU) per 100 mL of water. In all, 29%, 15%, 13%, and 6% of tank water samples contained Aeromonas hydrophila, Staphylococcus aureus, Pseudomonas aeruginosa and Legionella pneumophila, respectively. The genomic units (GU) of opportunistic pathogens in tank water samples ranged from 1.5 to 4.6 log₁₀ GU per 100 mL of water. A significant correlation was found between E. coli and Enterococcus spp. numbers in pooled tank water samples data (Spearman’s rs = 0.50; P < 0.001). In contrast, fecal indicator bacteria numbers did not correlate with the presence/absence of opportunistic pathogens tested in this study. Based on the results of this study, it would be prudent to undertake a Quantitative Microbial Risk Assessment (QMRA) analysis of opportunistic pathogens to determine associated health risks for potable and nonpotable uses of tank water.

 

Epidemiology and estimated costs of a large waterborne outbreak of norovirus infection in Sweden

Larsson, C., et al., Epidemiology and Infection, 142(3):592-600, March 2014

A large outbreak of norovirus (NoV) gastroenteritis caused by contaminated municipal drinking water occurred in Lilla Edet, Sweden, in 2008. Epidemiological investigations performed using a questionnaire survey showed an association between consumption of municipal drinking water and illness (odds ratio 4.73, 95% confidence interval 3.53-6.32), and a strong correlation between the risk of being sick and the number of glasses of municipal water consumed. Diverse NoV strains were detected in stool samples from patients, with NoV genotype I strains predominating. Although NoVs were not detected in water samples, coliphages were identified as a marker of viral contamination. About 2,400 (18.5%) of the 13,000 inhabitants of Lilla Edet became ill. Costs associated with the outbreak were collected via a questionnaire survey given to organizations and municipalities involved in or affected by the outbreak. Total costs, including sick leave, were estimated to be ∼8,700,000 Swedish kronor (∼€0.87 million).
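The odds ratio and confidence interval reported above come from a standard 2×2 exposure-by-illness analysis. A minimal sketch of that arithmetic, using hypothetical counts (not the Lilla Edet data):

```python
# Illustrative only: odds ratio with a 95% Wald confidence interval,
# the form of estimate reported for municipal-water consumption and
# illness. The 2x2 counts below are invented.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b = ill/not ill among exposed; c/d = ill/not ill among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(180, 120, 60, 190)  # hypothetical counts
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# prints: OR = 4.75 (95% CI 3.28-6.88)
```

An interval that excludes 1.0, as here, indicates the exposed group's odds of illness are significantly elevated.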

Methyl Tertiary Butyl Ether (MTBE) and Other Volatile Organic Compounds (VOCs) in Public Water Systems, Private Wells, and Ambient Groundwater Wells in New Jersey Compared to Regulatory and Human-Health Benchmarks

Williams, P.R.D., Environmental Forensics, Volume 15, Issue 1, February 2014

Potential threats to drinking water and water quality continue to be a major concern in many regions of the United States. New Jersey, in particular, has been at the forefront of assessing and managing potential contamination of its drinking water supplies from hazardous substances. The purpose of the current analysis is to provide an up-to-date evaluation of the occurrence and detected concentrations of methyl tertiary butyl ether (MTBE) and several other volatile organic compounds (VOCs) in public water systems, private wells, and ambient groundwater wells in New Jersey based on the best available data, and to put these results into context with federal and state regulatory and human-health benchmarks. Analyses are based on the following three databases that contain water quality monitoring data for New Jersey: Safe Drinking Water Information System (SDWIS), Private Well Testing Act (PWTA), and National Water Information System (NWIS). For public water systems served by groundwater in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 30 (2%), 21 (1.4%), and 5 (0.3%) of sampled systems from 1997 to 2011, respectively. For private wells in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 385 (0.5%), 183 (0.2%), and 46 (0.05%) of sampled wells from 2001 to 2011, respectively. For ambient groundwater wells in New Jersey, MTBE was detected at a concentration ≥10 μg/L, ≥20 μg/L, and ≥70 μg/L at least once in 14 (2.1%), 9 (1.3%), and 4 (0.6%) of sampled wells from 1993 to 2012, respectively. Average detected concentrations of MTBE, as well as detected concentrations at upper-end percentiles, were less than corresponding benchmarks for all three datasets.
The available data show that MTBE is rarely detected in various source waters in New Jersey at a concentration that exceeds the State’s health-based drinking water standard or other published benchmarks, and there is no evidence of an increasing trend in the detection frequency of MTBE. Other VOCs, such as tetrachloroethylene (PCE), trichloroethylene (TCE), and benzene, are detected more often above corresponding regulatory or human-health benchmarks due to their higher detected concentrations in water and/or greater toxicity values. The current analysis provides useful data for evaluating the nature and extent of historical and current contamination of water supplies in New Jersey and potential opportunities for public exposures and health risks due to MTBE and other VOCs on a statewide basis. Additional forensic or forecasting analyses are required to identify the sources or timing of releases of individual contaminants at specific locations or to predict potential future water contamination in New Jersey.

Widespread Molecular Detection of Legionella pneumophila Serogroup 1 in Cold Water Taps across the United States

Donohue, M.J., Environmental Science and Technology, 48 (6), pp 3145–3152, February 2014

In the United States, 6,868 cases of legionellosis were reported to the Centers for Disease Control and Prevention in 2009–2010. Of these reports, it is estimated that 84% were caused by the microorganism Legionella pneumophila Serogroup (Sg) 1. Legionella spp. have been isolated and recovered from a variety of natural freshwater environments. Human exposure to L. pneumophila Sg1 may occur through aerosolization and subsequent inhalation of household and facility water. In this study, two primer/probe sets (one able to detect L. pneumophila and the other L. pneumophila Sg1) were determined to be highly sensitive and selective for their respective targets. In all, 272 water samples, collected in 2009 and 2010 from 68 public and private water taps across the United States, were analyzed using the two qPCR assays to evaluate the incidence of L. pneumophila Sg1. Nearly half of the taps showed the presence of L. pneumophila Sg1 in one sampling event, and 16% of taps were positive in more than one sampling event. This study is the first United States survey to document the occurrence and colonization of L. pneumophila Sg1 in cold water delivered from point-of-use taps.

Perspectives on drinking water monitoring for small scale water systems

Roig, B., Baures, E., Thomas, O., Water Science & Technology: Water Supply, Vol. 14 Issue 1, p1, January 2014

Drinking water (DW) is increasingly subject to environmental and human threats that alter the quality of the resource and, potentially, of the distributed water. These threats can be both biological and chemical in nature, and are often cumulative. The expansion of the technical framework for water quality monitoring, following the evolution of water quality standards, generally guarantees regulatory compliance but is not sufficient for assessing the performance of small scale water systems. The existing monitoring is not well suited to ensuring good quality of distributed water, especially in the event of a sudden change in quality. This article proposes alternative solutions, drawn from an examination of monitoring practices, in a bid to limit the risk of deterioration of DW quality.

Drinking Water Microbial Myths

Martin, J.A., et al., Critical Reviews in Microbiology, November 2013

Accounts of drinking water-borne disease outbreaks have always captured the interest of the public, elected and health officials, and the media. During the twentieth century, the drinking water community and public health organizations have endeavored to craft regulations and guidelines on treatment and management practices that reduce risks from drinking water, specifically human pathogens. During this period there also evolved misunderstandings as to the potential health risk associated with microorganisms that may be present in drinking water. These misunderstandings, or “myths,” have led to confusion among the many stakeholders. The purpose of this article is to provide a scientifically and clinically based discussion of these “myths” and recommendations for better ensuring the microbial safety of drinking water and valid public health decisions.

Assessing the impact of chlorinated-solvent sites on metropolitan groundwater resources

Brusseau, M.L. and Narter, M., Ground Water, November 2013

Chlorinated-solvent compounds are among the most common groundwater contaminants in the United States. A majority of the many sites contaminated by chlorinated-solvent compounds are located in metropolitan areas, and most such areas have one or more chlorinated-solvent contaminated sites. Thus, contamination of groundwater by chlorinated-solvent compounds may pose a potential risk to the sustainability of potable water supplies for many metropolitan areas. The impact of chlorinated-solvent sites on metropolitan water resources was assessed for Tucson, Arizona, by comparing the aggregate volume of extracted groundwater for all pump-and-treat systems associated with contaminated sites in the region to the total regional groundwater withdrawal. The analysis revealed that the aggregate volume of groundwater withdrawn for the pump-and-treat systems operating in Tucson, all of which are located at chlorinated-solvent contaminated sites, was 20% of the total groundwater withdrawal in the city for the study period. The treated groundwater was used primarily for direct delivery to local water supply systems or for reinjection as part of the pump-and-treat system. The volume of the treated groundwater used for potable water represented approximately 13% of the total potable water supply sourced from groundwater, and approximately 6% of the total potable water supply. This case study illustrates the significant impact chlorinated-solvent contaminated sites can have on groundwater resources and regional potable water supplies.

Radon-contaminated drinking water from private wells: an environmental health assessment examining a rural Colorado mountain community’s exposure

Cappello, M.A., et al., Journal of Environmental Health, November 2013

In the study discussed in this article, 27 private drinking water wells located in a rural Colorado mountain community were sampled for radon contamination and compared against (a) the U.S. Environmental Protection Agency’s (U.S. EPA’s) proposed maximum contaminant level (MCL), (b) the U.S. EPA’s proposed alternative maximum contaminant level (AMCL), and (c) the average radon level measured in the local municipal drinking water system. The data from the authors’ study found that 100% of the wells within the study population had radon levels in excess of the U.S. EPA MCL, 37% were in excess of the U.S. EPA AMCL, and 100% of wells had radon levels greater than that found in the local municipal drinking water system. Radon contamination in one well was found to be 715 times greater than the U.S. EPA MCL, 54 times greater than the U.S. EPA AMCL, and 36,983 times greater than that found in the local municipal drinking water system. According to the research data and the reviewed literature, the results indicate that this population has a unique and elevated contamination profile and suggest that radon-contaminated drinking water from private wells can present a significant public health concern.

Water and beverage consumption among adults in the United States: cross-sectional study using data from NHANES 2005–2010

Drewnowski, A., et al., BMC Public Health, November 2013

Few studies have examined plain water consumption among US adults. This study evaluated the consumption of plain water (tap and bottled) and total water among US adults by age group (20-50y, 51-70y, and ≥71y), gender, income-to-poverty ratio, and race/ethnicity. Data from up to two non-consecutive 24-hour recalls from the 2005–2006, 2007–2008 and 2009–2010 National Health and Nutrition Examination Survey (NHANES) were used to evaluate usual intake of water and water as a beverage among 15,702 US adults. The contribution of different beverage types (e.g., water as a beverage [tap or bottled], milk [including flavored], 100% fruit juice, soda/soft drinks [regular and diet], fruit drinks, sports/energy drinks, coffee, tea, and alcoholic beverages) to total water and energy intakes was examined. Total water intakes from plain water, beverages, and food were compared to the Adequate Intake (AI) values from the US Dietary Reference Intakes (DRI). Total water volume per 1,000 kcal was also examined. Water and other beverages contributed 75-84% of dietary water, with 17-25% provided by water in foods, depending on age. Plain water, from tap or bottled sources, contributed 30-37% of total dietary water. Overall, 56% of drinking water volume was from tap water while bottled water provided 44%. Older adults (≥71y) consumed much less bottled water than younger adults. Non-Hispanic whites consumed the most tap water, whereas Mexican-Americans consumed the most bottled water. Plain water consumption (bottled and tap) tended to be associated with higher incomes. On average, younger adults exceeded or came close to satisfying the DRIs for water. Older men and women failed to meet the Institute of Medicine (IOM) AI values, with shortfalls in daily water intake of 1,218 mL and 603 mL, respectively. Eighty-three percent of women and 95% of men ≥71y failed to meet the IOM AI values for water. However, average water volume per 1,000 kcal was 1.2-1.4 L/1,000 kcal for most population sub-groups, higher than the suggested level of 1.0 L/1,000 kcal. It was concluded that water intakes below IOM-recommended levels may be a cause for concern, especially for older adults.

Microbial Health Risks of Regulated Drinking Water in the United States

Edberg, S.C., DWRF, September 2013

Drinking water regulations are designed to protect the public health. In the United States, the Environmental Protection Agency (EPA) is tasked with developing and maintaining drinking water regulations for the 276,607,387 people served by the country’s 54,293 community water systems. The Food and Drug Administration (FDA) regulates bottled water as a food product. By federal law, the FDA’s regulations for bottled water must be at least as protective of public health as the EPA’s regulations for public water system drinking water. Despite many similarities in EPA and FDA regulations, consumer perception regarding the safety of drinking waters varies widely. This paper examines and compares the microbial health risks of tap water and bottled water, specifically examining differences in quality monitoring, regulatory standards violations, advisories, and distribution system conditions. It also includes comparison data on the number of waterborne illness outbreaks caused by both tap and bottled water. Based on a review of existing research, it is clear that, as a consequence of differences in regulations, distribution systems, operating (manufacturing) practices, and microbial standards of quality, public drinking water supplies present a substantially higher risk of illness from waterborne organisms than do bottled waters.

The mineral content of tap water in United States households

Patterson, K.Y., et al., Journal of Food Composition and Analysis, August 2013

The composition of tap water contributes to dietary intake of minerals. The Nutrient Data Laboratory (NDL) of the United States Department of Agriculture (USDA) conducted a study of the mineral content of residential tap water to generate current data for the USDA National Nutrient Database. Sodium, potassium, calcium, magnesium, iron, copper, manganese, phosphorus, and zinc content of drinking water were determined in a nationally representative sampling. The statistically designed sampling method identified 144 locations for water collection in winter and spring from home taps. Assuming a daily consumption of 1 L of tap water, only four minerals (Cu, Ca, Mg, and Na), on average, provided more than 1% of the US dietary reference intake. Significant decreases in calcium were observed with chemical water softeners, and differences between seasonal collections were observed for Mg and Ca. The variance of sodium was significantly different among regions (p < 0.05), but no differences were observed as a result of collection time, water source, or treatment. Based on the weighted mixed model results, there were no significant differences in overall mineral content between municipal and well water. These results, which constitute a nationally representative dataset of mineral values for drinking water available from home taps, provide valuable additional information for assessment of dietary mineral intake.

Quantitative analysis of microbial contamination in private drinking water supply systems

Allevi, R.P., et al., Journal of Water and Health, June 2013

Over one million households rely on private water supplies (e.g. well, spring, cistern) in the Commonwealth of Virginia, USA. The present study tested 538 private wells and springs in 20 Virginia counties for total coliforms (TCs) and Escherichia coli along with a suite of chemical contaminants. A logistic regression analysis was used to investigate potential correlations between TC contamination and chemical parameters (e.g. NO3(-), turbidity), as well as homeowner-provided survey data describing system characteristics and perceived water quality. Of the 538 samples collected, 41% (n = 221) were positive for TCs and 10% (n = 53) for E. coli. Chemical parameters were not statistically predictive of microbial contamination. Well depth, water treatment, and farm location proximate to the water supply were factors in a regression model that predicted presence/absence of TCs with 74% accuracy. Microbial and chemical source tracking techniques (Bacteroides gene Bac32F and HF183 detection via polymerase chain reaction and optical brightener detection via fluorometry) identified four samples as likely contaminated with human wastewater.

Water and beverage consumption among children age 4-13y in the United States: analyses of 2005–2010 NHANES data

Drewnowski, A., Rehm, C., and Constant, F., Nutrition Journal, June 2013

Few studies have examined water consumption patterns among US children. Additionally, recent data on total water consumption as it relates to the Dietary Reference Intakes (DRI) are lacking. This study evaluated the consumption of plain water (tap and bottled) and other beverages among US children by age group, gender, income-to-poverty ratio, and race/ethnicity. Comparisons were made to DRI values for water consumption from all sources. Data from two non-consecutive 24-hour recalls from 3 cycles of NHANES (2005–2006, 2007–2008 and 2009–2010) were used to assess water and beverage consumption among 4,766 children age 4-13y. Beverages were classified into 9 groups: water (tap and bottled), plain and flavored milk, 100% fruit juice, soda/soft drinks (regular and diet), fruit drinks, sports drinks, coffee, tea, and energy drinks. Total water intakes from plain water, beverages, and food were compared to DRIs for the US. Total water volume per 1,000 kcal was also examined. It was found that water and other beverages contributed 70-75% of dietary water, with 25-30% provided by moisture in foods, depending on age. Plain water, tap and bottled, contributed 25-30% of total dietary water. In general, tap water represented 60% of drinking water volume whereas bottled water represented 40%. Non-Hispanic white children consumed the most tap water, whereas Mexican-American children consumed the most bottled water. Plain water consumption (bottled and tap) tended to be associated with higher incomes. No group of US children came close to satisfying the DRIs for water. At least 75% of children 4-8y, 87% of girls 9-13y, and 85% of boys 9-13y did not meet DRIs for total water intake. Water volume per 1,000 kcal, another criterion of adequate hydration, was 0.85-0.95 L/1,000 kcal, short of the desirable levels of 1.0-1.5 L/1,000 kcal. It was concluded that water intakes at below-recommended levels may be a cause for concern. 
Data on water and beverage intake for the population and by socio-demographic group provide useful information for targeting interventions to increase water intake among children.

Resolved: there is sufficient scientific evidence that decreasing sugar-sweetened beverage consumption will reduce the prevalence of obesity and obesity-related diseases

Hu, F.B., Obesity Reviews, June 2013

Sugar-sweetened beverages (SSBs) are the single largest source of added sugar and the top source of energy intake in the U.S. diet. In this review, we evaluate whether there is sufficient scientific evidence that decreasing SSB consumption will reduce the prevalence of obesity and its related diseases. Because prospective cohort studies address dietary determinants of long-term weight gain and chronic diseases, whereas randomized clinical trials (RCTs) typically evaluate short-term effects of specific interventions on weight change, both types of evidence are critical in evaluating causality. Findings from well-powered prospective cohorts have consistently shown a significant association, established temporality and demonstrated a direct dose–response relationship between SSB consumption and long-term weight gain and risk of type 2 diabetes (T2D). A recently published meta-analysis of RCTs commissioned by the World Health Organization found that decreased intake of added sugars significantly reduced body weight (0.80 kg, 95% confidence interval [CI] 0.39–1.21; P < 0.001), whereas increased sugar intake led to a comparable weight increase (0.75 kg, 0.30–1.19; P = 0.001). A parallel meta-analysis of cohort studies also found that higher intake of SSBs among children was associated with 55% (95% CI 32–82%) higher risk of being overweight or obese compared with those with lower intake. Another meta-analysis of eight prospective cohort studies found that one to two servings per day of SSB intake was associated with a 26% (95% CI 12–41%) greater risk of developing T2D compared with occasional intake (less than one serving per month). Recently, two large RCTs with a high degree of compliance provided convincing data that reducing consumption of SSBs significantly decreases weight gain and adiposity in children and adolescents. Taken together, the evidence that decreasing SSBs will decrease the risk of obesity and related diseases such as T2D is compelling. 
Several additional issues warrant further discussion. First, prevention of long-term weight gain through dietary changes such as limiting consumption of SSBs is more important than short-term weight loss in reducing the prevalence of obesity in the population. This is because once an individual becomes obese, it is difficult to lose weight and keep it off. Second, we should consider the totality of evidence rather than selective pieces of evidence (e.g. from short-term RCTs only). Finally, while recognizing that the evidence that SSBs harm health is strong, we should avoid the trap of waiting for absolute proof before allowing public health action to be taken.

Impact of fluid intake in the prevention of urinary system diseases: a brief review

Lotan, et al., Current Opinion in Nephrology and Hypertension, Vol. 22, sup. 1, May 2013

We are often told that we should be drinking more water, but the rationale for this remains unclear, and no recommendations for a healthy fluid intake supported by rigorous scientific evidence currently exist. Crucially, the same lack of evidence precludes the claim that a high fluid intake has no clinical benefit. The aim of this study is to describe the mechanisms by which chronic low fluid intake may play a crucial role in the pathologies of four key diseases of the urinary system: urolithiasis, urinary tract infection, chronic kidney disease, and bladder cancer. Although primary and secondary intervention studies evaluating the impact of fluid intake are lacking, published data from observational studies suggest that chronic low fluid intake may be an important factor in the pathogenesis of these diseases.

Relation between urinary hydration biomarkers and total fluid intake in healthy adults

Perrier, E., et al., European Journal of Clinical Nutrition, May 2013

In sedentary adults, hydration is mostly influenced by total fluid intake and not by sweat losses; moreover, low daily fluid intake is associated with adverse health outcomes. This study aimed to model the relation between total fluid intake and urinary hydration biomarkers. During 4 consecutive weekdays, 82 adults (age, 31.6±4.3 years; body mass index, 23.2±2.7 kg/m2; 52% female) recorded food and fluid consumed, collected one first morning urine (FMU) void and three 24-h (24hU) samples. The strength of linear association between urinary hydration biomarkers and fluid intake volume was evaluated using simple linear regression and Pearson’s correlation. Multivariate partial least squares (PLS) modeled the association between fluid intake and 24hU hydration biomarkers. Strong associations (|r| ≥ 0.6; P < 0.001) were found between total fluid intake volume and 24hU osmolality, color, specific gravity (USG), volume, and solute concentrations. Many 24hU biomarkers were collinear (osmolality versus color: r = 0.49–0.76; USG versus color: r = 0.46–0.78; osmolality versus USG: r = 0.86–0.97; P < 0.001). Measures in FMU were not strongly correlated with intake. Multivariate PLS and simple linear regression using urine volume explained >50% of the variance in fluid intake volume (r2 = 0.59 and 0.52, respectively); however, the error in both models was high and the limits of agreement very large. It was concluded that hydration biomarkers in 24hU are strongly correlated with daily total fluid intake volume in sedentary adults in free-living conditions; however, the margin of error in the present models limits the applicability of estimating fluid intake from urinary biomarkers.
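The r² figures above come from ordinary simple linear regression. A minimal sketch of that calculation, on invented urine-volume and fluid-intake values (not the study's data):

```python
# Illustrative only: simple linear regression and its r^2, the kind of
# fit used to relate 24-h urine volume to total fluid intake.
# All values below are hypothetical (litres per day).

def linreg(x, y):
    """Least-squares slope, intercept, and r^2 for y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r2 = sxy * sxy / (sxx * syy)  # fraction of variance in y explained by x
    return slope, intercept, r2

urine_vol = [0.8, 1.1, 1.5, 1.9, 2.3]  # hypothetical 24-h urine volume, L
fluid_in = [1.5, 1.9, 2.4, 2.6, 3.2]   # hypothetical total fluid intake, L
slope, intercept, r2 = linreg(urine_vol, fluid_in)
```

An r² of 0.52, as reported for urine volume, means roughly half the variance in intake is explained, which is why the authors caution that the prediction error for any individual remains large.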

Strontium Concentrations in Corrosion Products from Residential Drinking Water Distribution Systems

Gerke, et al., Environmental Science and Technology, April 22, 2013.

The United States Environmental Protection Agency (US EPA) will require some U.S. drinking water distribution systems (DWDS) to monitor nonradioactive strontium (Sr2+) in drinking water in 2013. Iron corrosion products from four DWDS were examined to assess the potential for Sr2+ binding and release. Average Sr2+ concentrations in the outermost layer of the corrosion products ranged from 3 to 54 mg kg–1, and the Sr2+ drinking water concentrations were all ≤0.3 mg L–1. Micro-X-ray absorption near edge structure spectroscopy and linear combination fitting determined that Sr2+ was principally associated with CaCO3. Sr2+ was also detected as a surface complex associated with α-FeOOH. Iron particulates deposited on a filter inside a home had an average Sr2+ concentration of 40.3 mg kg–1, and the Sr2+ concentration in the associated drinking water at the tap was 210 μg L–1. The data suggest that elevated Sr2+ concentrations may be associated with iron corrosion products that, if disturbed, could increase Sr2+ concentrations above the 0.3 μg L–1 US EPA reporting threshold. Disassociation of very small particulates could result in drinking water Sr2+ concentrations that exceed the US EPA health reference limit (4.20 mg kg–1 body weight).

Association between Water Intake, CKD, and Cardiovascular Disease: A Cross-Sectional Analysis of NHANES Data

Sontrop, et al., American Journal of Nephrology, 37:434-442, April 2013

Recent evidence from animal and human studies suggests that a higher water intake may have a protective effect on kidney function and cardiovascular disease. We examined the association between water intake, chronic kidney disease (CKD), and cardiovascular disease in a cross-sectional analysis of the 2005-2006 National Health and Nutrition Examination Survey population. Total water intake from food and beverages was categorized as low (less than 2 litres per day), moderate (2 to 4.3 litres per day), and high (greater than 4.3 litres per day). We examined the associations between low total water intake and chronic kidney disease and self-reported cardiovascular disease. Of the 3,427 adults (mean age 46; mean eGFR 95 ml/min/1.73 m2), 13% had chronic kidney disease and 18% had cardiovascular disease. Chronic kidney disease prevalence was higher among those with the lowest (less than 2 litres per day) versus the highest (greater than 4.3 litres per day) total water intake (odds ratio 2.52, 95% confidence interval 0.91-6.96). Once stratified by intake of plain water and other beverages, CKD was associated with low intake of plain water (odds ratio 2.36, 95% confidence interval 1.10-5.06) but not other beverages. There was no association between low water intake and cardiovascular disease.

Evaluating violations of drinking water regulations

Rubin, S.J., Journal, American Water Works Association, March 2013

US Environmental Protection Agency data were analyzed for violations by community water systems (CWSs). Several characteristics were evaluated, including size, source water, and violation type. The data show that: (1) 55% of CWSs violated at least one regulation under the Safe Drinking Water Act, and these violations involved systems serving more than 95 million people; (2) the presence of violations was no different for groundwater and surface water systems; (3) fewer than 20% of CWSs with violations exceeded an allowable level of a contaminant in drinking water; (4) smaller water systems are no more likely than larger systems, except very large systems, to violate health-related requirements; and (5) smaller CWSs appear more likely than larger systems to violate monitoring, reporting, and notification requirements. An evaluation was also conducted of four contaminants that had health-related violations by more than 1% of CWSs: total coliform, stage 1 disinfection by-products, arsenic, and lead and copper.

Lead (Pb) quantification in potable water samples: implications for regulatory compliance and assessment of human exposure

Triantafyllidou, S., et al., Environmental Monitoring and Assessment, February 2013

Assessing the health risk from lead (Pb) in potable water requires accurate quantification of the Pb concentration. Under worst-case scenarios of highly contaminated water samples, representative of public health concerns, up to 71-98% of the total Pb was not quantified if water samples were not mixed thoroughly after standard preservation (i.e., addition of 0.15% (v/v) HNO3). Thorough mixing after standard preservation improved recovery in all samples, but 35-81% of the total Pb was still unquantified in some samples. Transfer of samples from one bottle to another also created high errors (40-100% of the total Pb was unquantified in transferred samples). Although the United States Environmental Protection Agency’s standard protocol avoids most of these errors, certain methods considered EPA-equivalent allow these errors for regulatory compliance sampling. Moreover, routine monitoring for assessment of human Pb exposure in the USA has no standardized protocols for water sample handling and pre-treatment. Overall, while there is no reason to believe that sample handling and pre-treatment dramatically skew regulatory compliance with the US Pb action level, slight variations from one approved protocol to another may cause Pb-in-water health risks to be significantly underestimated, especially for unusual situations of “worst case” individual exposure to highly contaminated water.

Blood pressure hyperreactivity: an early cardiovascular risk in normotensive men exposed to low-to-moderate inorganic arsenic in drinking water

Kunrath, J., et al., Journal of Hypertension, February 2013

Essential hypertension is associated with chronic exposure to high levels of inorganic arsenic in drinking water. However, early signs of risk for developing hypertension remain unclear in people exposed to chronic low-to-moderate inorganic arsenic. We evaluated cardiovascular stress reactivity and recovery in healthy, normotensive, middle-aged men living in an arsenic-endemic region of Romania. Unexposed (n = 16) and exposed (n = 19) participants were sampled from communities based on WHO limits for inorganic arsenic in drinking water. Blood pressure hyperreactivity was defined as stress-induced increases in SBP (>20 mmHg) and DBP (>15 mmHg). We found that drinking water inorganic arsenic averaged 40.2 ± 30.4 and 1.0 ± 0.2 μg/l for the exposed and unexposed groups, respectively (P < 0.001). Compared to the unexposed group, the exposed group expressed a greater probability of blood pressure hyperreactivity to both anticipatory stress (47.4 vs. 12.5%; P = 0.035) and cold stress (73.7 vs. 37.5%; P = 0.044). Moreover, the exposed group exhibited attenuated blood pressure recovery from stress and a greater probability of persistent hypertensive responses (47.4 vs. 12.5%; P = 0.035). We concluded that inorganic arsenic exposure increased stress-induced blood pressure hyperreactivity and worsened blood pressure recovery, including persistent hypertensive responses, in otherwise healthy, clinically normotensive men. Drinking water containing even low-to-moderate inorganic arsenic may act as a sympathetic nervous system trigger for hypertension risk.

Water consumption, not expectancies about water consumption, affects cognitive performance in adults

Edmonds, C.J., et al., Appetite, January 2013

Research has shown that water supplementation positively affects cognitive performance in children and adults. The present study considered whether this could be a result of expectancies that individuals have about the effects of water on cognition. Forty-seven participants were recruited and told the study was examining the effects of repeated testing on cognitive performance. They were assigned either to a condition in which positive expectancies about the effects of drinking water were induced, or a control condition in which no expectancies were induced. Within these groups, approximately half were given a drink of water, while the remainder were not. Performance on a thirst scale, letter cancellation, digit span forwards and backwards, and a simple reaction time task was assessed at baseline (before the drink) and 20 min and 40 min after water consumption. Effects of water, but not expectancy, were found on subjective thirst ratings and letter cancellation task performance, but not on digit span or reaction time. This suggests that water consumption effects on letter cancellation are due to the physiological effects of water, rather than expectancies about the effects of drinking water.

Changes in water and beverage intake and long-term weight changes: results from three prospective cohort studies

Pan, A., et al., International Journal of Obesity, January 2013

We aimed to examine the long-term relationship between changes in water and beverage intake and weight change. Our subjects were prospective cohort studies of 50 013 women aged 40–64 years in the Nurses’ Health Study (NHS, 1986–2006), 52 987 women aged 27–44 years in the NHS II (1991–2007) and 21 988 men aged 40–64 years in the Health Professionals Follow-up Study (1986–2006) without obesity and chronic diseases at baseline. We assessed the association of weight change within each 4-year interval, with changes in beverage intakes and other lifestyle behaviors during the same period. Multivariate linear regression with robust variance and accounting for within-person repeated measures was used to evaluate the association. Results across the three cohorts were pooled by an inverse-variance-weighted meta-analysis. We found participants gained an average of 1.45 kg (5th to 95th percentile: −1.87 to 5.46) within each 4-year period. After controlling for age, baseline body mass index and changes in other lifestyle behaviors (diet, smoking habits, exercise, alcohol, sleep duration, TV watching), each 1 cup per day increment of water intake was inversely associated with weight gain within each 4-year period (−0.13 kg; 95% confidence interval (CI): −0.17 to −0.08). The associations for other beverages were: sugar-sweetened beverages (SSBs) (0.36 kg; 95% CI: 0.24–0.48), fruit juice (0.22 kg; 95% CI: 0.15–0.28), coffee (−0.14 kg; 95% CI: −0.19 to −0.09), tea (−0.03 kg; 95% CI: −0.05 to −0.01), diet beverages (−0.10 kg; 95% CI: −0.14 to −0.06), low-fat milk (0.02 kg; 95% CI: −0.04 to 0.09) and whole milk (0.02 kg; 95% CI: −0.06 to 0.10). We estimated that replacement of 1 serving per day of SSBs by 1 cup per day of water was associated with 0.49 kg (95% CI: 0.32–0.65) less weight gain over each 4-year period, and the replacement estimate of fruit juices by water was 0.35 kg (95% CI: 0.23–0.46).
Substitution of SSBs or fruit juices by other beverages (coffee, tea, diet beverages, low-fat and whole milk) were all significantly and inversely associated with weight gain. Our results suggest that increasing water intake in place of SSBs or fruit juices is associated with lower long-term weight gain.
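The pooling step named in the abstract, an inverse-variance-weighted meta-analysis across the three cohorts, amounts to weighting each cohort's estimate by the reciprocal of its squared standard error. A minimal sketch (the per-cohort numbers are illustrative placeholders, not the study's actual values):

```python
# Inverse-variance-weighted meta-analysis: each cohort's estimate is
# weighted by 1 / SE^2, so more precise cohorts contribute more.
# Numbers below are illustrative, not the study's per-cohort results.

def pool_inverse_variance(estimates, std_errors):
    """Return the pooled estimate and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Hypothetical per-cohort associations (kg weight change per +1 cup/day water)
estimates = [-0.12, -0.15, -0.10]   # NHS, NHS II, HPFS (illustrative only)
std_errors = [0.03, 0.04, 0.05]

pooled, se = pool_inverse_variance(estimates, std_errors)
ci = (pooled - 1.96 * se, pooled + 1.96 * se)  # 95% confidence interval
print(round(pooled, 3), round(se, 3))
```

The pooled value lands closest to the most precisely estimated cohort, which is exactly the behavior the weighting is designed to produce.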

Influence of progressive fluid restriction on mood and physiological markers of dehydration in women

Pross, N., et al., British Journal of Nutrition, 109, 313-321, January 2013

The present study evaluated, using a well-controlled dehydration protocol, the effects of 24 h fluid deprivation (FD) on selected mood and physiological parameters. In the present cross-over study, twenty healthy women (age 25 (SE 0.78) years) participated in two randomised sessions: FD-induced dehydration v. a fully hydrated control condition. In the FD period, the last water intake was between 18.00 and 19.00 hours and no beverages were allowed until 18.00 hours on the next day (23–24 h). Water intake was only permitted at fixed periods during the control condition. Physiological parameters in the urine, blood and saliva (osmolality) as well as mood and sensations (headache and thirst) were compared across the experimental conditions. Safety was monitored throughout the study. The FD protocol was effective as indicated by a significant reduction in urine output. No clinical abnormalities of biological parameters or vital signs were observed, although heart rate was increased by FD. Increased urine specific gravity, darker urine colour and increased thirst were early markers of dehydration. Interestingly, dehydration also induced a significant increase in saliva osmolality at the end of the 24 h FD period but plasma osmolality remained unchanged. The significant effects of FD on mood included decreased alertness and increased sleepiness, fatigue and confusion. The most consistent effects of mild dehydration on mood are on sleep/wake parameters. Urine specific gravity appears to be the best physiological measure of hydration status in subjects with a normal level of activity; saliva osmolality is another reliable and noninvasive method for assessing hydration status.

The need for congressional action to finance arsenic reductions in drinking water

Levine, R.L., Journal of Environmental Health, November 2012

Many public water systems in the U.S. are unsafe because the communities cannot afford to comply with the current 10 parts per billion (ppb) federal arsenic standard for drinking water. Communities unable to afford improvements remain vulnerable to adverse health effects associated with higher levels of arsenic exposure. Scientific and bipartisan political consensus exists that the arsenic standard should not be less stringent than 10 ppb, and new data suggest additional adverse health effects related to arsenic exposure through drinking water. Congress has failed to reauthorize the Drinking Water State Revolving Fund program to provide reliable funding to promote compliance and reduce the risk of adverse health effects. Congress’s recent ad hoc appropriations do not allow long-term planning and ongoing monitoring and maintenance. Investing in water infrastructure will lower health care costs and create American jobs. Delaying necessary upgrades will only increase the costs of improvements over time.

Direct healthcare costs of selected diseases primarily or partially transmitted by water

Collier, S.A., et al., Epidemiology and Infection, November 2012

Despite US sanitation advancements, millions of waterborne disease cases occur annually, although the precise burden of disease is not well quantified. Estimating the direct healthcare cost of specific infections would be useful in prioritizing waterborne disease prevention activities. Hospitalization and outpatient visit costs per case and total US hospitalization costs for ten waterborne diseases were calculated using large healthcare claims and hospital discharge databases. The five primarily waterborne diseases in this analysis (giardiasis, cryptosporidiosis, Legionnaires’ disease, otitis externa, and non-tuberculous mycobacterial infection) were responsible for over 40 000 hospitalizations at a cost of $970 million per year, including at least $430 million in hospitalization costs for Medicaid and Medicare patients. An additional 50 000 hospitalizations for campylobacteriosis, salmonellosis, shigellosis, haemolytic uraemic syndrome, and toxoplasmosis cost $860 million annually ($390 million in payments for Medicaid and Medicare patients), a portion of which can be assumed to be due to waterborne transmission.

Clues to the Future of the Park Doctrine

Burroughs, A.D., and Rin, D., Food and Drug Law Institute, November/December 2012

This article examines three recent cases brought under the controversial Park doctrine in search of clues to the doctrine’s future. The responsible corporate officer (RCO) doctrine, also known as the Park doctrine, allows for criminal prosecution of individuals, typically high-ranking corporate executives of pharmaceutical companies, for violations of the Food, Drug and Cosmetic Act (FDCA), even absent any proof of the individual defendant’s knowledge of or participation in the violation. It is relevant to drinking water because the FDCA applies to bottled water, but not to tap water.

Arcobacter in Lake Erie Beach Waters: an Emerging Gastrointestinal Pathogen Linked with Human-Associated Fecal Contamination

Lee, C., et al., Applied and Environmental Microbiology, September 2012

The genus Arcobacter has been associated with human illness and fecal contamination by humans and animals. To better characterize the health risk posed by this emerging waterborne pathogen, we investigated the occurrence of Arcobacter spp. in Lake Erie beach waters. During the summer of 2010, water samples were collected 35 times from the Euclid, Villa Angela, and Headlands (East and West) beaches, located along Ohio’s Lake Erie coast. After sample concentration, Arcobacter was quantified by real-time PCR targeting the Arcobacter 23S rRNA gene. Other fecal genetic markers (Bacteroides 16S rRNA gene [HuBac], Escherichia coli uidA gene, Enterococcus 23S rRNA gene, and tetracycline resistance genes) were also assessed. Arcobacter was detected frequently at all beaches, and both the occurrence and densities of Arcobacter spp. were higher at the Euclid and Villa Angela beaches (with higher levels of fecal contamination) than at the East and West Headlands beaches. The Arcobacter density in Lake Erie beach water was significantly correlated with the human-specific fecal marker HuBac according to Spearman’s correlation analysis (r = 0.592; P < 0.001). Phylogenetic analysis demonstrated that most of the identified Arcobacter sequences were closely related to Arcobacter cryaerophilus, which is known to cause gastrointestinal diseases in humans. Since human-pathogenic Arcobacter spp. are linked to human-associated fecal sources, it is important to identify and manage the human-associated contamination sources for the prevention of Arcobacter-associated public health risks at Lake Erie beaches.

Sugar-Sweetened Beverages and Genetic Risk of Obesity

Qi, Q. PhD, et al., The New England Journal of Medicine, September 2012

Temporal increases in the consumption of sugar-sweetened beverages have paralleled the rise in obesity prevalence, but whether the intake of such beverages interacts with the genetic predisposition to adiposity is unknown. We analyzed the interaction between genetic predisposition and the intake of sugar-sweetened beverages in relation to body-mass index (BMI; the weight in kilograms divided by the square of the height in meters) and obesity risk in 6934 women from the Nurses’ Health Study (NHS) and in 4423 men from the Health Professionals Follow-up Study (HPFS) and also in a replication cohort of 21,740 women from the Women’s Genome Health Study (WGHS). The genetic-predisposition score was calculated on the basis of 32 BMI-associated loci. The intake of sugar-sweetened beverages was examined prospectively in relation to BMI. In the NHS and HPFS cohorts, the genetic association with BMI was stronger among participants with higher intake of sugar-sweetened beverages than among those with lower intake. In the combined cohorts, the increases in BMI per increment of 10 risk alleles were 1.00 for an intake of less than one serving per month, 1.12 for one to four servings per month, 1.38 for two to six servings per week, and 1.78 for one or more servings per day.

The Quality of Drinking Water in North Carolina Farmworker Camps

Bischoff, W.E., MD, PhD, et al., American Journal of Public Health, August 2012

The purpose of this study was to assess water quality in migrant farmworker camps in North Carolina and determine associations of water quality with migrant farmworker housing characteristics. Researchers collected data from 181 farmworker camps in eastern North Carolina during the 2010 agricultural season. Water samples were tested using the Total Coliform Rule (TCR) and housing characteristics were assessed using North Carolina Department of Labor standards. A total of 61 (34%) of 181 camps failed the TCR. Total coliform bacteria were found in all 61 camps, with Escherichia coli also being detected in 2. Water quality was not associated with farmworker housing characteristics or with access to registered public water supplies. Multiple official violations of water quality standards had been reported for the registered public water supplies. They concluded that water supplied to farmworker camps often does not comply with current standards and poses a great risk to the physical health of farmworkers and surrounding communities. Expansion of water monitoring to more camps and changes to the regulations such as testing during occupancy and stronger enforcement are needed to secure water safety.

Chemical mixtures in untreated water from public-supply wells in the U.S. — Occurrence, composition, and potential toxicity

Toccalino, P.L., Norman, J.E., Scott, J.C., Science of The Total Environment, August 2012

Chemical mixtures are prevalent in groundwater used for public water supply, but little is known about their potential health effects. As part of a large-scale ambient groundwater study, we evaluated chemical mixtures across multiple chemical classes, and included more chemical contaminants than in previous studies of mixtures in public-supply wells. We (1) assessed the occurrence of chemical mixtures in untreated source-water samples from public-supply wells, (2) determined the composition of the most frequently occurring mixtures, and (3) characterized the potential toxicity of mixtures using a new screening approach. The U.S. Geological Survey collected one untreated water sample from each of 383 public wells distributed across 35 states, and analyzed the samples for as many as 91 chemical contaminants. Concentrations of mixture components were compared to individual human-health benchmarks; the potential toxicity of mixtures was characterized by addition of benchmark-normalized component concentrations. Most samples (84%) contained mixtures of two or more contaminants, each at concentrations greater than one-tenth of individual benchmarks. The chemical mixtures that most frequently occurred and had the greatest potential toxicity primarily were composed of trace elements (including arsenic, strontium, or uranium), radon, or nitrate. Herbicides, disinfection by-products, and solvents were the most common organic contaminants in mixtures. The sum of benchmark-normalized concentrations was greater than 1 for 58% of samples, suggesting that there could be potential for mixtures toxicity in more than half of the public-well samples. Our findings can be used to help set priorities for groundwater monitoring and suggest future research directions for drinking-water treatment studies and for toxicity assessments of chemical mixtures in water resources.
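The screening approach described above, summing benchmark-normalized component concentrations and flagging sums greater than 1, is essentially a hazard-index calculation. A minimal sketch with hypothetical concentrations and benchmarks (not the USGS data):

```python
# Screening potential mixture toxicity by adding benchmark-normalized
# concentrations (a hazard-index-style sum). All values below are
# hypothetical, not from the USGS dataset described in the abstract.

def mixture_screen(sample, benchmarks):
    """Sum each contaminant's concentration divided by its health benchmark.

    A sum > 1 flags the mixture for closer study; components above 0.1 of
    their individual benchmark are also reported, mirroring the abstract's
    one-tenth-of-benchmark criterion for mixture occurrence."""
    ratios = {c: conc / benchmarks[c] for c, conc in sample.items()}
    return sum(ratios.values()), {c: r for c, r in ratios.items() if r > 0.1}

# Concentrations and benchmarks in ug/L (illustrative)
benchmarks = {"arsenic": 10.0, "uranium": 30.0, "nitrate_as_N": 10000.0}
sample = {"arsenic": 6.0, "uranium": 12.0, "nitrate_as_N": 4000.0}

total, notable = mixture_screen(sample, benchmarks)
print(round(total, 2), sorted(notable))
```

Here the hypothetical mixture sums to 1.4, so it would be flagged even though no single component exceeds its own benchmark, which is the point of the additive screen.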

Risk of Viral Acute Gastrointestinal Illness from Nondisinfected Drinking Water Distribution Systems

Lambertini, E., et al., Environmental Science and Technology, July 2012

Acute gastrointestinal illness (AGI) resulting from pathogens directly entering the piping of drinking water distribution systems is insufficiently understood. Here, we estimate AGI incidence from virus intrusions into the distribution systems of 14 nondisinfecting, groundwater-source, community water systems. Water samples for virus quantification were collected monthly at wells and households during four 12-week periods in 2006–2007. Ultraviolet (UV) disinfection was installed on the communities’ wellheads during one study year; UV was absent the other year. UV was intended to eliminate virus contributions from the wells and without residual disinfectant present in these systems, any increase in virus concentration downstream at household taps represented virus contributions from the distribution system (Approach 1). During no-UV periods, distribution system viruses were estimated by the difference between well water and household tap virus concentrations (Approach 2). For both approaches, a Monte Carlo risk assessment framework was used to estimate AGI risk from distribution systems using study-specific exposure–response relationships. Depending on the exposure–response relationship selected, AGI risk from the distribution systems was 0.0180–0.0661 and 0.001–0.1047 episodes/person-year estimated by Approaches 1 and 2, respectively. These values represented 0.1–4.9% of AGI risk from all exposure routes, and 1.6–67.8% of risk related to drinking water exposure. Virus intrusions into nondisinfected drinking water distribution systems can contribute to sporadic AGI.
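As a toy illustration of the Monte Carlo framework the authors describe: sample a daily virus dose, apply a dose-response relationship, and convert daily risk to annual risk. The exponential dose-response model and every parameter below are placeholder assumptions, not the study-specific exposure-response relationships used by Lambertini et al.:

```python
import math
import random

# Monte Carlo sketch of annual AGI/infection risk from virus exposure in
# drinking water. Distributions and parameters are illustrative placeholders,
# not the study-specific relationships from the abstract.

def simulate_annual_risk(n_trials=20_000, r=0.01, seed=1):
    """Return the mean annual probability of infection across trials."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        conc = rng.lognormvariate(-2.0, 1.0)        # viruses per liter (assumed)
        intake = rng.uniform(0.5, 2.0)              # liters of tap water per day
        p_day = 1.0 - math.exp(-r * conc * intake)  # exponential dose-response
        total += 1.0 - (1.0 - p_day) ** 365         # daily risk -> annual risk
    return total / n_trials

print(simulate_annual_risk())
```

Raising the dose-response parameter `r` (the assumed per-virus infectivity) raises the simulated annual risk, which is the sensitivity the study probed by comparing different exposure-response relationships.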

First Findings of the United Kingdom Fluid Intake Study

Gandy, J. PhD, RD, Nutrition Today, July/August 2012

Many factors influence the rising levels of obesity including changes in activity and dietary patterns. National dietary surveys are valuable tools for identifying sources of energy, including sugar, in a population. However, these surveys use diaries that aim to capture food intake rather than fluid intake and may underestimate beverage and therefore energy intakes. A study of 1456 children and adults was conducted in the United Kingdom using a 7-day fluid-specific intake diary. Total daily intakes of beverages were higher in all ages compared with previous surveys. However, 30% of adults and more than 50% of children did not meet European adequate intake for total water. Children consumed on average 175 kcal/d as still or carbonated soft drinks. This fluid-specific survey raises concerns about the total and type of fluids consumed by both adults and children in the United Kingdom.

 

Fluids, Water, and Nutrients and the Risk of Renal Diseases: Where We Stand and a Research Agenda

Strippoli, G.F.M. MD, PhD, MPH, MM, Nutrition Today, July/August 2012

Chronic kidney disease (CKD) is a major public health challenge. Despite identification of several established cardiovascular and renal risk factors and addressing them with multiple pharmacological interventions, people with CKD continue to die, and the rate of progression of kidney disease continues to increase. In this article, we review existing evidence on the role of fluid (total fluid including fluid from water and fluid from food) and nutrient intake and the risk of kidney disease and its progression and propose a research agenda for future studies in the area. It is plausible that water and nutrient intake is an easy-to-implement strategy to reduce the risk of CKD and its progression and adverse outcomes at a population level. Cross-sectional and prospective cohort studies first and subsequently randomized trials are needed to establish the strength of association between fluid/water/nutrient intake and risk of CKD and adverse outcomes and whether a causal link exists between these exposures and the adverse outcomes.

 

Promotion and Provision of Drinking Water in Schools for Overweight Prevention: Randomized, Controlled Cluster Trial

Muckelbauer, R. MSc, et al., Nutrition Today, July/August 2012

The prevention of childhood overweight is a major public health challenge. Intervention trials have shown that schools are a promising setting for overweight prevention. To date, no particular intervention has been proved to be effective in overweight prevention. This study showed that a simple intervention with the sole focus of promoting water consumption effectively prevented overweight among children in elementary schools in socially deprived urban areas. The study tested whether a combined environmental and educational intervention solely promoting water consumption was effective in preventing overweight among children in elementary school. The participants in this randomized, controlled cluster trial were second- and third-graders from 32 elementary schools in socially deprived areas of 2 German cities. Water fountains were installed and teachers presented 4 prepared classroom lessons in the intervention group schools (N = 17) to promote water consumption. Control group schools (N = 15) did not receive any intervention. The prevalence of overweight (defined according to the International Obesity Task Force criteria), BMI SD scores, and beverage consumption (in glasses per day; 1 glass was defined as 200 mL) self-reported in 24-hour recall questionnaires, were determined before (baseline) and after the intervention. In addition, the water flow of the fountains was measured during the intervention period of 1 school year (August 2006 to June 2007). Data on 2950 children (intervention group: N = 1641; control group: N = 1309; age, mean ± SD: 8.3 ± 0.7 years) were analyzed. After the intervention, the risk of overweight was reduced by 31% in the intervention group, compared with the control group, with adjustment for baseline prevalence of overweight and clustering according to school. Changes in BMI SD scores did not differ between the intervention group and the control group. 
Water consumption after the intervention was 1.1 glasses per day greater in the intervention group. No intervention effect on juice and soft drink consumption was found. Daily water flow of the fountains indicated lasting use during the entire intervention period, but to varying extent. Our environmental and educational, school-based intervention proved to be effective in the prevention of overweight among children in elementary school, even in a population from socially deprived areas.


 

Hydration Biomarkers During Daily Life: Recent Advances and Future Potential

Armstrong, L.E. PhD, Nutrition Today, July/August 2012

Human fluid-electrolyte regulation involves multiple neuroendocrine responses to the hourly loss and gain of body water. Such dynamic complexity complicates hydration assessment and minimizes the likelihood that any single biomarker will validly and precisely describe hydration status in all life situations. This article describes the biomarkers that are currently used during daily living to assess mild dehydration, plus recent advances in our understanding of invasive and noninvasive techniques. This article also suggests directions for future exploration of novel hydration indices, in the belief that superior biomarkers exist but have not been discovered.

 

Hydration Status in Active Youth

Kavouras, S. A. PhD, Arnaoutis, G. PhD, Nutrition Today, July/August 2012

Fluid balance is crucial for maintaining health. It is well documented that dehydration increases physiologic strain and decreases athletic performance, especially in hot environments. Although there are numerous studies evaluating hydration status in adults, limited data concerning hydration levels in athletic youth exist. Nevertheless, most of these studies clearly indicate that (a) dehydration is a major and common problem among children exercising in the heat; and (b) children do not have the capacity to translate hydration awareness to successful hydration strategies. Further research is needed, and constant efforts must be made toward the development of more efficient hydration strategies in order to educate young people about the benefits of optimal hydration status.

 

Effect of a 24-Hour Fluid Deprivation on Mood and Physiological Hydration Markers in Women

Pross, N. PhD, Nutrition Today, July/August 2012

The study aim was to evaluate the effect of an acute fluid deprivation (FD) on mood and physiological parameters. Twenty healthy women (aged 25 ± 3.5 years) participated in a randomized 2-period (dehydrated vs control) crossover study. In the FD period, the last water intake was between 6 PM and 7 PM, and no fluid intake was allowed up to 6 PM on the next day. The FD resulted in increased sleepiness and fatigue, decreased alertness, and increased confusion. In this rigorously controlled protocol, the early noninvasive markers of dehydration were a reduced urine volume, increased urine specific gravity, darker urine color, and increased thirst. Interestingly, dehydration also induced a significant increase in saliva osmolality at the end of the FD period. Plasma osmolality did not differ between experimental conditions.

 

Impact of Beverage Content on Health and the Kidneys

Johnson, R.J. MD, et al., Nutrition Today, July/August 2012

The last 50 years have witnessed an epidemic rise in obesity, diabetes, high blood pressure, and chronic kidney disease. Some animal research suggests the epidemic may in part be triggered by sugar. Sugar contains glucose and fructose, and studies suggest it is the fructose component that may have a role in chronic disease development. Animal studies indicate that fructose is distinct from other sugars by its ability to cause transient adenosine triphosphate (ATP) depletion in the cell with uric acid generation. The administration of fructose, or the raising of uric acid, can induce kidney disease and accelerate established kidney disease in animals. Therefore, we believe that the greatest risk from sugar is when it is given as a soft drink, as the rapidity of ingestion relates directly to the concentration of fructose that the cells are exposed to and hence governs the degree of ATP depletion and uric acid generation. Restricting sugar-sweetened beverages may be one strategy to combat obesity, diabetes, high blood pressure, and kidney disease, but human intervention studies are needed to support the theory.

 

Hydration: What Is Known and What Is Unknown

Rosenbloom, C. PhD RD, Nutrition Today, July/August 2012

In a 2007 article, Lawrence Armstrong of the Human Performance Institute at the University of Connecticut remarked that, when it came to assessing hydration status, we were still searching for the “elusive gold standard.” Hydration is one of those topics, like the weather, that everyone talks about but no one can make accurate predictions about. This seems to hold true for water in the form of rain… and also the water in our bodies that hydrates us. Nevertheless, athletes want to know exactly how much water they need under various exercise conditions, parents want to know if certain sugar-containing beverages will make their children obese, clinicians want to know how hydration affects chronic disease risk, and everyone wants to know if they should be carrying around a water bottle all day long! But all we can often do as nutrition scientists and exercise physiologists is to give general responses because the topic remains so “elusive.”

 

Hydration biomarkers and dietary fluid consumption of women

Armstrong, L., et al., Journal of the Academy of Nutrition and Dietetics, July 2012

Normative values and confidence intervals for the hydration indices of women do not exist. Also, few publications have precisely described the fluid types and volumes that women consume. This investigation computed seven numerical reference categories for widely used hydration biomarkers (e.g., serum and urine osmolality) and the dietary fluid preferences of self-reported healthy, active women. Participants (n=32; age 20±1 years; body mass 59.6±8.5 kg; body mass index [calculated as kg/m²] 21.1±2.4) were counseled in the methods to record daily food and fluid intake on 2 consecutive days. To reduce day-to-day body water fluctuations, participants were tested only during the placebo phase of the oral contraceptive pill pack. Euhydration was represented by the following ranges: serum osmolality=293 to 294 mOsm/kg; mean 24-hour total fluid intake=2,109 to 2,506 mL/24 hours; mean 24-hour total beverage intake=1,300 to 1,831 mL/24 hours; urine volume=951 to 1,239 mL/24 hours; urine specific gravity=1.016 to 1.020; urine osmolality=549 to 705 mOsm/kg; and urine color=5. However, only 3% of women experienced a urine specific gravity <1.005, and only 6% exhibited a urine color of 1 or 2. Water (representing 45.3% and 47.9% of 24-hour total fluid intake), tea, milk, coffee, and fruit juice were consumed in largest volumes. In conclusion, these data provide objective normative values for hyperhydration, euhydration, and dehydration that can be used by registered dietitians and clinicians to counsel women about their hydration status.
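The urine-specific-gravity band reported for euhydration (1.016 to 1.020) lends itself to a quick screening function. In the sketch below, only that band comes from the abstract; the other cutoffs are illustrative assumptions, not values from the study or clinical guidance:

```python
# Rough hydration screen from urine specific gravity (USG). Only the
# 1.016-1.020 euhydration band comes from the abstract; the remaining
# cutoffs are illustrative assumptions, not clinical guidance.

def classify_usg(usg):
    """Map a urine specific gravity reading to a coarse hydration label."""
    if usg < 1.005:
        return "possibly hyperhydrated"
    if usg < 1.016:
        return "adequately hydrated"
    if usg <= 1.020:
        return "euhydrated (reported reference band)"
    return "possibly dehydrated"

print(classify_usg(1.018))
```

A counseling tool would of course combine several of the biomarkers listed in the abstract rather than rely on a single index; this simply shows how a reference band becomes a decision rule.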

Observations of drinking water access in school food service areas before implementation of federal and state school water policy, California, 2011

Patel, A., et al., Preventing Chronic Disease, July 2012

Recent legislation requires schools to provide free drinking water in food service areas (FSAs). Our objective was to describe access to water at baseline and student water intake in school FSAs and to examine barriers to and strategies for implementation of drinking water requirements. We randomly sampled 24 California Bay Area public schools. We interviewed 1 administrator per school to assess knowledge of water legislation and barriers to and ideas for policy implementation. We observed water access and students’ intake of free water in school FSAs. Wellness policies were examined for language about water in FSAs. We found that fourteen of 24 schools offered free water in FSAs; 10 offered water via fountains, and 4 provided water through a nonfountain source. Four percent of students drank free water at lunch; intake at elementary schools (11%) was higher than at middle or junior high schools (6%) and high schools (1%). In secondary schools when water was provided by a nonfountain source, the percentage of students who drank free water doubled. Barriers to implementation of water requirements included lack of knowledge of legislation, cost, and other pressing academic concerns. No wellness policies included language about water in FSAs. We concluded that approximately half of schools offered free water in FSAs before implementation of drinking water requirements, and most met requirements through a fountain. Only 1 in 25 students drank free water in FSAs. Although schools can meet regulations through installation of fountains, more appealing water delivery systems may be necessary to increase students’ water intake at mealtimes.

French children start their school day with a hydration deficit

Bonnet, F., et al., Annals of Nutrition & Metabolism, June 2012

Fluid requirements of children vary as a function of gender and age. To our knowledge, there is very little literature on the hydration status of French children. We assessed the morning hydration status in a large sample of 529 French schoolchildren aged 9–11 years. Recruited children completed a questionnaire on fluid and food intake at breakfast and collected a urine sample the very same day after breakfast. Breakfast food and fluid nutritional composition was analyzed and urine osmolality was measured using a cryoscopic osmometer. More than a third of the children had a urine osmolality between 801 and 1,000 mosm/kg while 22.7% had a urine osmolality over 1,000 mosm/kg. This was more frequent in boys than in girls (p < 0.001). A majority of children (73.5%) drank less than 400 ml at breakfast. Total water intake at breakfast was significantly and inversely correlated with high osmolality values. It was concluded that almost two thirds of the children in this large cohort had evidence of a hydration deficit when they went to school in the morning, despite breakfast intake. Children’s fluid intake at breakfast does not suffice to maintain an adequate hydration status for the whole morning.

Lead (Pb) in Tap Water and in Blood: Implications for Lead Exposure in the United States

Triantafyllidou, S. and Edwards, M., Critical Reviews in Environmental Science and Technology, June 2012

Lead is widely recognized as one of the most pervasive environmental health threats in the United States, and there is increased concern over adverse health impacts at levels of exposure once considered safe. Lead contamination of tap water was once a major cause of lead exposure in the United States and, as other sources have been addressed, the relative contribution of lead in water to lead in blood is expected to become increasingly important. Moreover, prior research suggests that lead in water may be more important as a source than is presently believed. The authors describe sources of lead in tap water, chemical forms of the lead, and relevant U.S. regulations/guidelines, while considering their implications for human exposure. Research that examined associations between water lead levels and blood lead levels is critically reviewed, and some of the challenges in making such associations, even if lead in water is the dominant source of lead in blood, are highlighted. Better protecting populations at risk from this and from other lead sources is necessary, if the United States is to achieve its goal of eliminating elevated blood lead levels in children by 2020.

Protective Nutrients: Are They Here to Stay?

Walker, W.A. MD, Heintz, K. MS, Nutrition Today, May/June 2012

Protective nutrients benefit health in various ways beyond their conventionally established nutrient function such as by enhancing immune function, promoting gastrointestinal integrity, impacting metabolism, and preventing disease. Certain of these key nutrients have taken center stage as emerging research is showing that they can play a significant role throughout the life span. Study of an infant’s first natural nutrition, breast milk, has led to an improved understanding of how different compounds can beneficially affect physiological processes and act as protective nutrients. Probiotics, or “healthy bacteria,” are living microorganisms that confer a benefit when consumed in sufficient quantities. For example, certain strains help maintain the balance of the intestinal microbiota, a complex ecosystem that can be influenced by many factors such as stress, antibiotics, and diet. Research suggests that, when the intestinal microbiota is unbalanced, overall health may be affected. Prebiotics are nondigestible carbohydrates that can be used as an energy source by certain probiotics, thereby helping them grow and flourish to further promote a healthy ecosystem. Additional nutrients such as choline, vitamin D, and omega-3 fatty acids have also gained attention as being protective beyond normal growth and development, possessing functional effects that may be vital to future recommendations for health.


Screening-Level Risk Assessment of Coxiella burnetii (Q Fever) Transmission via Aeration of Drinking Water

Sales-Ortells, H., Medema, G., Environmental Science and Technology, April 2012

A screening-level risk assessment was performed of Q fever transmission through drinking water produced, with aeration, from groundwater in the vicinity of infected goat barnyards. Quantitative data from the scientific literature were collected and a Quantitative Microbial Risk Assessment (QMRA) approach was followed. An exposure model was developed to calculate the dose to which consumers of aerated groundwater are exposed through aerosol inhalation during showering. The exposure assessment and hazard characterization were integrated in a screening-level risk characterization using a dose-response model for inhalation to determine the risk of Q fever through tap water. A nominal range sensitivity analysis was performed. The estimated risk of disease was lower than 10^-4 per person per year (pppy); hence, the risk of transmission of C. burnetii through inhalation of drinking water aerosols is very low. The sensitivity analysis shows that the most uncertain parameters are the aeration process, the transport of C. burnetii in bioaerosols via the air, the aerosolization of C. burnetii in the shower, and the air filtration efficiency. Compared with direct airborne exposure of persons in the vicinity of infected goat farms, the relative risk of exposure through inhalation of drinking water aerosols was 0.002%.
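The QMRA chain the abstract describes (per-event exposure dose, dose-response model, annual risk) can be sketched generically. The exponential dose-response model below is one standard QMRA form, and every parameter value here is an illustrative assumption for the sketch, not a figure taken from the study.

```python
import math

def exponential_dose_response(dose, r):
    """Probability of infection from one exposure under the
    exponential dose-response model: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(per_event_risk, events_per_year=365):
    """Annual risk from repeated, independent exposure events."""
    return 1.0 - (1.0 - per_event_risk) ** events_per_year

# Hypothetical values chosen only to show the arithmetic:
dose_per_shower = 1e-6   # organisms inhaled per shower (assumed)
r = 0.9                  # exponential model parameter (assumed)

p_event = exponential_dose_response(dose_per_shower, r)
p_year = annual_risk(p_event)
print(p_event, p_year)
```

With a real assessment, the dose term would itself be a model (aerosolization rate, partitioning to inhalable aerosols, breathing rate, shower duration), which is where the sensitivity analysis in the study concentrated.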

Waterborne Pathogens: Emerging Issues in Monitoring, Treatment and Control

Reynolds, K.A., MSPH, Ph.D., Water Conditioning & Purification, March 2012

Microbial threats to water quality continue to emerge; however, technologies for monitoring, treating and controlling emerging waterborne pathogens are also evolving. Understanding the range of factors that lead to the contamination of water is important for developing appropriate tools to manage human health risks.

Health Risks of Limited-Contact Water Recreation

Dorevitch, S., et al., Environmental Health Perspectives, February 2012

Wastewater-impacted waters that do not support swimming are often used for boating, canoeing, fishing, kayaking, and rowing. Little is known about the health risks of these limited-contact water recreation activities. We evaluated the incidence of illness, severity of illness, associations between water exposure and illness, and risk of illness attributable to limited-contact water recreation on waters dominated by wastewater effluent and on waters approved for general use recreation (such as swimming). The Chicago Health, Environmental Exposure, and Recreation Study was a prospective cohort study that evaluated five health outcomes among three groups of people: those who engaged in limited-contact water recreation on effluent-dominated waters, those who engaged in limited-contact recreation on general-use waters, and those who engaged in non–water recreation. Data analysis included survival analysis, logistic regression, and estimates of risk for counterfactual exposure scenarios using G-computation. Telephone follow-up data were available for 11,297 participants. With non–water recreation as the reference group, we found that limited-contact water recreation was associated with the development of acute gastrointestinal illness in the first 3 days after water recreation at both effluent-dominated waters [adjusted odds ratio (AOR) 1.46; 95% confidence interval (CI): 1.08, 1.96] and general-use waters (1.50; 95% CI: 1.09, 2.07). For every 1,000 recreators, 13.7 (95% CI: 3.1, 24.9) and 15.1 (95% CI: 2.6, 25.7) cases of gastrointestinal illness were attributable to limited-contact recreation at effluent-dominated waters and general-use waters, respectively. Eye symptoms were associated with use of effluent-dominated waters only (AOR 1.50; 95% CI: 1.10, 2.06). Among water recreators, our results indicate that illness was associated with the amount of water exposure. 
Limited-contact recreation, both on effluent-dominated waters and on waters designated for general use, was associated with an elevated risk of gastrointestinal illness.

Planning for Sustainability: A Handbook for Water and Wastewater Utilities

U.S. Environmental Protection Agency, February 2012

This handbook is intended to provide information about how to enhance current planning processes by building in sustainability considerations. It is designed to be useful for various types and scales of planning efforts, such as long-range integrated water resource planning; strategic planning; capital planning; system-wide planning to meet regulatory requirements (e.g., combined sewer overflow upgrades and new stormwater permitting requirements); and specific infrastructure project planning (e.g., for repair, rehabilitation, or replacement of specific infrastructure).

Replacing caloric beverages with water or diet beverages for weight loss in adults: main results of the Choose Healthy Options Consciously Everyday (CHOICE) randomized clinical trial

Tate, D.F., et al., The American Journal of Clinical Nutrition, February 2012

Replacement of caloric beverages with noncaloric beverages may be a simple strategy for promoting modest weight reduction; however, the effectiveness of this strategy is not known. We compared the replacement of caloric beverages with water or diet beverages (DBs) as a method of weight loss over 6 mo in adults and attention controls (ACs). Results: In an intent-to-treat analysis, a significant reduction in weight and waist circumference and an improvement in systolic blood pressure were observed from 0 to 6 mo. Mean (±SEM) weight losses at 6 mo were −2.5 ± 0.45% in the DB group, −2.03 ± 0.40% in the Water group, and −1.76 ± 0.35% in the AC group; there were no significant differences between groups. The chance of achieving a 5% weight loss at 6 mo was greater in the DB group than in the AC group (OR: 2.29; 95% CI: 1.05, 5.01; P = 0.04). A significant reduction in fasting glucose at 6 mo (P = 0.019) and improved hydration at 3 (P = 0.0017) and 6 (P = 0.049) mo were observed in the Water group relative to the AC group. In a combined analysis, participants assigned to beverage replacement were 2 times as likely to have achieved a 5% weight loss (OR: 2.07; 95% CI: 1.02, 4.22; P = 0.04) as were the AC participants. Conclusions: Replacement of caloric beverages with noncaloric beverages as a weight-loss strategy resulted in average weight losses of 2% to 2.5%. This strategy could have public health significance and is a simple, straightforward message. This trial was registered at clinicaltrials.gov as NCT01017783.

Mild Dehydration Affects Mood in Healthy Young Women

Armstrong, L., et al., The Journal of Nutrition, February 2012

Limited information is available regarding the effects of mild dehydration on cognitive function. Therefore, mild dehydration was produced by intermittent moderate exercise without hyperthermia and its effects on cognitive function of women were investigated. Twenty-five females (age 23.0 ± 0.6 y) participated in three 8-h, placebo-controlled experiments involving a different hydration state each day: exercise-induced dehydration with no diuretic (DN), exercise-induced dehydration plus diuretic (DD; furosemide, 40 mg), and euhydration (EU). Cognitive performance, mood, and symptoms of dehydration were assessed during each experiment, 3 times at rest and during each of 3 exercise sessions. The DN and DD trials in which a volunteer attained a ≥1% level of dehydration were pooled and compared to that volunteer’s equivalent EU trials. Mean dehydration achieved during these DN and DD trials was −1.36 ± 0.16% of body mass. Significant adverse effects of dehydration were present at rest and during exercise for vigor-activity, fatigue-inertia, and total mood disturbance scores of the Profile of Mood States and for task difficulty, concentration, and headache as assessed by questionnaire. Most aspects of cognitive performance were not affected by dehydration. Serum osmolality, a marker of hydration, was greater in the mean of the dehydrated trials in which a ≥1% level of dehydration was achieved (P = 0.006) compared to EU. In conclusion, degraded mood, increased perception of task difficulty, lower concentration, and headache symptoms resulted from 1.36% dehydration in females. Increased emphasis on optimal hydration is warranted, especially during and after moderate exercise.

Atrazine Exposure in Public Drinking Water and Preterm Birth

Rinsky, J.L., et al., Public Health Reports, January/February 2012

Approximately 13% of all births occur prior to 37 weeks gestation in the U.S. Some established risk factors exist for preterm birth, but the etiology remains largely unknown. Recent studies have suggested an association with environmental exposures. We examined the relationship between preterm birth and exposure to a commonly used herbicide, atrazine, in drinking water. We reviewed Kentucky birth certificate data for 2004-2006 to collect duration of pregnancy and other individual-level covariates. We assessed existing data sources for atrazine levels in public drinking water for the years 2000-2008, classifying maternal county of residence into three atrazine exposure groups. We used logistic regression to analyze the relationship between atrazine exposure and preterm birth, controlling for maternal age, race/ethnicity, education, smoking, and prenatal care. An increase in the odds of preterm birth was found for women residing in the counties included in the highest atrazine exposure group compared with women residing in counties in the lowest exposure group, while controlling for covariates. Analyses using the three exposure assessment approaches produced odds ratios ranging from 1.20 (95% confidence interval [CI] 1.14, 1.27) to 1.26 (95% CI 1.19, 1.32), for the highest compared with the lowest exposure group. Suboptimal characterization of environmental exposure and variables of interest limited the analytical options of this study. Still, our findings suggest a positive association between atrazine and preterm birth, and illustrate the need for an improved assessment of environmental exposures to accurately address this important public health issue.

Source Water Protection Vision and Roadmap

Water Research Foundation, January 2012

In 2007, a group of source water protection experts met, under the auspices of the Water Research Foundation and the Water Environment Research Foundation, to develop a research agenda that would ultimately provide information to help drinking water suppliers design and implement effective source water protection programs. A key result of that effort identified the need for a national vision and roadmap that would guide U.S. water utilities and supporting groups with a unified strategy for coherent, consistent, cost-effective, and socially-acceptable source water protection programs. This brief document presents the vision and roadmap and focuses on how to move forward on source water protection. The roadmap is intended to serve as a feasible, focused path toward promoting source water protection for U.S. drinking water utilities. It is not intended to serve as an official directive, but rather is a collection of observations and recommendations organized to form a path to achieving the vision. The companion document Developing a Vision and Roadmap for Drinking Water Source Protection comprehensively covers the project team’s findings regarding the various building blocks to make source water protection a reality. That document includes an annotated bibliography of source water protection resources, a summation of a literature review, and helpful water utility case studies. Both documents are meant to be used in concert to help water utilities move forward with their source water protection efforts and proactively improve and/or maintain the quality of their drinking water sources. Source water protection has been discussed and promoted in an ad hoc fashion by different organizations at the national, regional, state, and local levels. It is essential to increase the awareness of source water protection at the national level. 
Education of decision makers, utility managers, stakeholders, and the general public should be the first step in moving source water protection up a path to success. Leadership is needed to make this a national priority. In order to ensure the various actions recommended in the roadmap can be carried out, it is recommended that both a top-down and a bottom-up approach be taken. A top-down approach would establish a flexible framework to guide local entities (e.g., water systems, watershed organizations, and regional planning agencies) to work together to protect source water. Due to the variability of source waters and the areas from which they are derived, along with technical, social, political, financial, and regulatory differences across jurisdictions, it is unlikely that two source water protection programs would be the same. A bottom-up approach is therefore also needed, which would use local information and broad stakeholder involvement to produce a “tailored” source water protection program that addresses unique issues at the local level.

Migration of Bisphenol-A into the Natural Spring Water Packaged in Polycarbonate Carboys

Erdem, Y.K., Furkan, A., International Journal of Applied Science and Technology, January 2012

Bisphenol-A is a chemical widely used worldwide in epoxy resins, polycarbonate packaging, and the lacquer coatings of metal food packaging. Its weak estrogenic character and possible health effects are well known. For this reason, the use of bisphenol-A in food packaging is limited and daily human intake is strictly controlled: the declared specific migration limit is 0.6 ppm, and the tolerable daily intake set by EFSA and other authorities is 0.05 mg/kg body weight per day. EFSA and other authorities banned the manufacture and use of bisphenol-A in baby bottles in 2010. In Turkey, 70% of the population lives in 5 metropolitan cities, and drinking water consumption is mostly supplied by the packaged drinking water industry; household and bulk usage is covered by natural spring and natural mineral water packaged in 19-liter polycarbonate carboys. We therefore investigated the possible migration of bisphenol-A into drinking water packaged in polycarbonate carboys. First, a screening test was carried out on samples supplied from two main cities. Then, 5 different trademarks of packaged water were stored at 4, 25, and 35 °C for 60 days and bisphenol-A content was determined at given intervals. BPA migration was at least 450 times lower than the EFSA specific migration limit throughout 60 days of storage under these conditions.

What is the cell hydration status of healthy children in the USA? Preliminary data on urine osmolality and water intake

Stookey, J.D., Brass, B., Holliday, A., Arieff, A., Public Health Nutrition, January 2012

Hyperosmotic stress on cells limits many aspects of cell function, metabolism and health. International data suggest that schoolchildren may be at risk of hyperosmotic stress on cells because of suboptimal water intake. The present study explored the cell hydration status of two samples of children in the USA. Elevated urine osmolality (>800 mmol/kg) was observed in 63% and 66% of participants in LA and NYC, respectively. In multivariable-adjusted logistic regression models, elevated urine osmolality was associated with not reporting intake of drinking water in the morning (LA: OR = 2.1, 95% CI 1.2, 3.5; NYC: OR = 1.8, 95% CI 1.0, 3.5). Although over 90% of both samples had breakfast before giving the urine sample, 75% did not drink water. Research is warranted to confirm these results and pursue their potential health implications.

Bottled Water & Tap Water: Just the Facts

Drinking Water Research Foundation, October 2011

The information presented in this report supports the fact that drinking water, whether from the tap or a bottle, is generally safe, and that regulatory requirements for both tap water and bottled water provide Americans with clean, safe drinking water. There are some differences in regulations for each, but those differences highlight the differences between drinking water delivered by a public water system and drinking water delivered to the consumer in a sealed container. Perhaps the most notable difference between tap water and bottled water is the method of delivery. Community water systems deliver water to consumers (businesses and private residences) through miles of underground iron (unlined and poly-lined), PVC, and lead service lines that can be subject to leakage with age of the system and accidental failures, resulting in the risk of post-treatment contamination of the water that is delivered to consumers. Bottled water is delivered to consumers in sanitary, sealed containers that were filled in a bottling facility under controlled conditions in a fill room.

The effects of water shortages on health and human development

Tarrass, F., and Benjelloun, M., Perspectives in Public Health, April 2011

Shortages of water could become a major obstacle to public health and development. Currently, the United Nations Children’s Fund (UNICEF) and the World Health Organization (WHO) estimate that 1.1 billion people lack access to a water supply and 2.6 billion people lack adequate sanitation. The global health burden associated with these conditions is staggering, with an estimated 1.6 million deaths every year from diseases associated with lack of access to safe drinking water, inadequate sanitation and poor hygiene. In this paper we review the impact of water shortages on health and human development.

Water, Water Everywhere… But, How Much Water Do We Really Need for Optimal Health and Wellness?

Rosenbloom, C. PhD, RD, CSSD, Nutrition Today, November/December 2010

“Water, taken in moderation, cannot hurt anybody” is a quote attributed to US author and humorist Mark Twain. The series of articles in this special issue suggest that water in moderation may not be enough for optimal health and wellness, and the authors push the boundaries of what is currently known about water in maintaining health and preventing disease. The Hydration for Health Conference, sponsored by Danone Waters, brought together international experts to review what is known about water consumption and health or, more appropriately, what is not known about water intake and health. One of the key drivers of the interest in water and health appears to be the global obesity epidemic affecting developed and developing countries, young and old alike, but there are other health problems that might be reduced or eliminated if optimal water consumption was known and practiced.

The Mexican Experience: From Public Health Concern Toward National Beverage Guidelines

Barquera, S. MD, PhD, Nutrition Today, November/December 2010

The paper describes the process experienced in Mexico from the characterizations of beverage consumption to the development of national beverage recommendation guidelines. Mexico is one of the countries with the highest prevalence of obesity in the world. Depending on the information source, it is often ranked as second after the United States. In addition, Mexicans are the second greatest consumers of soft drinks in the world. Currently, there is some ecological evidence that associates the trends in soft-drink consumption and overall diet with the increase in the prevalence of obesity.

Healthy Hydration for Physical Activity

Péronnet, F. PhD, Nutrition Today, November/December 2010

Water is the first ingredient of life. In the comfortable environment in which we live, with an ample supply of water, we forget that our ancestors lived in an environment where water was scarce, and the weather was hot. We therefore developed a very powerful cooling system in which water plays a major role. The importance of this system is best illustrated when we are exposed to exercise and heat, separately, and even more so when both are combined. In these situations, the primary way to get rid of the heat generated or received from the environment is through the secretion and evaporation of sweat, which is mainly water. Thanks to this cooling system, we can sustain prolonged exposures to heat and we can work in the heat. However, if not properly replaced, fluid loss in the form of sweat results in dehydration. This reduces the ability to regulate body temperature as well as the ability to perform exercise. Under extreme circumstances, which fortunately are not often encountered, dehydration and the increase in body temperature can result in heat stroke, which could be fatal.


Understanding Fluid Consumption Patterns to Improve Healthy Hydration

Le Bellego, L. PhD, et al., Nutrition Today, November/December 2010

Water is quantitatively by far the No. 1 nutrient in our diet. Of course, this can vary, depending on the amount and the quality of food and drink one consumes, but approximately 50% of what we eat and drink every day is water (CIQUAL, Table CIQUAL 2008, composition nutritionnelle des aliments, 2008, Centre d’Information sur la Qualité des Aliments, http://www.afssa.fr/TableCIQUAL/; US Department of Agriculture, Agricultural Research Service, 2005, USDA National Nutrient Database for Standard Reference, Release 18. Nutrient Data Laboratory Home Page, http://www.nal.usda.gov/fnic/foodcomp; NUTTAB, 2006, Food Standards Australia New Zealand [FSANZ], http://www.foodstandards.gov.au/monitoringandsurveillance/nuttab2006/). It is also the No. 1 component of the human body by mass. This varies from one person to another, depending on individual characteristics such as body weight, ratio between lean and adipose tissues, and physiological state (pregnancy, etc), but approximately 60% of the adult body is composed of water. [Nutr Rev 2005;63(6 pt 2):S40-S54]. Finally, no biological reaction or function in the body would be possible without water. In other words, life is not possible without water. This makes the quantity and the quality of the fluids we have to drink every day quite an important issue both nutritionally and physiologically. From this perspective, it is interesting to discuss available recommendations for water intake and their reliability. This is very challenging, because no study is available on the long-term health effects of the quantity and/or the quality of fluids ingested.


Role of Sugar Intake in Beverages on Overweight and Health

Lafontan, M. PhD, Nutrition Today, November/December 2010

Epidemiological data have demonstrated an association between sugar intake in beverages and overweight. Cross-sectional studies are the most common but rather limited, and a lot of points are still a matter of debate. Results of intervention trials are more promising, although they remain quite rare; they provide the best arguments to infer causality. This overview is limited to the analysis of the putative impact of sugar inclusion in beverages on health, obesity, and diabetes risk. Mechanisms of action and physiological end points are highlighted to clarify the differences existing in the health impact of various kinds of sugars. When considering weight changes and obesity-related questions related to sugar-sweetened beverages consumption, it is important to take into account population differences and genetic parameters. Lifestyle influences (eg, other components of the diet and physical activity) must also be considered in the studies.

Hydration and Human Cognition

Lieberman, H.R. PhD, Nutrition Today, November/December 2010

Although adequate hydration is essential for optimal brain function, research addressing relationships between hydration status and human behavior and cognitive function is limited. The few published studies in this area are inconclusive and contradictory. The impact of variations in hydration status, which can be substantial as humans go about their daily activities, on brain function and behavior is not known and may impact quality of life. Furthermore, vulnerable populations such as children, elderly people, and individuals with illnesses may be at higher risk of degradation in cognitive function from dehydration. A variety of difficult methodological issues have impeded progress in this area. For example, there are several methods to achieve dehydration in humans, each with different strengths and weaknesses. Accurately assessing and modifying human hydration status and consistently achieving desired levels of dehydration in a controlled manner are problematic. It is difficult to select appropriate behavioral tasks that detect relatively subtle changes in cognitive performance and mood resulting from moderate levels of dehydration. Generating experimental designs that include hydrated control conditions and double-blind testing poses substantial challenges to investigators. Additional well-controlled research is essential if progress is to be made and understanding gained of the effects of dehydration on cognitive function. Key elements of research should include accurate methods of assessing and modifying hydration state, an adequate number of subjects, appropriate behavioral tasks to detect subtle effects of dehydration, and inclusion of rigorous control conditions.

Drinking Water and Weight Management

Stookey, J.D. PhD, Nutrition Today, November/December 2010

This review summarizes the evidence base for recommending drinking water for weight management. Crossover experiments consistently report that drinking water results in lower total energy intake when consumed instead of caloric beverages, because individuals do not eat less food to compensate for calories in beverages. Crossover experiments also consistently report that drinking water results in greater fat oxidation compared with other beverages, because drinking water does not stimulate insulin. In intervention studies, advice to drink water is associated with reduced weight gain in children and greater weight loss in dieting adults. Although gaps in knowledge remain about specific effects of drinking water on weight loss in children and obesity prevention in adults, there is a strong evidence base for recommending drinking water for weight management.

Water Physiology: Essentiality, Metabolism, and Health Implications

Kavouras, S.A. PhD, Anastasiou, C.A. PhD, Nutrition Today, November/December 2010

Water is the most abundant molecule in the human body that undergoes continuous recycling. Numerous functions have been recognized for body water, including its function as a solvent, as a means to remove metabolic heat, and as a regulator of cell volume and overall function. Tight control mechanisms have evolved for precise control of fluid balance, indicative of its biological importance. However, water is frequently overlooked as a nutrient. This article reviews the basic elements of water physiology in relation to health, placing emphasis on the assessment of water requirements and fluid balance. Current recommendations are also discussed.

Effects of Water Consumption on Kidney Function and Excretion

Tack, I., Nutrition Today, November/December 2010

Water homeostasis depends on fluid intake and maintenance of body water balance by adjustment of renal excretion under the control of arginine vasopressin hormone. The human kidney manages fluid excess more efficiently than fluid deficit. As a result, no overhydration is observed in healthy individuals drinking a large amount of fluid, whereas a mild hydration deficit is not uncommon in small-fluid-volume (SFV) drinkers. Small-fluid-volume intake does not alter renal function but is associated with an increased risk of renal lithiasis and urinary tract infection. In that case, increasing fluid intake prevents recurrence. The benefit of increasing fluid intake in healthy SFV drinkers had never been studied until now. Two recent studies from Danone Research indicate that increasing water intake in such people leads to a significant decrease of the risk of renal stone disease (assessed by measuring Tiselius’ crystallization risk index). Because renal lithiasis and urinary tract infection prevalence are quite high in western countries, this preliminary observation supports the value of an approach based on primary prevention using a voluntary increase in water-based fluid consumption in SFV drinkers. Complementary studies are required to determine other clinical impacts of SFV intake and to evaluate the benefits of increasing fluid intake.


Bromate reduction in simulated gastric juice

Cotruvo, J.A., et al., e-Journal AWWA, November 2010

This article advocates for a revised risk assessment for bromate to reflect presystemic chemistry not usually considered when low-dose risks are calculated from high-dose toxicology data. Because of high acidity and the presence of reducing agents, presystemic decomposition of bromate can begin in the stomach, which should contribute to lower-than-expected doses to target organs. In this research, bromate decomposition kinetics with simulated stomach/gastric juice were studied to determine the risk of environmentally relevant exposure to bromate. The current work is the first step in a series of studies that the authors are conducting to better estimate the hypothetical low-dose risks to humans from drinking water ingestion and thus arrive at more appropriate maximum contaminant levels (MCLs). It is the authors’ belief that additional kinetics and metabolism research will demonstrate that the human risk from ingestion of compounds in drinking water is less than originally believed and will lead to MCLs and MCL goals that are more scientifically based.

Drinking Water and Risk of Stroke

Gustavo Saposnik, MD, MSc, FAHA, Stroke, October 2010

In the present issue of Stroke, the authors investigate the association between low-level arsenic exposure in drinking water and ischemic stroke admissions in Michigan. They found that even low exposure to arsenic is associated with an increased incident risk of stroke (relative risk, 1.03; 95% CI, 1.01 to 1.05 per µg/L increase in arsenic concentration). The authors also examined whether that exposure was associated with other nonvascular conditions (hernia, duodenal ulcer) not expected to show increased risk. Comparing zip codes in Genesee County at the 90th percentile of arsenic levels (21.6 µg/L) with those at the 10th percentile (0.30 µg/L), there was a 91% increase in risk of stroke admission (relative risk, 1.91; 95% CI, 1.27 to 2.88). The results were consistent in showing an increased risk for stroke but not for the control medical conditions (hernia and duodenal ulcer). Moreover, they found a graded effect: higher incident risk among individuals exposed to higher water concentrations of arsenic (Figure 2).
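The percentile comparison follows from compounding the per-µg/L relative risk over the difference in exposure. A minimal sketch, assuming the log-linear model typical of such regressions (the function name is mine; using the rounded per-µg/L estimate of 1.03 gives roughly 1.88, and the reported 1.91 presumably reflects the unrounded coefficient):

```python
def extrapolated_rr(rr_per_unit, low, high):
    """Compound a per-unit relative risk over an exposure difference.

    Assumes a log-linear dose-response:
    RR(high vs. low) = rr_per_unit ** (high - low).
    """
    return rr_per_unit ** (high - low)

# 90th percentile (21.6 ug/L) vs. 10th percentile (0.30 ug/L) of arsenic
rr = extrapolated_rr(1.03, 0.30, 21.6)  # roughly 1.88
```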

Association between children’s blood lead levels, lead service lines, and water disinfection

Brown, M.J., Raymond, J., Homa, D., Kennedy, C., Sinks, T., Environmental Research, October 2010

The objective was to evaluate the effect of changes in the water disinfection process, and of the presence of lead service lines (LSLs), on children's blood lead levels (BLLs) in Washington, DC. Three cross-sectional analyses examined the relationship of LSLs and changes in water disinfectant with BLLs in children <6 years of age. The study population was derived from the DC Childhood Lead Poisoning Prevention Program blood lead surveillance system of children who were tested and whose blood lead test results were reported to the DC Health Department. The Washington, DC Water and Sewer Authority (WASA) provided information on LSLs. The final study population consisted of 63,854 children with validated addresses. Controlling for age of housing, LSL was an independent risk factor for BLLs ≥10 µg/dL and ≥5 µg/dL, even during time periods when water lead levels met the US Environmental Protection Agency (EPA) action level of 15 parts per billion (ppb). When chloramine alone was used to disinfect water, the risk of a BLL in the highest quartile among children in homes with LSLs was greater than when either chlorine or chloramine with orthophosphate was used. For children tested after the LSLs in their houses were replaced, those with partially replaced LSLs were >3 times as likely to have BLLs ≥10 µg/dL as children who never had LSLs. LSLs were a risk factor for elevated BLLs even when WASA met the EPA water action level. Changes in water disinfection can enhance the effect of LSLs and increase lead exposure. Partially replacing LSLs may not decrease the risk of elevated BLLs associated with LSL exposure.

When is the Next Boil Water Alert?

Water Technology, August 2010

A common theme we see on a daily basis relates to drinking water infrastructure. We track news throughout the world that impacts the drinking water industry, and one of the most frequent items we see is a notice from an agency or organization about the need for a community to boil its water to combat possible contamination. In some parts of the world, boiling water is the norm due to water supply issues; these areas are often limited in their ability to develop economically, as clean water is such an integral part of daily life. It is in the developed world, however, that we have been seeing a large increase in the number of such notices.

Climate Change, Water, and Risk: Current Water Demands Are Not Sustainable

www.nrdc.org, July 2010

Climate change will have a significant impact on the sustainability of water supplies in the coming decades. A new analysis, performed by the consulting firm Tetra Tech for the Natural Resources Defense Council (NRDC), examined the effects of global warming on water supply and demand in the contiguous United States. The study found that more than 1,100 counties, one-third of all counties in the lower 48, will face higher risks of water shortages by mid-century as a result of global warming. More than 400 of these counties will face extremely high risks of water shortages.

Scientific Opinion on Dietary Reference Values for water

European Food Safety Authority (EFSA), EFSA Journal, March 2010

This Opinion of the EFSA Panel on Dietetic Products, Nutrition, and Allergies (NDA) deals with the setting of dietary reference values for water for specific age groups. Adequate Intakes (AIs) have been derived from a combination of observed intakes in population groups with desirable urine osmolarity values and desirable water volumes per unit of energy consumed. The reference values for total water intake include water from drinking water, beverages of all kinds, and food moisture, and apply only to conditions of moderate environmental temperature and moderate physical activity (PAL 1.6). AIs for infants in the first half of the first year of life are estimated at 100-190 mL/kg per day. For infants 6-12 months of age, a total water intake of 800-1000 mL/day is considered adequate. For the second year of life, an adequate total water intake of 1100-1200 mL/day is defined by interpolation, as intake data are not available. AIs of water for children are estimated at 1300 mL/day for boys and girls 2-3 years of age; 1600 mL/day for boys and girls 4-8 years of age; 2100 mL/day for boys 9-13 years of age; and 1900 mL/day for girls 9-13 years of age. Adolescents 14 years and older are treated as adults with respect to adequate water intake. Available data for adults permit the definition of AIs as 2.0 L/day (95th percentile 3.1 L) for females and 2.5 L/day (95th percentile 4.0 L) for males; the same AIs are defined for the elderly. For pregnant women, the same water intake as for non-pregnant women, plus an increase in proportion to the increase in energy intake (300 mL/day), is proposed. For lactating women, adequate water intakes of about 700 mL/day above the AIs of non-lactating women of the same age are derived.
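The EFSA reference values above lend themselves to a simple lookup. A hedged sketch (the table and function names are mine; the mL/day values and the +300/+700 mL adjustments are taken from the abstract, with the adult AIs applied from age 14 upward and to the elderly):

```python
# EFSA Adequate Intakes for total water, mL/day (values from the abstract above)
ADEQUATE_INTAKE_ML = {
    ("2-3", "any"): 1300,
    ("4-8", "any"): 1600,
    ("9-13", "male"): 2100,
    ("9-13", "female"): 1900,
    ("14+", "male"): 2500,   # adolescents 14+, adults, and the elderly share the adult AI
    ("14+", "female"): 2000,
}

def adequate_intake(age_group, sex="any", pregnant=False, lactating=False):
    """Return the AI in mL/day, applying the abstract's pregnancy/lactation add-ons."""
    ai = ADEQUATE_INTAKE_ML[(age_group, sex)]
    if pregnant:
        ai += 300   # proportional to the proposed increase in energy intake
    if lactating:
        ai += 700   # above the AI of non-lactating women of the same age
    return ai
```

For example, a lactating adult woman would be assigned 2000 + 700 = 2700 mL/day under these values.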

Water as an essential nutrient: the physiological basis of hydration

Jéquier, E. and Constant, F., EJCN – European Journal of Clinical Nutrition, September 2009

How much water we really need depends on water's functions in the body and on the mechanisms of daily water balance regulation. The aim of this review is to describe the physiology of water balance and, consequently, to highlight new recommendations with regard to water requirements. Water has numerous roles in the human body: it acts as a building material; as a solvent, reaction medium, and reactant; as a carrier for nutrients and waste products; in thermoregulation; and as a lubricant and shock absorber. The regulation of water balance is very precise: a loss of 1% of body water is usually compensated within 24 h. Both water intake and water losses are controlled to maintain water balance, and minute changes in plasma osmolarity are the main factors that trigger these homeostatic mechanisms. Healthy adults regulate water balance with precision, but young infants and elderly people are at greater risk of dehydration. Dehydration can affect consciousness and can induce speech incoherence, extremity weakness, hypotonia of the ocular globes, orthostatic hypotension, and tachycardia. Human water requirements are not based on a minimal intake, because such an intake might lead to a water deficit given the numerous factors that modify water needs (climate, physical activity, diet, and so on). Instead, water needs are based on experimentally derived intake levels that are expected to meet the nutritional adequacy of a healthy population. The regulation of water balance is essential for the maintenance of health and life. On average, a sedentary adult should drink 1.5 L of water per day, as water is the only liquid nutrient that is truly essential for body hydration.

Water Disinfection By-Products and the Risk of Specific Birth Defects: A Population-Based Cross-Sectional Study in Taiwan

Hwang, B.-F., Jaakkola, J., Guo, H.-R., Environmental Health,  June 2008

Recent findings suggest that exposure to disinfection by-products may increase the risk of birth defects. Previous studies have focused mainly on birth defects in general or on groups of defects. The objective of the present study was to assess the effect of water disinfection by-products on the risk of the most common specific birth defects. We conducted a population-based cross-sectional study of 396,049 Taiwanese births in 2001-2003 using information from the Birth Registry and Waterworks Registry. We compared the risk of the eleven most common specific defects across four disinfection by-product exposure categories based on levels of total trihalomethanes (TTHMs): high (TTHMs ≥20 µg/L), medium (TTHMs 10-19 µg/L), and low (TTHMs 5-9 µg/L), with 0-4 µg/L as the reference category. In addition, we conducted a meta-analysis of results from the present and previous studies focusing on the same birth defects.

Maternal Exposure to Water Disinfection By-products During Gestation and Risk of Hypospadias

Luben, T.J., Nuckols, J.R., Mosley, B.S., Hobbs, C., Reif, J.S., Occupational and Environmental Medicine, June 2008

The use of chlorine for water disinfection results in the formation of numerous contaminants called disinfection by-products (DBPs), which may be associated with birth defects, including urinary tract defects. We used Arkansas birth records (1998-2002) to conduct a population-based case-control study investigating the relationship between hypospadias and two classes of DBPs, trihalomethanes (THM) and haloacetic acids (HAA). We utilised monitoring data, spline regression and geographical information systems (GIS) to link daily concentrations of these DBPs from 263 water utilities to 320 cases and 614 controls. We calculated ORs for hypospadias and exposure to DBPs between 6 and 16 weeks’ gestation, and conducted subset analyses for exposure from ingestion, and metrics incorporating consumption, showering and bathing. We found no increase in risk when women in the highest tertiles of exposure were compared to those in the lowest for any DBP. When ingestion alone was used to assess exposure among a subset of 40 cases and 243 controls, the intermediate tertiles of exposure to total THM and the five most common HAA had ORs of 2.11 (95% CI 0.89 to 5.00) and 2.45 (95% CI 1.06 to 5.67), respectively, compared to women with no exposure. When exposure to total THM from consumption, showering and bathing exposures was evaluated, we found an OR of 1.96 (95% CI 0.65 to 6.42) for the highest tertile of exposure and weak evidence of a dose-response relationship. Our results provide little evidence for a positive relationship between DBP exposure during gestation and an increased risk of hypospadias but emphasize the necessity of including individual-level data when assessing exposure to DBPs.

Formation of N-Nitrosamines from Eleven Disinfection Treatments of Seven Different Surface Waters

Zhao, Y.-Y., et al., Environmental Science & Technology, May 2008

Formation of nine N-nitrosamines was investigated when seven different source waters representing various qualities were each treated with eleven bench-scale disinfection processes, without addition of nitrosamine precursors. These disinfection treatments included chlorine (OCl-), chloramine (NH2Cl), chlorine dioxide (ClO2), ozone (O3), ultraviolet (UV) irradiation, advanced oxidation processes (AOPs), and combinations of these. The total organic carbon (TOC) of the seven source waters ranged from 2 to 24 mg L-1. The disinfected water samples and the untreated source waters were analyzed for nine nitrosamines using a solid-phase extraction and liquid chromatography-tandem mass spectrometry method. Prior to any treatment, N-nitrosodimethylamine (NDMA) was detected at 0 to 53 ng L-1 in six of the seven source waters, and its concentrations increased in the disinfected water samples (0-118 ng L-1). N-nitrosodiethylamine (NDEA), N-nitrosomorpholine (NMor), and N-nitrosodiphenylamine (NDPhA) were also identified in some of the disinfected water samples. NDPhA (0.2-0.6 ng L-1) was formed after disinfection with OCl-, NH2Cl, O3, and MPUV/OCl-. N-nitrosomethylethylamine (NMEA) was produced with OCl- and MPUV/OCl-, and NMor formation was associated with O3. In addition, UV treatment alone degraded NDMA; however, UV/OCl- and AOP/OCl- treatments produced higher amounts of NDMA than UV and AOP alone, respectively. These results suggest that UV degradation or AOP oxidation treatment may provide a source of NDMA precursors. This study demonstrates that environmental concentrations and mixtures of unknown nitrosamine precursors in source waters can form NDMA and other nitrosamines.

N,N-Dimethylsulfamide as Precursor for N-Nitrosodimethylamine (NDMA) Formation upon Ozonation and its Fate During Drinking Water Treatment

Schmidt, C.K., Brauch, H.-J., Environmental Science & Technology, April 2008

Application and microbial degradation of the fungicide tolylfluanid gives rise to a new decomposition product named N,N-dimethylsulfamide (DMS). In Germany, DMS was found in groundwaters and surface waters at typical concentrations in the range of 100-1000 ng/L and 50-90 ng/L, respectively. Laboratory-scale and field investigations of its fate during drinking water treatment showed that DMS cannot be removed by riverbank filtration, activated carbon filtration, flocculation, or oxidation and disinfection procedures based on hydrogen peroxide, potassium permanganate, chlorine dioxide, or UV irradiation. Even nanofiltration does not provide sufficient removal. During ozonation, about 30-50% of the DMS is converted to the carcinogenic N-nitrosodimethylamine (NDMA). The NDMA formed is biodegradable and can be at least partially removed by subsequent biologically active drinking water treatment steps, including sand or activated carbon filtration. Disinfection with hypochlorous acid converts DMS to as-yet-unknown degradation products, but not to NDMA or 1,1-dimethylhydrazine (UDMH).

 

Risk of Birth Defects in Australian Communities with High Brominated Disinfection By-product Levels

Chisholm, K., et al., Environmental Health Perspective, April 2008

By international standards, water supplies in Perth, Western Australia, contain high trihalomethane (THM) levels, particularly the brominated forms. Geographic variability in these levels provided an opportunity to examine cross-city spatial relationships between THM exposure and rates of birth defects (BDs). Our goal was to examine BD rates by exposure to THMs with a highly brominated fraction in metropolitan locations in Perth, Western Australia. We collected water samples from 47 separate locations and analyzed them for total and individual THM concentrations (µg/L), including separation into brominated forms. We classified collection areas by total THM (TTHM) concentration: low (<60 µg/L), medium (>60 to <130 µg/L), and high (≥130 µg/L). We also obtained deidentified registry-based data on total births and BDs (2000-2004 inclusive) from post codes corresponding to the water sample collection sites and used binomial logistic regression to compare the frequency of BDs, aggregately and separately, across the TTHM exposure groups, adjusting for maternal age and socioeconomic status. Total THMs ranged from 36 to 190 µg/L, and a high proportion of the THMs were brominated (on average, 92%). Women living in high-TTHM areas showed an increased risk of any BD [odds ratio (OR) = 1.22; 95% confidence interval (CI), 1.01-1.48] and of the major category of any cardiovascular BD (OR = 1.62; 95% CI, 1.04-2.51), compared with women living in low-TTHM areas. Brominated forms constituted the significant fraction of THMs in all areas. Small but statistically significant increases in risks of BDs were associated with residence in areas with high THMs.

EPA – FACTOIDS: Drinking Water and Ground Water Statistics for 2007

U.S. Environmental Protection Agency, March, 2008

There are approximately 156,000 public drinking water systems in the United States. Each of these systems regularly supplies drinking water to at least 25 people or 15 service connections. Beyond their common purpose, the 156,000 systems vary widely. The following tables group water systems into categories that show their similarities and differences. For example, the first table shows that most people in the US (286 million) get their water from a community water system. There are approximately 52,000 community water systems, but just eight percent of those systems (4,048) serve 82 percent of the people. The second table shows that more water systems have groundwater than surface water as a source, but more people drink from surface water systems. Other tables break down these national numbers by state, territory, and EPA region.

This package also contains figures on the types and locations of underground injection control wells. EPA and states regulate the placement and operation of these wells to ensure that they do not threaten underground sources of drinking water. The underground injection control program statistics are based on separate reporting from the states to EPA. The drinking water system statistics on the following pages are taken from the Safe Drinking Water Information System/Federal version (SDWIS/Fed). SDWIS/Fed is the U.S. Environmental Protection Agency’s official record of public drinking water systems, their violations of state and EPA regulations, and enforcement actions taken by EPA or states as a result of those violations. EPA maintains the database using information collected and submitted by the states. Notice: Compliance statistics are based on violations reported by states to the EPA Safe Drinking Water Information System. EPA is aware of inaccuracies and underreporting of some data in this system. We are working with the states to improve the quality of the data.

Human Health Risk Assessment of Chlorinated Disinfection By-products in Drinking Water Using a Probabilistic Approach

Hamidin, N., Yu, Q.J., Connell, D.W., Water Research, March 2008

The presence of chlorinated disinfection by-products (DBPs) in drinking water is a public health issue due to their possible adverse health effects on humans. To gauge the risk of chlorinated DBPs to human health, a risk assessment of chloroform (trichloromethane, TCM), bromodichloromethane (BDCM), dibromochloromethane (DBCM), bromoform (tribromomethane, TBM), dichloroacetic acid (DCAA), and trichloroacetic acid (TCAA) in drinking water was carried out using probabilistic techniques. Literature data on exposure concentrations from more than 15 countries, adverse health effects in test animals, and human epidemiological studies were used. The risk assessment showed no overlap between the highest human exposure dose (EXP_D) and the lowest human equivalent dose (HED) from animal test data for TCM, BDCM, DBCM, TBM, DCAA, and TCAA. All HED values were approximately 10^4-10^5 times higher than the 95th percentiles of EXP_D. However, from the human epidemiology data, there was a positive overlap between the highest EXP_D and the lifetime average daily doses (LADD_H) for TCM, BDCM, DCAA, and TCAA. This suggests possible adverse health risks, such as a small increased incidence of cancers in males and developmental effects on infants. However, the epidemiological data comprised several risk factors and exposure classification levels that may affect the overall results.
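The no-overlap finding is essentially a margin-of-exposure argument: the lowest animal-derived human equivalent dose sits several orders of magnitude above the highest observed exposure dose. A sketch with purely hypothetical doses chosen only to mirror the reported 10^4-10^5 gap (the function name and the numbers are mine, not the paper's):

```python
def margin_of_exposure(hed, exp_dose):
    """Ratio of the human equivalent dose (HED) to the human exposure dose.

    Both arguments must share units (e.g., mg/kg/day); a large ratio
    indicates that observed exposures sit far below effect levels.
    """
    return hed / exp_dose

# Hypothetical values illustrating the reported 10^4-10^5 separation
moe = margin_of_exposure(1.0, 1e-4)  # ~1e4
```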

Drinking Water Disinfection By-Products and Time to Pregnancy

Maclehose, R.F., Savitz, D.A., Herring, A.H., Hartmann, K.E., Singer, P.C., Weinberg, H.S., Epidemiology, March 2008

Laboratory evidence suggests that tap water disinfection by-products (DBPs) could have an effect very early in pregnancy, typically before clinical detectability. Undetected early losses would be expected to increase the reported number of cycles to clinical pregnancy. We investigated the association between specific DBPs (trihalomethanes, haloacetic acids, brominated trihalomethanes, brominated haloacetic acids, total organic halides, and bromodichloromethane) and time to pregnancy among women enrolled in a study of drinking water and reproductive outcomes. We quantified exposure to DBPs through concentrations in tap water, quantity ingested through drinking, quantity inhaled or absorbed while showering or bathing, and total integrated exposure. The effect of DBPs on time to pregnancy was estimated using a discrete-time hazard model. Overall, we found no evidence of an increased time to pregnancy among women exposed to higher levels of DBPs. A modestly decreased time to pregnancy (i.e., increased fecundability) was seen among those exposed to the highest level of ingested DBPs, but not for tap water concentration, the amount absorbed while showering or bathing, or the integrated exposure. Our findings extend those of a recently published study suggesting a lack of association between DBPs and pregnancy loss.

Risk of waterborne illness via drinking water in the United States

Reynolds, K.A., Mena, K.D., Gerba, C.P., Reviews of Environmental Contamination & Toxicology, January 2008

Outbreaks of disease attributable to drinking water are not common in the U.S., but they do still occur and can lead to serious acute, chronic, or sometimes fatal health consequences, particularly in sensitive and immunocompromised populations. From 1971 to 2002, there were 764 documented waterborne outbreaks associated with drinking water, resulting in 575,457 cases of illness and 79 deaths (Blackburn et al. 2004; Calderon 2004); however, the true impact of disease is estimated to be much higher. If properly applied, current protocols in municipal water treatment are effective at eliminating pathogens from water. However, inadequate, interrupted, or intermittent treatment has repeatedly been associated with waterborne disease outbreaks. Contamination is not evenly distributed but rather is affected by the number of pathogens in the source water, the age of the distribution system, the quality of the delivered water, and climatic events that can tax treatment plant operations. Private water supplies are not regulated by the USEPA and are generally not treated or monitored, although very few of the municipal systems involved in documented outbreaks exceeded the USEPA's total coliform standard in the preceding 12 months (Craun et al. 2002). We provide here estimates of waterborne infection and illness risks in the U.S. based on the total number of water systems, source water type, and total populations exposed. Furthermore, we evaluated all illnesses associated with microbial infection, not just gastroenteritis. Our results indicate that 10.7 million infections/yr and 5.4 million illnesses/yr occur in populations served by community groundwater systems; 2.2 million infections/yr and 1.1 million illnesses/yr in noncommunity groundwater systems; and 26.0 million infections/yr and 13.0 million illnesses/yr in municipal surface water systems. The total number of waterborne illnesses in the U.S. is therefore estimated at 19.5 million/yr.
Others have recently estimated waterborne illness rates of 12 million cases/yr (Colford et al. 2006) and 16 million cases/yr (Messner et al. 2006); our estimate is higher in part because it considers all health outcomes associated with exposure to pathogens in drinking water rather than only gastrointestinal illness. Drinking water outbreaks exemplify known breaches in municipal water treatment and distribution processes and the failure of regulatory requirements to ensure water that is free of human pathogens. Water purification technologies applied at the point of use (POU) can be effective for limiting the effects of source water contamination, treatment plant inadequacies, minor intrusions in the distribution system, or deliberate posttreatment acts (i.e., bioterrorism). Epidemiological studies conflict on the benefits of POU water treatment. One prospective intervention study found that consumers of reverse-osmosis (POU) filtered water had 20%-35% fewer gastrointestinal illnesses than those consuming regular tap water, with an excess of 14% of illness attributed to contaminants introduced in the distribution system (Payment 1991, 1997). Two other studies using randomized, blinded, controlled trials determined that risks were equal between groups supplied with POU-treated water and untreated tap water (Hellard et al. 2001; Colford et al. 2003). For immunocompromised populations, POU water treatment devices are recommended by the CDC and USEPA as one option for reducing risks of Cryptosporidium and other infectious agents transmitted by drinking water. Other populations, including those experiencing "normal" life stages such as pregnancy, or the very young or very old, might also benefit from additional water treatment options beyond the current multibarrier approach of municipal water treatment.
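The system-type figures in the abstract sum to the quoted totals; a quick arithmetic check (the dictionary layout is mine):

```python
# Reynolds et al. annual U.S. estimates, in millions: (infections, illnesses)
estimates = {
    "community groundwater": (10.7, 5.4),
    "noncommunity groundwater": (2.2, 1.1),
    "municipal surface water": (26.0, 13.0),
}

total_infections = sum(inf for inf, _ in estimates.values())  # 38.9 M/yr
total_illnesses = sum(ill for _, ill in estimates.values())   # 19.5 M/yr
```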

Massive Microbiological Groundwater Contamination Associated with a Waterborne Outbreak in Lake Erie, South Bass Island, Ohio

Fong, T.-T., et al., Environmental Health Perspectives, June 2007

A groundwater-associated outbreak affected approximately 1,450 residents and visitors of South Bass Island, Ohio, between July and September 2004. To examine the microbiological quality of groundwater on South Bass Island, we sampled 16 wells that provide potable water to public water systems from 15-21 September 2004. We tested the wells for fecal indicators, enteric viruses and bacteria, and protozoa (Cryptosporidium and Giardia), and we examined the hydrodynamics of Lake Erie to explore possible surface water-groundwater interactions. All wells were positive for both total coliform and Escherichia coli. Seven wells tested positive for enterococci and Arcobacter (an emerging bacterial pathogen), and F+-specific coliphage was present in four wells. Three wells were positive for all three bacterial indicators, coliphages, and Arcobacter; adenovirus DNA was recovered from two of these wells. We found a cluster of the most contaminated wells at the southeast side of the island. We conclude that the massive groundwater contamination on the island was likely caused by transport of microbiological contaminants from wastewater treatment facilities and septic tanks to the lake and the subsurface after extreme precipitation events in May-July 2004. These events likely raised the water table and saturated the subsurface and, together with very strong Lake Erie currents on 24 July, forced a surge in water levels and rapid surface water-groundwater interchange throughout the island. Landsat images showed a massive influx of organic material and turbidity surrounding the island before the peak of the outbreak. These combinations of factors and information can be used to examine vulnerabilities in other coastal systems. Both wastewater and drinking water issues are now being addressed by the Ohio Environmental Protection Agency and the Ohio Department of Health.

Analysis of Compliance and Characterization of Violations of the Total Coliform Rule

U.S. Environmental Protection Agency, April 2007

Total coliforms have long been used in drinking water regulations as an indicator of the adequacy of water treatment and the integrity of the distribution system. Total coliforms are a group of closely related bacteria that are generally harmless. In drinking water systems, total coliforms react to treatment in a manner similar to most bacterial pathogens and many viral pathogens. Thus, the presence of total coliforms in the distribution system can indicate that the system is also vulnerable to the presence of pathogens (EPA, June 2001, page 7). Total coliforms are the indicators used in the existing Total Coliform Rule (TCR). EPA is undertaking "a rulemaking process to initiate possible revisions to the TCR. As part of this process, EPA believes it may be appropriate to include this rulemaking in a wider effort to review and address broader issues associated with drinking water distribution systems" (see Federal Register 68 FR 19030 and 68 FR 42907). Since the promulgation of the TCR, EPA has received stakeholder feedback suggesting modifications to the TCR to reduce the implementation burden. The purpose of this paper is to provide information on the number and frequency of violations of the TCR and to further characterize the frequency with which different types and sizes of systems incur violations. Although EPA explores some statistical testing in this paper, the paper concentrates on presenting the data as they are in SDWIS/FED. Information on these frequencies will be useful in supporting several EPA initiatives, particularly the effort to review and possibly revise the TCR. This paper has been undertaken as part of the review of the TCR.

 

Water Quality Control in Premise Plumbing

Reynolds, K.A., Water Conditioning and Purification, February 2007

The quality of water at the end use is impacted by numerous and varied factors including source water type and quality, age of the distribution system, climatic events and even consumer use patterns. Therefore, providing high-quality drinking water at the tap requires a multi-barrier approach aimed at source water protection, source water treatment and reliable distribution. Each of these steps is monitored and controlled by municipal water treatment standards and guidelines; however, what happens to the water quality beyond the service connection at individual sites is not as well understood. New reports of water quality deterioration in the plumbing of residential or commercial buildings, known as premise plumbing, pose a question: Just what is present in our pipes?

 

Drowning in Disinfection Byproducts? Assessing Swimming Pool Water

DeMarini, D.M., et al., Environmental Science & Technology, January 2007

Disinfection is mandatory for swimming pools: public pools are usually disinfected with gaseous chlorine or sodium hypochlorite and cartridge filters; home pools typically use stabilized chlorine. These methods produce a variety of disinfection byproducts (DBPs), such as trihalomethanes (THMs), regulated carcinogenic DBPs in drinking water that have been detected in the blood and breath of swimmers and of nonswimmers at indoor pools. Also produced are haloacetic acids (HAAs) and haloketones, which irritate the eyes, skin, and mucous membranes; trichloramine, which is linked with swimming-pool-associated asthma; and halogenated derivatives of UV sunscreens, some of which show endocrine effects. Precursors of DBPs include human body substances, chemicals used in cosmetics and sunscreens, and natural organic matter. Analytical research has also focused on the identification of an additional portion of unknown DBPs using gas chromatography (GC)/mass spectrometry (MS) and liquid chromatography (LC)/MS/MS with derivatization. Child swimmers have an increased risk of developing asthma and infections of the respiratory tract and ear. A 1.6-2.0-fold increased risk of bladder cancer has been associated with swimming or showering/bathing in chlorinated water. Bladder cancer risk from THM exposure (all routes combined) was greatest among those with the GSTT1-1 gene, suggesting a mechanism involving distribution of THMs to the bladder by dermal/inhalation exposure and activation there by GSTT1-1 to mutagens. DBPs may be reduced by engineering and behavioral means, such as applying new oxidation and filtration methods, reducing bromide and iodide in the source water, increasing air circulation in indoor pools, and ensuring the cleanliness of swimmers. The positive health effects gained by swimming can be increased by reducing the potential adverse health risks.

An approach for developing a national estimate of waterborne disease due to drinking water and a national estimate model application

Messner, M., et al., Journal of Water and Health, 4(Suppl. 2), July 2006

In this paper, the US Environmental Protection Agency (EPA) presents an approach and a national estimate of drinking water related endemic acute gastrointestinal illness (AGI) that uses information from epidemiologic studies. There have been a limited number of epidemiologic studies that have measured waterborne disease occurrence in the United States. For this analysis, we assume that a certain unknown incidence of AGI in each public drinking water system is due to drinking water and that a statistical distribution of the different incidence rates for the population served by each system can be estimated to inform a mean national estimate of AGI illness due to drinking water. Data from public water systems suggest that the incidence rate of AGI due to drinking water may vary by several orders of magnitude. In addition, data from epidemiologic studies show AGI incidence due to drinking water ranging from essentially none (or less than the study detection level) to a rate of 0.26 cases per person-year. Considering these two perspectives collectively, and associated uncertainties, EPA has developed an analytical approach and model for generating a national estimate of annual AGI illness due to drinking water. EPA developed a national estimate of waterborne disease to address, in part, the 1996 Safe Drinking Water Act Amendments. The national estimate uses best available science, but also recognizes gaps in the data to support some of the model assumptions and uncertainties in the estimate. Based on the model presented, EPA estimates a mean incidence of AGI attributable to drinking water of 0.06 cases per person-year (with a 95% credible interval of 0.02–0.12). The mean estimate represents approximately 8.5% of cases of AGI illness due to all causes among the population served by community water systems. The estimated incidence translates to 16.4 million cases/year among the same population.
The estimate illustrates the potential usefulness and challenges of the approach, and provides a focus for discussions of data needs and future study designs. Areas of major uncertainty that currently limit the usefulness of the approach are discussed in the context of the estimate analysis.
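The abstract's headline numbers are internally consistent, as a quick cross-check shows. The served-population figure below is back-derived from the abstract's own numbers, not stated in it:

```python
# Cross-check the EPA national AGI estimate using the abstract's figures.
mean_rate = 0.06          # AGI cases per person-year attributable to drinking water
total_cases = 16.4e6      # estimated cases/year among people served by community systems
share_of_all_agi = 0.085  # drinking water's share of all-cause AGI (8.5%)

# Population served by community water systems implied by the estimate
implied_population = total_cases / mean_rate           # ~273 million people

# All-cause AGI incidence implied by the 8.5% attribution
implied_all_cause_rate = mean_rate / share_of_all_agi  # ~0.71 cases per person-year

print(f"Implied population served: {implied_population / 1e6:.0f} million")
print(f"Implied all-cause AGI rate: {implied_all_cause_rate:.2f} cases/person-year")
```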

Tap Water Linked to Increase in Bladder Cancer

Reynolds, K.A., Water Conditioning & Purification, July 2006

As water treatment professionals, maybe you’ve been alerted to news stories suggesting a connection between tap water consumption and bladder cancer, but are these headlines true or just media hype? Although the most recently reported association of tap water consumption with bladder cancer is indeed based on numerous epidemiological studies with an international scope, all scientific research must be carefully evaluated; not just in terms of the data found, but also for the information possibly missed. The study that has everyone talking again about tap water consumption and its relationship to bladder cancer was published in the International Journal of Cancer (April 2006). Looking at data from six epidemiological studies conducted in five countries (Canada, Finland, France, Italy, and the United States, which contributed two studies), a significant association was found between tap water consumption and bladder cancer among men. The risk increased with consumption of greater volumes, suggesting that carcinogenic chemicals in tap water were responsible for the increased risk. While the information presented appears to be sound, it is important to understand the limitations of the study approach so that the data can be appropriately analyzed with respect to public health significance.

Despite a gender bias and inconsistent reports in the historical literature, this study seems to have sturdy legs to stand on, or at least to justify continued research. As mentioned earlier, epidemiology is not a very sensitive science and is complicated by unknown confounders. In addition, this study provides no evidence as to what specific factors related to tap water are causing an increase in cancer, whereas other drinking water sources (e.g., bottled water) show no association. Water is clearly a heterogeneous mix of contaminants, with vast geographical and temporal fluctuations. Little is known about the combined effects of multiple contaminants found in drinking water, thus a study of single contaminants and their association with cancer risks would not provide a complete picture of overall exposures.

Volatile Organic Compounds in the Nation’s Drinking-Water Supply Wells – What Findings May Mean to Human Health

U.S. Geological Survey, June 2006

When volatile organic compounds (VOCs) are detected in samples from drinking-water supply wells, it is important to understand what these results may mean to human health. As a first step toward understanding VOC occurrence in the context of human health, a screening-level assessment was conducted by comparing VOC concentrations to human-health benchmarks. One sample from each of 3,497 domestic and public wells was analyzed for 55 VOCs; samples were collected prior to treatment or blending. At least one VOC was detected in 623 well samples (about 18 percent of all well samples) at a threshold of 0.2 part per billion. Eight of the 55 VOCs had concentrations greater than human-health benchmarks in 45 well samples (about 1 percent of all well samples); these concentrations may be of potential human-health concern if the water were to be ingested without treatment for many years. VOC concentrations were less than human-health benchmarks in most well samples with VOC detections, indicating that adverse effects are unlikely to occur, even if water with such concentrations were to be ingested over a lifetime. Seventeen VOCs may warrant further investigation because their concentrations were greater than, or approached, human-health benchmarks.
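The rounded percentages in this survey follow directly from the reported counts; a short check (counts taken from the abstract):

```python
# USGS VOC survey: verify the "about 18 percent" and "about 1 percent" figures.
wells = 3497           # wells sampled (one sample each, prior to treatment or blending)
with_detection = 623   # samples with at least one VOC at the 0.2 ppb threshold
above_benchmark = 45   # samples with a VOC concentration above a human-health benchmark

print(f"Detection frequency: {with_detection / wells:.1%}")   # ~17.8%, "about 18 percent"
print(f"Above a benchmark:   {above_benchmark / wells:.1%}")  # ~1.3%, "about 1 percent"
```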

Analysis of Bromate and Bromide in Blood

Quinones, O., Snyder, S.A., Cotruvo, J.A., Fisher, J.W., Toxicology, April 2006

Bromate is a regulated disinfection byproduct primarily associated with the ozonation of water containing bromide, but also is a byproduct of hypochlorite used to disinfect water. To study the pharmacokinetics of bromate, it is necessary to develop a robust and sensitive analytical method for the identification and quantitation of bromate in blood. A critical issue is the extent to which bromate is degraded presystemically and in blood at low (environmentally relevant) doses of ingested bromate as it is delivered to target tissue. A simple isolation procedure was developed using blood plasma spiked with various levels of bromate and bromide. Blood proteins and lipids were precipitated from plasma using acetonitrile. The resulting extracts were analyzed by ion chromatography with inductively coupled plasma mass spectrometry (IC-ICP/MS), with a method reporting limit of 5 ng/mL plasma for both bromate and bromide. Plasma samples purchased commercially were spiked with bromate and stored up to 7 days. Over the 7-day storage period, bromate decay remained under 20% for two spike doses. Decay studies in plasma samples from spiked blood drawn from live rats showed significant bromate decay within short periods of time preceding sample freezing, although samples which were spiked, centrifuged and frozen immediately after drawing yielded excellent analytical recoveries.

Research Strategy for Developing Key Information on Bromate’s Mode of Action

Bull, R.J. and Cotruvo, J.A., Toxicology, April 2006

Bromate is produced when ozone is used to treat waters that contain trace amounts of bromide ion. It is also a contaminant of hypochlorite solutions produced by electrolysis of salt that contains bromide. Both ozone and hypochlorite are extensively used to disinfect drinking water, a process that is credited with reducing the incidence of waterborne infectious diseases around the world. In studies on experimental animals, bromate has been consistently demonstrated to induce cancer, although there is evidence of substantial species differences in sensitivity (rat > mouse > hamster). There are no data to indicate bromate is carcinogenic in humans. A question critical to the continued use of ozone as a disinfectant for drinking water in bromide-containing waters is whether current predictions of carcinogenic risk, based on carcinogenic responses in male rats treated with bromate, remain accurate at the much lower exposure levels experienced by humans. Thiol-dependent oxidative damage to guanine in DNA is a plausible mode of action for bromate-induced cancer. However, other mechanisms may contribute to the response, including the accumulation of α2u-globulin in the kidney of the male rat. To provide direction to institutions that have an interest in clarifying the toxicological risks that bromate in drinking water might pose, a workshop funded by the Awwa Research Foundation was convened to lay out a research strategy that, if implemented, could clarify this important public health issue. The technical issues that underlie the deliberations of the workshop are provided in a series of technical papers. The present manuscript summarizes the conclusions of the workgroup with respect to the type and timing of research that should be conducted. The research approach is outlined in four distinct phases that lay out alternative directions as the research plan is implemented.
Phase I is designed to quantify pre-systemic degradation, absorption, distribution, and metabolism of bromate and to associate these with key events for the induction of cancer and develop an initial pharmacokinetic (PK) model based on preliminary studies. Phase II will be implemented if it appears that there is a linear relationship between external dose and key event responses and is designed to gather carcinogenesis data in female rats in the absence of α2u-globulin-induced nephropathy, which the workgroup concluded was a probable contributor to the responses observed in the male rats for which detailed dose–response data were collected. If the key events and external dosimetry are found not to be linear in Phase I, Phase III is initiated with a screening study of the auditory toxicity of bromate to determine if it is likely to be exacerbated by chronic exposure. If this occurs, auditory toxicity will be further evaluated in Phase IV. If auditory toxicity is determined unlikely to occur, an alternative chronic study in female rats to the one identified in Phase II will be implemented to include exposure in utero. This was recommended to address the possibility that the fetus may be more susceptible. One of three options is to be implemented in Phase IV, depending upon whether preliminary data indicate that chronic auditory toxicity, reproductive and/or developmental toxicities, or a combination of these outcomes is necessary to characterize the toxicology of low dose exposures to bromate. Each phase of the research will be accompanied by further development of pharmacokinetic models to guide collection of appropriate data to meet the needs of the more sophisticated studies. It is suggested that a Bayesian approach be utilized to develop a final risk model based upon measurement of prior observations from the Phase I studies and the set of posterior observations that would be obtained from whichever chronic study is conducted.

  • Bromate;
  • Research to improve risk assessment;
  • Drinking water

Experimental Results from the Reaction of Bromate Ion with Synthetic and Real Gastric Juices

Keith, J.D., Pacey, G.E., Cotruvo, J.A., Gordon,G., Toxicology, February 2006

This study was designed to identify and quantify the effects of reducing agents on the rate of bromate ion reduction in real and synthetic gastric juice. This could be the first element in the sequence of a pharmacokinetic description of the fate of bromate ion entering the organism, being metabolized, and subsequently being tracked through the system to the target cell or eliminated. Synthetic gastric juice containing H+ and Cl− did exhibit reduced bromate ion levels, but at a rate that was too slow for a significant amount of bromate to be reduced under typical stomach retention time conditions. The reaction orders for Cl− and H+ were 1.50 and 2.0, respectively. Addition of the reducing agents hydrogen sulfide (which was shown to be present and quantified in real gastric juice), glutathione, and/or cysteine increased the rate of bromate ion loss. All of the reactions showed significant pH effects. Half-lives as short as 2 min were measured for bromate ion reduction in 0.17 M H+ and Cl− and 10−4 M H2S. Therefore, the lifetime of bromate ion in solutions containing typical gastric juice concentrations of H+, Cl−, and H2S is 20–30 min. This rate should result in as much as a 99% reduction of bromate ion during its residence in the stomach. Bromate ion reduction in real gastric juice occurred at a rapid rate. A comparison of real and synthetic gastric juice containing H+, Cl−, cysteine, glutathione, and hydrogen sulfide showed that the component most responsible for the considerable decrease of the concentration of bromate ion in the stomach is hydrogen sulfide.

  • Bromate;
  • Gastric juice;
  • Ion chromatography;
  • Hydrogen sulfide
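The abstract's conclusion can be illustrated with simple first-order decay (an illustrative assumption; the measured kinetics also depend on pH and reductant concentrations, and the half-life values here are taken from the abstract's extremes). A half-life of a few minutes is enough to remove essentially all bromate within a typical stomach residence time:

```python
def remaining_fraction(t_min: float, half_life_min: float) -> float:
    """Fraction of bromate remaining after t_min under first-order decay."""
    return 0.5 ** (t_min / half_life_min)

# At the fastest measured condition (half-life ~2 min), a 30-min stomach
# residence leaves a vanishing fraction of the ingested bromate:
print(f"{remaining_fraction(30, 2):.1e}")  # ~3.1e-05, i.e. >99.99% reduced

# Even a 5-min half-life gives ~98.4% reduction over 30 min:
print(f"{1 - remaining_fraction(30, 5):.1%}")
```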

Bottled Water Production in the United States: How Much Ground Water is Actually Being Used?

Keith N. Eshelman, Ph.D., Associate Professor, University of Maryland Center for Environmental Science, May 2005

A comprehensive, quantitative survey of bottled water producers in the U.S. presenting data collected on bottled water production, specifically production from ground water, the primary source of bottled water. Relative to other uses of ground water, bottled water production was found to be a de minimis user of ground water.

Analysis of the February 1999 Natural Resources Defense Council Report on Bottled Water

Drinking Water Research Foundation, 1999

In February 1999, the Natural Resources Defense Council (NRDC) issued a report entitled “Bottled Water: Pure Drink or Pure Hype?” in which numerous unfounded allegations against bottled water were raised. This document provides an extensive analysis and rebuttal of NRDC’s conclusions, highlighting the report’s errors and unsupported claims.