Stain repellent chemical linked to thyroid disease in adults

January 21, 2010

A study by the University of Exeter and the Peninsula Medical School for the first time links thyroid disease with human exposure to perfluorooctanoic acid (PFOA). PFOA is a persistent organic chemical used in industrial and consumer goods including nonstick cookware and stain- and water-resistant coatings for carpets and fabrics.

Published in the journal Environmental Health Perspectives, the study revealed that people with higher concentrations of PFOA in their blood have higher rates of thyroid disease. The researchers analysed samples from the US Centers for Disease Control and Prevention’s nationally representative National Health and Nutrition Examination Survey (NHANES).

Tamara Galloway, a professor of Ecotoxicology at the University of Exeter and the study’s senior author, says: “Our results highlight a real need for further research into the human health effects of low-level exposures to environmental chemicals like PFOA that are ubiquitous in the environment and in people’s homes. We need to know what they are doing.”

“There have long been suspicions that PFOA concentrations might be linked to changes in thyroid hormone levels,” adds study author, David Melzer, a professor of Epidemiology and Public Health at the Peninsula Medical School. “Our analysis shows that in the ‘ordinary’ adult population there is a solid statistical link between higher concentrations of PFOA in blood and thyroid disease.”

PFOA is a very stable man-made chemical that excels at repelling heat, water, grease, and stains. It is used during the manufacture of common household and industrial items including nonstick pots and pans, flame-resistant and waterproof clothing, wire coatings, and chemical-resistant tubing. PFOA can also be formed by the breakdown of certain other highly fluorinated chemicals used in oil- and grease-resistant coatings on fast-food containers and wrappers and in stain-resistant carpets, fabrics, and paints.

The study included 3966 adults aged 20 and older whose blood serum was sampled between 1999 and 2006 for PFOA and other perfluoroalkyl acid (PFAA) compounds, including perfluorooctane sulphonate (PFOS). The researchers found that individuals in the highest 25% of PFOA concentrations (above 5.7 ng/ml) were more than twice as likely to report current thyroid disease as individuals in the lowest 50% (below 4.0 ng/ml). The most specific analysis included 163 women and 46 men who reported having current thyroid disease and who were taking thyroid medication at the time the blood samples were taken. The models used in the analysis were adjusted for potential confounding factors, such as age, gender, ethnicity, smoking, and body mass index.
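The quartile-based comparison described above is straightforward to reproduce. The sketch below uses invented serum concentrations, not the NHANES data, purely to show the mechanics of splitting a sample at the 75th percentile and the median:

```python
from statistics import quantiles

# Hypothetical serum PFOA concentrations in ng/ml -- illustrative values
# only, not the NHANES measurements analysed in the study.
pfoa = [1.2, 2.5, 3.1, 3.8, 3.9, 4.2, 4.9, 5.5, 5.9, 6.4, 7.0, 9.8]

# Cut points at the 25th, 50th and 75th percentiles.
q1, median, q3 = quantiles(pfoa, n=4)

highest_quartile = [x for x in pfoa if x > q3]   # top 25% of concentrations
lowest_half = [x for x in pfoa if x <= median]   # bottom 50%

print(f"75th percentile: {q3:.2f} ng/ml ({len(highest_quartile)} people above it)")
print(f"median: {median:.2f} ng/ml ({len(lowest_half)} people at or below it)")
```

In the study itself, thyroid-disease rates in such groups were then compared in regression models adjusted for the confounders listed above.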

Previous animal studies carried out by other scientists have shown that the compounds can affect the function of the mammalian thyroid hormone system. This is essential for maintaining heart rate, regulating body temperature and supporting many other body functions, including metabolism, reproduction, digestion and mental health.

The findings are important because research has shown that PFAAs are found in water, air and soil throughout the world, even in remote polar regions. PFOA and PFOS have also been detected in the blood of people from across the globe, as well as in wildlife including birds, fish, and polar bears.

The main source of human exposure to PFOA and PFOS remains uncertain but is believed to be through diet. However, people may also be exposed through the PFAAs used in consumer goods such as textiles, footwear, furniture, and carpets, which can contaminate indoor air and dust.

Although more research is needed to understand the mechanism by which PFOA and PFOS may affect human thyroid functioning, it is plausible that the compounds could disrupt binding of thyroid hormones in the blood or alter their metabolism in the liver. However, this new evidence does not rule out the possibility that having thyroid disease changes the way the body handles PFOA and/or PFOS. The presence of the compounds might also prove to be simply a marker for some other factor associated with thyroid disease.

Thyroid diseases, particularly hypothyroidism, are much more common in women than men. However, in terms of the link between PFOA and thyroid disease, the researchers found no evidence of a statistically different effect between the sexes. The researchers also found a link between thyroid disease and higher concentrations of PFOS in men, but not in women.

Although previous studies of people living in communities near plants where PFOA and PFOS are manufactured did not find an association between exposure to these chemicals and thyroid hormone function, the largest study of such exposed communities – the ‘C8’ study of communities near DuPont’s Washington Works plant, including Marietta, Ohio, and Parkersburg, West Virginia – is currently underway.

In addition to Galloway and Melzer, the paper’s authors include Neil Rice of the Peninsula Medical School’s Epidemiology and Public Health Group; Michael H Depledge of the Peninsula Medical School’s European Centre for the Environment and Human Health; and William E Henley of the School of Mathematics and Statistics at the University of Plymouth. They used the US NHANES dataset because it is the only large-scale data available on PFOA and PFOS in a ‘general’ population anywhere in the world.

Even a small dietary reduction in salt could mean fewer heart attacks, strokes and deaths

January 21, 2010

Reducing salt in the American diet by as little as one-half teaspoon (or three grams) per day could prevent nearly 100,000 heart attacks and 92,000 deaths each year, according to a new study. Such benefits are on par with those from reductions in smoking and could save the United States about US$24 billion in healthcare costs, the researchers add.

A team from the University of California, San Francisco, Stanford University Medical Center and Columbia University Medical Center conducted the study. The findings appeared online January 20 in the New England Journal of Medicine and will also be reported in the journal’s February 18 print issue.

The team’s results were derived from the Coronary Heart Disease Policy Model, a computer simulation of heart disease among U.S. adults that has been used by researchers to project benefits from public health interventions.

“A very modest decrease in the amount of salt, hardly detectable in the taste of food, can have dramatic health benefits for the U.S.,” said Kirsten Bibbins-Domingo, PhD, MD, lead author of the study, UCSF associate professor of medicine and epidemiology and the co-director of the UCSF Center for Vulnerable Populations at San Francisco General Hospital.

“It was a surprise to see the magnitude of the impact on the population, given the small reductions in salt that we were modeling,” Bibbins-Domingo added.

The CHD Policy Model projected that reducing dietary salt by three grams per day (about 1200 mg of sodium) would result in 11 percent fewer new cases of heart disease, 13 percent fewer heart attacks, 8 percent fewer strokes, and 4 percent fewer deaths. For African Americans, who researchers believe are more likely to have high blood pressure and may be more sensitive to salt, this degree of salt reduction could reduce new cases of heart disease by 16 percent and heart attacks by 19 percent.

“Reducing dietary salt is one of those rare interventions that has a huge health benefit and actually saves large amounts of money. At a time when so much public debate has focused on the costs of health care for the sick, here is a simple remedy, already proven to be feasible in other countries,” said Lee Goldman, MD, MPH, senior author, executive vice president for health and biomedical sciences and dean of the faculties of health sciences and medicine at Columbia University.

The American Heart Association reports that salt consumption among Americans has risen by 50 percent and blood pressure has risen by nearly the same amount since the 1970s – despite evidence linking salt intake to high blood pressure and heart disease.

“In addition to its independent benefits on blood pressure, reducing salt intake can enhance the effects of most anti-hypertensive (blood pressure lowering) agents and reduce complications associated with diabetes, obesity and kidney disease,” said Glenn M. Chertow, MD, study co-author, professor of medicine and chief of the Division of Nephrology at Stanford University.

According to federal government data, the average American man consumes more than 10 grams of salt (4000 mg of sodium) daily. Most health organizations recommend far lower targets – no more than 5.8 grams of salt per day (2300 mg of sodium), and less than 3.8 grams for those over 40. Each gram of salt contains about 0.4 grams of sodium.
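The salt-to-sodium conversion quoted above is a fixed mass ratio, which a few lines of Python make explicit (the function name is ours, for illustration):

```python
# Table salt (NaCl) is roughly 40% sodium by mass: 1 g of salt ~ 400 mg sodium.
SODIUM_MG_PER_G_SALT = 400

def salt_to_sodium_mg(salt_g):
    """Convert grams of table salt to milligrams of sodium."""
    return salt_g * SODIUM_MG_PER_G_SALT

print(salt_to_sodium_mg(10))   # average US male intake: 4000 mg sodium
print(salt_to_sodium_mg(5.8))  # recommended ceiling: ~2300 mg sodium
print(salt_to_sodium_mg(3))    # the modelled 3 g cut: 1200 mg sodium
```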

“It’s clear that we need to lower salt intake, but individuals find it hard to make substantial cuts because most salt comes from processed foods, not from the salt shaker,” Bibbins-Domingo said. “Our study suggests that the food industry and those who regulate it could contribute substantially to the health of the nation by achieving even small reductions in the amount of salt in these processed foods.”

The New York City Department of Public Health and other state and local authorities nationwide have identified salt as an important target for regulation. Mayor Michael Bloomberg has already made sweeping changes to the city’s health regulations, including cutting trans fats in eating places and requiring fast-food restaurant menus to list calories. Now the city is seeking to join a national movement to cut salt intake by 25 percent, which Bloomberg referenced in today’s State of the City address.

“Our projections suggest that these regulatory efforts could both improve health and save money because of the healthcare costs avoided,” said Bibbins-Domingo. “For every dollar spent on regulating salt, anywhere from seven to 76 healthcare dollars could be saved.”

Additional authors include Pamela G. Coxson, PhD; James M. Lightwood, PhD, and Mark J. Pletcher, MD, all of UCSF, and Andrew Moran, MD, Columbia. The study was funded in part by a grant from the American Heart Association Western States Affiliate and a pilot grant from the UCSF Clinical and Translational Sciences Institute.

How to prevent toxic exposures in the environment

January 21, 2010

The University of California, San Francisco (UCSF) has launched online and print resources designed to help consumers make smarter decisions about substances that can harm general and reproductive health. A new brochure and web page include specific tips on reducing exposure to metals and synthetic chemicals in everyday life – at home, at work, and in the community – and provide links to other sources with more detailed information.

A brochure titled Toxic Matters, created by an alliance of partners led by the UCSF Program on Reproductive Health and the Environment (PRHE), offers practical recommendations for women, men and children that apply to everyone regardless of whether a person is pregnant or planning to have children in the future. It also covers how to become a conscientious shopper when purchasing household products and how to support public policies to stop chemical pollution before it happens.

The brochure and links to further resources are available on the PRHE website: http://www.prhe.ucsf.edu/prhe/toxicmatters.html. A downloadable pdf of the brochure also is available: http://www.prhe.ucsf.edu/prhe/pdfs/ToxicMatters.pdf.

The alliance, called FASTEP (From Advancing Science to Ensuring Prevention), comprises academic, government and non-governmental partners in reproductive, environmental, occupational and pediatric health, and in toxicology.

“Our goal is to engage the clinical community and consumers through education and access to resources to protect this and the next generation from environmental exposures,” said Tracey Woodruff, PhD, MPH, director of PRHE, which is part of the UCSF National Center of Excellence in Women’s Health.

“We’ve identified key areas where exposures are constant and avoidable, and a means for individuals to contact government representatives to prevent impacts of environmental contaminants on future generations. Although certain groups are most vulnerable, toxic substances in the environment affect every person, every day and are the responsibility of all of us,” she said.

The brochure consists of five pages of straightforward information to help consumers and physicians:

  • Prevent exposure at home
  • Prevent exposure in the workplace
  • Prevent exposure in the community
  • Become a smart consumer
  • Make the government work for you

“Reducing exposure to toxins is an important but complicated issue. We felt it was important to capture the expertise of multiple disciplines in one easy-to-use format so that consumers and healthcare professionals can better understand the impact of toxic substances on reproductive health, and what can be done to prevent those exposures,” said Nancy Milliken, MD, vice dean, UCSF School of Medicine and director, UCSF National Center of Excellence in Women’s Health.

“Clinicians, patients and the public need to know that exposure to toxic substances in the womb or during infancy, childhood, puberty and adulthood can lead to disease early or later in life and across generations,” said Linda C. Giudice, MD, PhD, professor and chair of the UCSF Department of Obstetrics, Gynecology and Reproductive Sciences and founder of PRHE.

“They also need tools to help them take action in clinical and policy arenas to prevent hazardous environmental exposures,” she said.

A wide range of peer-reviewed research conducted by scientists at UCSF and elsewhere, nationally and internationally, has increasingly documented that environmental exposures to toxic chemicals encountered in everyday places affect reproductive health in a number of ways, in both women and men. Studies have shown that developing fetuses and children are especially vulnerable to environmental contaminants.

“While many questions remain, the evidence is sufficiently strong that leading scientists and clinicians have urged timely action to prevent harm,” said Giudice.

The latest science on how exposure to chemicals may impair reproductive health is summarized in another publication, Shaping Our Legacy, which presents findings and recommendations from the UCSF Collaborative on Health and the Environment’s 2007 Summit on Environmental Challenges to Reproductive Health and Fertility. The publication also outlines actions to create environments that are healthier for fertility and reproduction.

Genetic analysis gives hope that extinct tortoise species may live again

January 19, 2010

Thanks to genetic data gleaned from bones found in several museum collections, an international team of researchers led by scientists from Yale believes it may be possible to resurrect a tortoise species hunted to extinction by whalers visiting the Galapagos Islands during the early 19th century, before Charles Darwin made his famous visit.

A genetic analysis comparing 156 tortoises living in captivity with DNA taken from museum specimens of the now-extinct Chelonoidis elephantopus revealed that nine are descendants of the vanished species, which once made its home on Floreana Island in the Galapagos. Over a few generations, a selective breeding program among these tortoises should be able to revive C. elephantopus, said Adalgisa Caccone, senior research scientist in the department of ecology and evolutionary biology at Yale and senior author of the paper published this week in the online journal PLoS ONE.

“Theoretically, we can rescue a species that has gone extinct,” Caccone said. “Our lab calls it the Lazarus project.”

In 2007, Caccone and others discovered genetic relatives of “Lonesome George”, the last known survivor of another species of Galapagos tortoise and an icon of the conservation movement. The team believes that the similar genetic hybrids living in captivity on the Galapagos were descendants of tortoises taken by whalers as future meals and then thrown overboard to make room for the more lucrative cargo of whale blubber. These tortoises then swam to nearby islands and mated with natives there. Floreana’s flat topography made it a popular spot for whalers to stop and snatch tortoises for meals, leading to the extinction of C. elephantopus.

Comparing genetic data from museum remains with databanks of DNA sequences from living tortoises made it possible to identify relatives of the extinct animals, Caccone said. However, it will take at least four generations of selective breeding – about 100 years – to bring a genetically identical member of C. elephantopus “back to life.”

“We won’t be around to see it, but it can be done,” she said.

Other Yale authors of the paper are Edgar Benavides and Jeffrey R. Powell. Researchers from the University of British Columbia Okanagan, the University of Crete, the State University of New York, Syracuse, and the Galapagos National Park Service contributed to the research.

The study was funded by the National Geographic Society, the Paul and Bay Foundation, the Eppley Foundation, the Turtle Conservation Fund, and the Galapagos Conservancy.

Tree planting in Kenya’s Mau Complex signals new beginnings for a critical ecosystem

January 19, 2010

Kenya took a step toward restoring its diminishing water towers and addressing rapid environmental degradation when it launched a tree planting drive in the Kiptunga area of the Mau Forest Complex on Friday.

Some 20,000 tree seedlings were planted on 20 hectares at a ceremony attended by Kenya’s Prime Minister Raila Odinga and United Nations Environment Programme (UNEP) Deputy Executive Director Angela Cropper.

The Mau, the largest indigenous forest in East Africa and Kenya’s most vital water tower, covers some 270,000 hectares. After the Mau, restoration will also take place in Mt. Kenya, the Aberdares, Mt. Elgon and the rest of Kenya’s forests and water catchment areas, with the aim of increasing forest cover from the current 1.7 percent to 10 percent by the year 2020.

In partnership with the government and other stakeholders, including Kenyan NGOs, UNEP has helped chronicle and raise awareness of the damage and degradation of East Africa’s largest closed-canopy forest.

Over the last two decades, the Mau Complex has lost around 107,000 hectares – approximately 25% – of its forest cover. This has had devastating effects on the country as a whole, including severe droughts and floods that have cost human lives and livelihoods, crops and thousands of head of livestock.

UNEP’s contribution to the national debate surrounding the Mau has been based on science and economics, and in 2009 it appointed an expert to provide technical advice to a government-led Mau task force.

One of the findings of this task force was that continued destruction of the forests will inevitably lead to a water crisis of national and regional proportions extending far beyond Kenya’s borders.

The impetus to restore the Mau is particularly strong this year, as the world marks the International Year of Biodiversity.

So far, the international community has failed to reverse the rate of loss of biodiversity. Economies everywhere continue to dismantle the productive life-support systems of planet Earth.

The latest estimates by The Economics of Ecosystems and Biodiversity (TEEB) study, which UNEP hosts, suggest that up to US$5 trillion worth of natural or nature-based capital is being lost annually.

However, through Friday’s tree-planting initiative, the Mau is emerging as a potentially inspiring example of how the tide can still be turned in favour of biodiversity and sustainable ecosystem management.

After planting a Kaligen Berekeiyet tree at the ceremony, UNEP Deputy Executive Director Angela Cropper said: “These first saplings, planted in the soils of Kenya, speak of new shoots and new beginnings. New beginnings for a critical ecosystem: new beginnings for the people of Kenya who depend inextricably on the services that the Mau forest complex generates.”

UNEP Spokesman Nick Nuttall planted a Podocarpus tree at the event, declaring: “This is the first tree I have planted, ever. It shows that even at 51, it is never too late.”

New theory on the origin of primates

January 19, 2010

A new model for primate origins is presented in Zoologica Scripta, published by the Norwegian Academy of Science and Letters and The Royal Swedish Academy of Sciences. The paper argues that the distributions of the major primate groups are correlated with Mesozoic tectonic features, and that their respective ranges are congruent with each group evolving locally from a widespread ancestor on the supercontinent of Pangea about 185 million years ago.

Michael Heads, a Research Associate of the Buffalo Museum of Science, arrived at these conclusions by incorporating, for the first time, spatial patterns of primate diversity and distribution as historical evidence for primate evolution. Previous models had been limited to interpretations of the fossil record and molecular clocks.

“According to prevailing theories, primates are supposed to have originated in a geographically small area (a center of origin), from where they dispersed to other regions and continents,” said Heads, who also noted that widespread misrepresentation of fossil and molecular clock estimates as maximum or actual dates of origin has led to a popular theory that primates somehow crossed the globe and even rafted across oceans to reach America and Madagascar.

In this new approach to molecular phylogenetics, vicariance, and plate tectonics, Heads shows that the distribution ranges of primates and their nearest relatives, the tree shrews and the flying lemurs, conform to a pattern that would be expected had they evolved from a widespread ancestor. This ancestor could have evolved into the extinct Plesiadapiformes in North America and Eurasia, the primates in Central and South America, Africa, India and Southeast Asia, and the tree shrews and flying lemurs in Southeast Asia.

Divergence between strepsirrhines (lemurs and lorises) and haplorhines (tarsiers and anthropoids) is correlated with intense volcanic activity on the Lebombo Monocline in Africa about 180 million years ago. The lemurs of Madagascar diverged from their African relatives with the opening of the Mozambique Channel (160 million years ago), while New and Old World monkeys diverged with the opening of the Atlantic about 120 million years ago.

“This model avoids the confusion created by center-of-origin theories and the assumption of a recent origin for major primate groups due to a misrepresentation of the fossil record and molecular clock divergence estimates,” said Heads from his New Zealand office. “These models have resulted in all sorts of contradictory centers of origin and imaginary migrations for primates that are biogeographically unnecessary and incompatible with ecological evidence.”

The tectonic model also addresses the otherwise insoluble problem posed by dispersal theories, which allow primates to cross the Atlantic to America and the Mozambique Channel to Madagascar even though they have never managed to cross the 25 km from Sulawesi to the Moluccan islands and from there travel to New Guinea and Australia.

Heads acknowledged that the phylogenetic relationships of some groups, such as tarsiers, are controversial, but the various alternatives do not obscure the patterns of diversity and distribution identified in this study.

The biogeographic evidence for a Jurassic origin of primates, and a pre-Cretaceous origin of the major primate groups, pushes their divergence well before the fossil record. But Heads notes that fossils provide only minimum dates for the existence of particular groups, and there are many examples of the fossil record being extended by tens of millions of years through new fossil discoveries.

The article notes that increasing numbers of primatologists and paleontologists recognize that the fossil record cannot be used to impose strict limits on primate origins, and that some molecular clock estimates also predict divergence dates pre-dating the earliest fossils. These considerations indicate that there is no necessary objection to the biogeographic evidence for divergence of primates beginning in the Jurassic, with the origin of all major groups correlated with plate tectonics.

Wilder weather exerts a stronger influence on biodiversity than steadily changing conditions

January 19, 2010

An increase in the variability of local conditions could do more to harm biodiversity than slower shifts in climate, a new study has found.

Climate scientists predict more frequent storms, droughts, floods and heat waves as the Earth warms. Although extreme weather would seem to challenge ecosystems, the effect of fluctuating conditions on biodiversity could actually go either way. Species able to tolerate only a narrow range of temperatures, for example, may be eliminated, but instability in the environment can also prevent dominant species from squeezing out competitors.

“Imagine species that have different optimal temperatures for growth. In a fluctuating world, neither can get the upper hand and the two coexist,” said Jonathan Shurin, an ecologist at the University of California, San Diego who led the project. Ecologists have observed similar positive effects on populations of organisms as different as herbaceous plants, desert rodents, and microscopic animals called zooplankton.

Now a study of zooplankton found in dozens of freshwater lakes over several decades has revealed both effects. Shurin and colleagues found fewer species in lakes with the most variable water chemistry. But lakes with the greatest temperature variations harbored a greater variety of zooplankton, they report in the journal Ecology Letters on January 21.

Their study considered data from nine separate long-term ecological studies covering a total of 53 lakes in North America and Europe. In addition to sampling zooplankton, scientists had also taken physical measurements repeatedly each season for periods ranging from 3 to 44 years.

From these data, the researchers calculated the variability of 10 physical properties, including pH and the levels of nutrients such as organic carbon, phosphorus and nitrogen. Temperatures and the amount of oxygen dissolved in the water at both the surface and bottom of each lake were also included. The authors also teased apart variation by the pace of change, with year-to-year changes considered separately from changes occurring season-to-season or on more rapid timescales.
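As a rough illustration of separating year-to-year from within-year variability, the sketch below splits a made-up four-year, four-seasons-per-year temperature record into the spread of annual means versus the spread of seasonal deviations. This is the general idea, not the study's actual method:

```python
from statistics import mean, pstdev

# Hypothetical surface temperatures (deg C), four seasonal readings per year.
temps = {
    2001: [4.1, 12.3, 19.8, 8.2],
    2002: [3.5, 13.1, 21.0, 7.9],
    2003: [4.8, 11.7, 18.9, 8.6],
    2004: [3.9, 12.8, 20.4, 8.1],
}

# Year-to-year variability: spread of the annual means.
annual_means = [mean(v) for v in temps.values()]
interannual_sd = pstdev(annual_means)

# Season-to-season variability: spread of deviations from each year's mean.
within_year_devs = [x - mean(v) for v in temps.values() for x in v]
seasonal_sd = pstdev(within_year_devs)

print(f"interannual SD: {interannual_sd:.2f} C, seasonal SD: {seasonal_sd:.2f} C")
```

In this toy record, the seasonal spread dwarfs the year-to-year spread, which is why analysing the two timescales separately matters.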

Zooplankton populations respond quickly to changes because they reproduce so fast. “In a summer, you’re sampling dozens of generations,” Shurin said. “For mammals or annual plants, you would have to watch for hundreds or thousands of years to see the same population turnover.”

At every time scale the pattern held: the researchers found fewer species of zooplankton in lakes with fluctuating water chemistry and more species in those with varying temperatures. The authors noted that the temperature variations they observed remained within normal ranges for these lakes. But some chemical measures, particularly pH and levels of phosphorus, strayed beyond normal limits due to pollution and acid rain.

Environmental variability through time could either promote or reduce biodiversity, depending on the pace and range of fluctuations, the authors suggested.

“It may depend on the predictability of the environment. If you have a lot of violent changes through time, species may not be able to program their life cycles to be active when conditions are right. They need the ability to read the cues, to hatch out at the right time,” Shurin said. “If the environment is very unpredictable, that may be bad for diversity, because many species just won’t be able to match their life cycles to that.”

Urban ‘green’ spaces may contribute to global warming

January 19, 2010

University of California, Irvine study finds turfgrass management creates more greenhouse gas than plants remove from the atmosphere

Dispelling the notion that urban “green” spaces help counteract greenhouse gas emissions, new research has found – in Southern California at least – that total emissions would be lower if lawns did not exist.

Turfgrass lawns help remove carbon dioxide from the atmosphere through photosynthesis and store it as organic carbon in soil, making them important “carbon sinks.” However, greenhouse gas emissions from fertiliser production, mowing, leaf blowing and other lawn management practices are four times greater than the amount of carbon stored by ornamental grass in parks, a UC Irvine study shows. These emissions include nitrous oxide released from soil after fertilisation. Nitrous oxide is a greenhouse gas roughly 300 times more powerful than carbon dioxide, the Earth’s most problematic climate warmer.

“Lawns look great – they’re nice and green and healthy, and they’re photosynthesising a lot of organic carbon. But the carbon-storing benefits of lawns are counteracted by fuel consumption,” said Amy Townsend-Small, Earth system science postdoctoral researcher and lead author of the study, forthcoming in the journal Geophysical Research Letters.

The results are important to greenhouse gas legislation now being negotiated. “We need this kind of carbon accounting to help reduce global warming,” Townsend-Small said. “The current trend is to count the carbon sinks and forget about the greenhouse gas emissions, but it clearly isn’t enough.”

Turfgrass is increasingly widespread in urban areas and covers 1.9 percent of land in the continental U.S., making it the most common irrigated crop.

In the study, Townsend-Small and colleague Claudia Czimczik analysed grass in four parks near Irvine, Calif. Each park contained two types of turf: ornamental lawns (picnic areas) that are largely undisturbed, and athletic fields (soccer and baseball) that are frequently trampled, replanted and aerated.

The researchers evaluated soil samples over time to ascertain carbon storage, or sequestration, and they determined nitrous oxide emissions by sampling air above the turf. They then calculated carbon dioxide emissions resulting from fuel consumption, irrigation and fertiliser production, using information about lawn upkeep from park officials and contractors.

The study showed that nitrous oxide emissions from lawns were comparable to those found on agricultural farms, which are among the largest emitters of nitrous oxide globally.

In ornamental lawns, nitrous oxide emissions from fertilisation offset just 10 to 30 percent of carbon sequestration. But fossil fuel consumption for management, the researchers calculated, released about four times more carbon dioxide than the plots could take up. Athletic fields fared even worse because – owing to soil disruption from tilling and resodding – they trapped far less carbon than ornamental grass while requiring the same emissions-producing care.

“It’s impossible for these lawns to be net greenhouse gas sinks because too much fuel is used to maintain them,” Townsend-Small concluded.
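The arithmetic behind that conclusion follows directly from the ratios reported above. In this sketch the absolute sequestration figure is arbitrary; only the ratios matter:

```python
# Toy CO2-equivalent budget for an ornamental lawn (arbitrary units).
sequestration = 100.0                  # carbon captured in soil

n2o_emissions = 0.2 * sequestration    # fertiliser N2O offsets 10-30% (20% here)
fuel_emissions = 4.0 * sequestration   # mowing, irrigation, fertiliser production

net = sequestration - n2o_emissions - fuel_emissions
print(f"net balance: {net:+.0f} CO2e")  # negative: the lawn is a net source
```

With fuel emissions alone at four times the carbon taken up, no plausible sequestration value makes the balance positive.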

    Previous studies have documented lawns storing carbon, but this research was the first to compare carbon sequestration to nitrous oxide and carbon dioxide emissions from lawn grooming practices.

    The UCI study was supported by the Kearney Foundation of Soil Science and the U.S. Department of Agriculture.

    Measuring carbon dioxide over the ocean

    January 19, 2010

    Reliable measurements of the air-sea flux of carbon dioxide – an important greenhouse gas – are needed for a better understanding of the impact of ocean-atmosphere interactions on climate. A new method developed by researchers at the National Oceanography Centre, Southampton (NOCS) working in collaboration with colleagues at the Bjerknes Center for Climate Research (Bergen, Norway) promises to make this task considerably easier.

    Infrared gas sensors measure carbon dioxide based on its characteristic absorption spectra and are used to evaluate the air-sea flux of the gas. So-called closed-path sensors precondition air before measurements are made, while open-path sensors can be used to measure the air in situ.

    One advantage of using open-path sensors at sea is that wind measurements can be taken contemporaneously in the same place. Moreover, because they are small and don’t use much power, they can be used on buoys.

    “Open-path sensors have the potential greatly to increase our understanding of the variability of air-sea carbon dioxide fluxes,” said PhD student John Prytherch of the University of Southampton’s School of Ocean and Earth Science at NOCS.

    However, a long-standing concern has been that the values from open-path sensors do not tally with those from closed-path sensors, or with measurements made using other techniques.

    “Other scientists have been sceptical about the reliability of carbon dioxide flux measurements taken at sea using open-path sensors,” says Prytherch: “However, we now believe that we understand the reason for the discrepancy and that we can correct for it.”

    The problem turns out to be that the sensors are sensitive to humidity, meaning that fluctuations in the amount of water vapour in the sample air skew the carbon dioxide measurements. This is probably caused by salt particles on the sensor lens that absorb water.

    Having identified the problem, Prytherch and his colleagues developed and rigorously tested a novel method for correcting the data for the cross-sensitivity to humidity.
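    The article does not detail the correction algorithm, but the general idea of removing a humidity cross-sensitivity can be sketched as a linear decorrelation on synthetic data. Everything here, from the numbers to the regression approach, is an illustrative assumption, not the NOCS researchers’ actual method:

```python
import numpy as np

# Synthetic illustration: open-path CO2 readings contaminated by a spurious
# linear dependence on water vapour (as salt on the lens would cause),
# then corrected by regressing out the humidity signal.
rng = np.random.default_rng(0)
n = 1000
humidity = rng.normal(10.0, 2.0, n)       # water vapour fluctuations (g/m^3, hypothetical)
true_co2 = rng.normal(400.0, 1.0, n)      # true CO2 (ppm), independent of humidity
measured_co2 = true_co2 + 0.5 * humidity  # contaminated measurement (0.5 is an assumed sensitivity)

# Fit the spurious linear dependence and subtract its fluctuating part
slope, intercept = np.polyfit(humidity, measured_co2, 1)
corrected_co2 = measured_co2 - slope * (humidity - humidity.mean())
```

By construction the least-squares residuals are uncorrelated with humidity, so the corrected series no longer tracks water vapour fluctuations, which is the property the real correction must also deliver.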

    Data were collected aboard the Norwegian weather ship Polarfront, equipped with a battery of instruments to measure wind speed, humidity and carbon dioxide. Even the motion of the ship was monitored.

    The researchers noted that the carbon dioxide fluxes calculated from open-path sensor data were clearly too high and affected by humidity. They were also very variable, suggesting that the effect is caused by salt on the optics, which accumulates before being washed off by rain. Indeed, the researchers were able to mimic this effect in the laboratory.

    However, after correction using their newly developed method, the calculated carbon dioxide fluxes were in line with previous studies that used different sensors or techniques.

    “This robust method opens the way for widespread use of open-path sensors for air-sea carbon dioxide flux estimation,” said Dr Margaret Yelland of NOCS: “This will greatly increase the information available on the transfer of carbon dioxide between the air and sea – information crucial for understanding how the ocean-atmosphere interaction impacts climate.”

    Low concentrations of oxygen and nutrients slowing biodegradation of Exxon Valdez oil

    January 19, 2010

    Oil from the Exxon Valdez spill continues to be found in the beaches along Alaska’s Prince William Sound. Temple University researchers have found that the low concentrations of oxygen and nutrients in the beaches, along with the water flow in the beach’s lower layer, have hindered the aerobic biodegradation of the remaining oil. (Credit: Michel Boufadel/Temple University)

    The combination of low concentrations of oxygen and nutrients in the lower layers of the beaches of Alaska’s Prince William Sound is slowing the aerobic biodegradation of oil remaining from the 1989 Exxon Valdez spill, according to researchers at Temple University.

    In what is considered one of the worst environmental disasters in history, the Exxon Valdez spilled more than 11 million gallons of crude oil into Alaska’s Prince William Sound, contaminating some 1,300 miles of shoreline, killing thousands of animals and severely damaging Alaska’s fishing industry and economy.

    In the first five years after the accident, the oil was disappearing at a rate of about 70 percent a year, and calculations suggested it would be gone within a few years. However, about seven or eight years ago it was discovered that the disappearance rate had in fact slipped to around four percent a year, and it is estimated that nearly 20,000 gallons of oil remains in the beaches.
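    These rates can be sanity-checked with simple compounding arithmetic. The model of a constant fractional loss per year is a simplification of mine; the input figures (11 million gallons spilled, roughly 70 percent and four percent a year, about 20,000 gallons remaining) come from the article:

```python
import math

# Back-of-the-envelope check on the oil disappearance rates,
# assuming a constant fractional loss compounds each year.
spilled_gallons = 11_000_000

def remaining(initial, annual_loss_rate, years):
    """Oil left after `years` of a constant fractional loss per year."""
    return initial * (1 - annual_loss_rate) ** years

# ~70% loss a year over the first five years leaves roughly 27,000 gallons,
# the same order of magnitude as the ~20,000 gallons estimated to remain:
after_five_years = remaining(spilled_gallons, 0.70, 5)

# At ~4% a year, merely halving what remains takes about 17 years:
halving_time_years = math.log(0.5) / math.log(1 - 0.04)
```

The sharp drop from a 70 percent to a four percent annual rate is why the remaining oil, though small in volume, is expected to persist for decades.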

    The researchers, led by Michel Boufadel, director of the Center for Natural Resources Development and Protection in Temple’s College of Engineering, have spent the past three years studying why this oil persists.

    Their study, “Long-term persistence of oil from the Exxon Valdez spill in two-layer beaches,” was posted Jan. 17 in advance of publication on the Nature Geoscience website (http://www.nature.com/ngeo/index.html).

    Boufadel said the beaches they studied consisted of two layers: an upper layer that is highly permeable and a lower layer that has very low permeability. He said that, on average, water moved through the upper layer up to 1,000 times faster than through the lower layer, and while both layers are made up of essentially the same materials, the lower layer has become more compacted through the movement of the tides over time.

    These conditions, said Boufadel, have created a sort of sheltering effect on the oil, which often lies just 1-4 inches below the interface of the two layers.

    Boufadel said that oxygen and nutrients are needed for the survival of the micro-organisms that eat the oil and drive its aerobic biodegradation. Without adequate concentrations of nutrients and oxygen, and given the slow movement of water, only anaerobic biodegradation, which is usually very slow, is probably occurring.

    Boufadel, who is also chair of the Department of Civil and Environmental Engineering at Temple, said that an earlier study, published in 1994, had already established a low concentration of nutrients was affecting the remaining Exxon Valdez oil.

    He said that, given Alaska’s pristine environment, a low concentration of nutrients is to be expected, and this recent study confirmed the earlier findings. Boufadel and his team found that, on average, the nutrient concentration in the beaches was 10 times lower than what is required for optimal aerobic biodegradation of oil. They also found that oxygen levels in the beaches are insufficient to sustain aerobic biodegradation.

    Using groundwater hydraulic studies, the researchers found that the net movement of water through the lower layer of the beach was outwards, preventing oxygen from diffusing down through the upper layer to where the oil is located.

    “You have a high amount of oxygen in the seawater, so you would like to think that the oxygen would diffuse in the beach and get down 2-4 inches into the lower layer and get to the oil,” said Boufadel. “But the outward movement of the water in the lower level is blocking the oxygen from spreading down into that lower layer.”

    Boufadel and his team are now exploring ways to deliver the much-needed oxygen and nutrients to the impacted areas in an effort to spur aerobic biodegradation of the remaining oil.