- Crowdsourcing better data on small-scale fisheries
By Kendra Karr
Many of the world’s fish are caught in small-scale fisheries that lack data about the health of fish populations, giving managers very limited information on which to base management decisions. In turn, most of these fisheries appear to be under-performing with respect to conservation, the amount of food they can produce, the amount of money they can generate, and the quality of the livelihoods they can support. There is a perception that these fisheries cannot be assessed without large amounts of data. Because of this perception, many fisheries remain unassessed, ineffectively managed or not managed at all, leading to underperformance or even collapse.
Fortunately, there are alternatives: fishermen and women, community members, managers and scientists are collaborating to bridge the data gap for these important fishing communities, increasing knowledge and resources for effective fishery assessment and management. While these collaborations have started to fill in the gaps, we still need input from fishery managers and practitioners for a complete picture of the data.
In collaboration with small-scale fisheries around the world, we are beginning to collect information on the pathways and tools employed in science-based fishery co-management in small-scale, data-limited contexts.
Context and goals:
Finding ways to evaluate small-scale fisheries means gaining a deeper insight into the pathways and tools used to transition fisheries to more science-based solutions. These solutions allow fisheries to meet environmental, social and economic goals. Successful fisheries around the world have shown that establishing secure fishing rights with science-based catch limits not only empowers fishermen to become stewards of the resource, but can also support a pathway to long-term sustainability. Both the pathways and tools employed to reform fisheries vary, but there are a growing number of examples that use a form of co-management along with science-based fishery management.
Case studies help identify the many ways stakeholders address the challenges their fisheries are facing and help develop science-based solutions for sustainable fishing.
Upcoming panel at IMCC:
At this year’s International Marine Conservation Congress (IMCC) in St. John's, Newfoundland and Labrador – in collaboration with five fisheries – we will hear the stories from those involved in transitioning a small-scale, data-limited fishery into a science-based managed fishery. Attendees are encouraged to participate in the symposium – Integrated science and management solutions for data-limited and low governance fisheries – and contribute to the associated panel discussion.
Small-scale fisheries are reforming during a fortunate period, as there are tools designed to empower on-the-ground partners to address the challenges these fisheries are facing. These tools can be used to develop sustainable solutions that support more fish in the water, more food on the plate and more prosperous communities.
Let’s hear your story, so together we can bridge the gap in knowledge and understanding of these critical resources.
How you can participate:
- Contribute to our survey: Fishery Assessment and Management Pathway.
- Attend our symposium and panel discussion on data-limited assessment and co-management of fisheries at this year’s International Marine Conservation Congress (IMCC) on August 1st from 8:30-11:30 in Salon F.
- Recommendations for Operating a Sediment Diversion that Balances Ecosystem and Community Needs
By Natalie Peyronnin, Director of Science Policy, Mississippi River Delta Restoration, Environmental Defense Fund
This is part one of the series “Building Land in Coastal Louisiana: Expert Recommendations for Operating a Successful Sediment Diversion that Balances Ecosystem and Community Needs.” This series will explore key recommendations for operating sediment diversions as outlined by the independent Sediment Diversion Operations Expert Working Group.
The use of sediment diversions, a restoration tool that mimics the natural processes of the Mississippi River to build and sustain land, has been proposed for decades in coastal Louisiana. As we move closer to construction of a sediment diversion, the issues surrounding how the reintroduction of fresh water and sediment will impact the ecosystem, including important fish and wildlife species, and the communities that live, work and play in the basin, require close monitoring and planning.
The state of Louisiana, through the Coastal Protection and Restoration Authority (CPRA), will be required to develop an Operation and Adaptive Management Plan that lays out strategies that need to be considered when operating a sediment diversion to maximize land building, while balancing the needs of the ecosystem and communities.
To help CPRA with developing these strategies, the Sediment Diversion Operations Expert Working Group was formed. The working group consisted of 12 interdisciplinary scientists with a wide range of on-the-ground expertise in coastal Louisiana. Together, they released a report with the goal of providing specific recommendations to begin a robust discussion on operation strategies to be considered for CPRA’s plan.
The working group members, along with over 40 guest experts, discussed, debated and documented complex issues such as wetland health, basin geology, fish and wildlife species and socio-economics. The resulting recommendations are included in the report, Building Land in Coastal Louisiana: Expert Recommendations for Operating a Successful Sediment Diversion that Balances Ecosystem and Community Needs.
Five of the topics key to the successful operation of a sediment diversion – sediment, hydrodynamics, vegetation, socio-economics, and fish and wildlife – will be explored further in this blog series written by experts in their respective subject. This series is intended to further detail these important topics that will become extremely relevant once a diversion is constructed.
Stay tuned for the next post in the “Building Land in Coastal Louisiana” series about key recommendations concerning hydrodynamics, titled “Exploring the Hydrodynamics of a Sediment Diversion at Mid-Barataria.”
For more information about the Sediment Diversion Operations Expert Working Group’s key recommendations, visit http://www.mississippiriverdelta.org/diversion-ops-report/.
Natalie Peyronnin is the director of science policy for EDF's Mississippi River Delta Restoration program and the convener of the Sediment Diversion Operations Expert Working Group. She works to ensure sound science is being utilized to plan, design, implement and adaptively manage projects and policies, with a focus on system dynamics. Natalie was a Senior Scientist for Louisiana’s Coastal Protection and Restoration Authority, where she served as the project manager for the 2017 Coastal Master Plan, the technical lead and science communicator for the 2012 Coastal Master Plan, diversion coordinator, and liaison with The Water Institute of the Gulf and academic institutions. Natalie also worked as Science Director for the Coalition to Restore Coastal Louisiana. Natalie has a B.S. in Wildlife and Fisheries Management, with minors in Forestry and Zoology & Physiology from Louisiana State University, as well as an M.S. in Oceanography and Coastal Sciences from Louisiana State University.
- Six benefits of California setting a 2030 climate pollution reduction target
It’s summer recess for the California Legislature, which means we have some time to reflect before the race to the end of the legislative session on August 31. A big question is whether the Legislature will pass a climate bill package that would cement ambitious 2030 carbon reduction targets into statute. With the climate spotlight shining brightly on Sacramento – as usual – it’s worth considering why legislative action and leadership are so critical now.
Here are six ways setting a 2030 greenhouse gas target for California could benefit the state’s economy, environment, and the future of California’s global climate leadership, especially the groundbreaking cap-and-trade market we forged three years ago with Quebec.
- Cap-and-trade allowance values will more accurately reflect the long-term cost of hitting emissions reductions targets – This is the classic impact that we consider. A 2030 target means that a carbon allowance sold in 2017, for example, will be a valuable asset not just through 2020 but through 2030 as well. This could translate into higher allowance prices, but if it doesn’t, that could be good news too – for example, the market anticipating a low cost of reducing emissions.
- Regulated polluters will value emission reduction opportunities more highly, potentially leading to lower direct emissions – This relates back to the first point. Imagine that a business needs to buy a new boiler that they expect to last 20 years. They have the option to pay more now for a more efficient boiler or pay less now for a less efficient boiler. They are more likely to invest in the more efficient boiler if they know there will be a price on carbon catalyzing ambitious reductions for at least 15 more years rather than just five more years.
- Allowance banking could increase, creating more incentives for faster emission reductions – This creates an important, but more subtle, environmental benefit. If carbon prices are expected to increase in the future as the cap gets tighter, regulated and non-regulated participants alike will have an incentive to “bank” allowances for future use. (Note that an important California design feature allows a “current vintage allowance”, say 2016, to be used for emissions that occur in 2016 or in any future year.) This banking is equivalent to a reduction that occurs earlier than expected and provides a net benefit to the atmosphere. The concept of banking promotes innovative solutions that cut pollution more quickly, and makes the overall program more flexible for California companies.
- Demand for allowances could increase – If banking increases, the uptick in demand for allowances may mean that all or most allowances offered at the quarterly auctions sell out, even if emissions remain below the cap through 2020. Outperforming the 2020 cap would be a great outcome, meaning that the long-term price signal is likely incentivizing emissions reductions (as in the boiler example above) or that complementary measures (like the Renewable Portfolio Standard) are succeeding.
- Offsets developers will have a stronger incentive to reduce emissions where possible and bring projects to market – Offsets provide an opportunity for all sectors of the economy, like the agricultural sector, to be rewarded for high-quality, innovative emissions reductions. Because robust actions that cut pollution take time and investment, a longer-term commitment by the state is essential. This means that, for example, a rice farmer would be much more likely to transform their growing practices to cut methane emissions if those actions reaped a payoff for 15 instead of just five years.
- Communities, the California work force, and the economy will continue to benefit from Greenhouse Gas Reduction Fund (GGRF) investments – The cap-and-trade program is first and foremost about reducing emissions, not raising revenue. While the program’s purpose isn’t to maximize revenue, auctioning is an integral part of California’s well-functioning system and the undeniable benefits of the investments through the GGRF, especially for disadvantaged communities, are an important aspect of program success now and beyond 2020. California’s investments so far have furthered and enhanced the purposes of AB 32, which emphasize reducing carbon pollution in a way that maximizes benefits to the economy and environment, promotes social equity, and transforms the state into a low carbon economy. By sticking to these principles, California is creating benefits that far outweigh the initial investments. And these benefits need to continue!
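The boiler trade-off in the second point above can be reduced to simple arithmetic. The following is a minimal sketch with entirely hypothetical numbers (boiler costs, emission rates, and a $40-per-ton allowance price are illustrations, not EDF or market data), showing how lengthening the priced horizon from five to 15 years can flip the investment decision:

```python
# Hypothetical sketch of the boiler decision: upfront cost vs. carbon cost.
# All figures are illustrative assumptions, not real market data.

def total_cost(upfront, annual_tons, carbon_price, priced_years, lifetime=20):
    """Upfront cost plus allowance costs for the years a carbon price applies."""
    return upfront + annual_tons * carbon_price * min(priced_years, lifetime)

efficient = dict(upfront=120_000, annual_tons=80)   # pricier, cleaner boiler
standard = dict(upfront=100_000, annual_tons=120)   # cheaper, dirtier boiler
price = 40  # hypothetical $/ton allowance price

# Carbon price expected to last only 5 more years (a 2020 horizon):
five_eff = total_cost(**efficient, carbon_price=price, priced_years=5)
five_std = total_cost(**standard, carbon_price=price, priced_years=5)
print(five_eff, five_std)  # 136000 124000 -> the cheaper boiler looks better

# A 2030 target extends the price signal to 15 years:
fifteen_eff = total_cost(**efficient, carbon_price=price, priced_years=15)
fifteen_std = total_cost(**standard, carbon_price=price, priced_years=15)
print(fifteen_eff, fifteen_std)  # 168000 172000 -> the efficient boiler wins
```

Under these assumed numbers, the break-even horizon is the upfront premium divided by the annual carbon savings ($20,000 ÷ $1,600 per year, or 12.5 years), so a price signal credibly extending past that point tips the choice toward the cleaner boiler.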
A process, not an event
Setting a 2030 target has been a gradual process in California. And the market may already be operating with some expectation that the cap-and-trade program will continue beyond 2020.
The market has received increasingly specific indications since 2014 that ambitious reductions will continue beyond 2020. For example, before AB 32 was ever passed, Governor Schwarzenegger signed an executive order calling for 1990 levels of emissions by 2020 and an 80% reduction below 1990 levels by 2050. In 2014, the Office of Planning and Research and the Air Resources Board (ARB) started calling for an ambitious mid-term 2030 target, foreshadowing Governor Brown’s 2015 executive order setting that target at 40% below 1990 levels by 2030. Since then, ARB has reopened the Scoping Plan process to meet that 2030 target and begun the regulatory process to amend the cap-and-trade program to extend it to 2030.
With a decade of world-leading, successful climate action behind it, California is on the verge of another momentous step forward. The market will gain even more certainty, and California communities and the economy will score a huge win if the Legislature does the right thing and passes a climate package this year that places a 2030 target in statute.
- Restoration Solutions: Sediment Diversions
The State of Louisiana is advancing two sediment diversions south of New Orleans. These projects are on track to begin construction by 2020 using funding from the BP oil spill. Multiple projects working together are needed to build and sustain land, but sediment diversions are a crucial foundation needed to confront Louisiana's ongoing land loss crisis.
Learn more about sediment diversions in the fact sheet below.
- How a Western Regional Electric Grid Can Bring us into the 21st Century
Technology constantly advances, as do related industries. So, why does it seem like time stands still for the electricity industry? Compare telecommunications: Since the arrival of the telephone in the late 1870s, we’ve seen consistent upgrades to phones and transmission technology, including larger and larger coverage areas. But in the West, the way we make, move, trade, and regulate electricity still involves old coverage maps and 38 different balancing authorities. California is exploring a way to change that on a large scale.
State lawmakers and the California Independent System Operator (CAISO), which controls much of the state’s electric grid, are now considering the first steps to create an integrated, western regional energy market. This process, which many are calling “regionalization,” would connect grid operators throughout the West, so they can share resources more easily and efficiently – a common approach in the eastern and midwestern U.S.
The geographic size of a regional grid would help integrate more renewable energy. Due to California’s impressive clean energy leadership in the last decade, we now produce so much solar power that, at times, we cannot use it all when it’s available. If California’s electric grid were connected to neighboring states, California could export its excess clean energy when the sun is shining here, and buy wind from Wyoming when it isn’t. This would increase our ability to rely on renewables and lower our costs.
Final CAISO studies released earlier this month project a West-wide system operator would lower greenhouse gas emissions considerably by 2030 and help affordably integrate the renewable energy brought online by SB 350. What’s more, it can accomplish this while being designed to respect the diverse policies of each individual state.
For these reasons, Environmental Defense Fund (EDF), Natural Resources Defense Council (NRDC), Union of Concerned Scientists (UCS), and Center for Energy Efficiency and Renewable Technologies (CEERT) sent a letter late last week to California legislative and gubernatorial leadership supporting regionalization, and asking them to ensure greenhouse gas reductions and local air quality benefits.
Potential benefits of a western regional energy market
According to the CAISO studies, a western regional grid could accomplish the following:
- Reduce harmful pollution – Greenhouse gases that cause climate change would drop considerably in 2030.
- Save customers money – California households could save $300-500 per year due to lower-cost renewables, lower energy-import costs, and other savings stemming from a larger, more integrated grid.
- Create more jobs – A regional grid could create between 10,000 and 19,000 jobs each year in California between 2020 and 2030, depending on the state’s choices about its Renewable Portfolio Standard (RPS). At least 13 percent of the possible 10,000 jobs and 23 percent of the possible 19,000 jobs are expected within disadvantaged communities. But the choices about the RPS’s “Bucket System” – the mechanism that categorizes types of renewable energy based on location and other factors for eligibility to meet the state’s clean energy goals – affect these numbers dramatically. In fact, the studies show removing the Bucket System would cost 110,000 solar construction jobs in California between 2020 and 2030.
With potential benefits like these, a well-designed regional grid – one that protects California’s and other western states’ interests – is worth pursuing.
Designing a successful regional grid
Specifically, the letter EDF, NRDC, CEERT, and UCS submitted to California’s leadership recommends the following:
- Ensure greenhouse gas emissions benefits – Though the studies show a long-term reduction in greenhouse gases, we understand that a regional grid is designed to be “policy neutral,” meaning it would not impose any particular policies on participating states. To understand the potential impact on California’s greenhouse gas emissions (as well as emissions outside of California), it will be important to consistently verify greenhouse gas emissions across the region. But tracking emissions doesn’t have to burden the regional market: we could use data from existing processes, and states interested in greenhouse gas policies could pay the cost.
- Air quality measures for California communities – Any potential impacts of a western regional grid on harmful air pollution in California’s disproportionately impacted communities must be understood and fully mitigated. This could include directing the California Air Resources Board and the California Public Utilities Commission to work together to authorize clean energy resources that reduce reliance on natural gas in these areas and directly benefit these communities.
- Fair processes for all – The market should include a stakeholder process that provides opportunities for meaningful public participation. Part of that participation should guarantee access to data underlying decisions and the chance to propose alternatives. Importantly, the process should include representatives from a wide variety of interests, including utility and distributed clean energy representatives, customers, and environmental groups.
- State autonomy – California and other states can and must retain control over the ability to develop their own energy policies. In order to achieve this, the regional grid’s governance must acknowledge and respect the rights of all states to oversee how their utilities procure electricity.
A regional energy market is worth it
It is worth creating a regional grid that truly works for California and the West. In doing so, the grid can undoubtedly accommodate even higher levels of clean, renewable resources while reducing pollution and protecting each state’s energy policy autonomy. While regionalization is a complicated process, there is significant experience the West can draw from, as many states outside the region are already successfully participating in multi-state energy markets. By implementing the steps above, California can play an important role in taking the first big step toward a 21st century grid for the West.
- Tell the US Army Corps: Protect Our Streams and Shorelines
Right now, the US Army Corps of Engineers is considering the best way to protect America's wetlands, rivers, and streams – and you have a limited amount of time to add your input.
- Mapping lead service lines: DC Water offers a model for utilities across the nation
By Lindsay McCormick
Washington, DC’s water utility launched a helpful interactive map allowing residents to see, for nearly every building and public water source across DC – including residences, restaurants, retailers, schools, drinking water fountains, and even the White House and Smithsonian – whether water pipes are lead, non-lead, or have no available information.
Lindsay McCormick is a Research Analyst.
When I moved to Washington, DC four years ago the phrase “lead service lines” did not roll off my tongue. That began to change as I became aware of DC’s historical lead problems – and dramatically so in the wake of the crisis in Flint, Michigan.
But I’m not alone. Even though experts estimate that up to 10 million homes across the U.S. have lead service lines – lead pipes connecting the drinking water main in the street to the home – it’s an issue that is not well understood by most Americans.
And that should come as no surprise given that few water utilities across the U.S. can even say with confidence where the lead service lines are in their systems, and fewer still proactively share what information they have with customers. Lead service lines are an aging infrastructure, typically found in communities with older housing. Local recordkeeping over the years has been inconsistent, leaving many utilities today to rely on incomplete, difficult to access, or non-electronic historical records. Many communities appear to have no documentation of when they ceased installing lead service lines altogether.
When records are absent, it can be difficult to determine with certainty if a pipe is lead without digging it up – and disturbing the pipe can release lead into the water.
The situation is complicated further by ownership issues: service lines are typically split between a public (i.e., utility) side – between the main in the street and the property line – and a private side – from the sidewalk water meter to the house. As a result, utilities often have more information on the public portion of the service line.
Most utilities address the problem of lead pipes by treating the water to build a protective coating on the inside of the pipes, preventing the leaching of lead. Referred to as corrosion control, this approach is important to reduce lead exposure, but not sufficient. It leaves large potential sources of lead in the ground, putting children at risk of exposure when the protective coatings fail. Thus, the best long-term fix is to remove the lead service lines.
But how do we fix a problem when we don’t know where it is? The first step is to begin to understand what we do know and make that information available to the public – thereby raising awareness, creating demand for more information, and allowing everyone to act.
DC Water Map
Last month, Washington, DC’s water utility (“DC Water”) made important strides to increase transparency when it launched a new, interactive map. The map populates nearly every building and public water source across DC – including residences, restaurants, retailers, schools, drinking water fountains, museums, and even the White House – with a color-coded circle indicating whether both the public and private sides of the service line are lead, non-lead, or have no available information. The data are based on a combination of physical inspections, consumer reports, and historical data. As such, there is variability in the reliability of each data point, which DC Water is careful to disclose.
This nifty map is pretty fun to explore. Of course, the first thing I checked out was my house. The color-coded circle hovering over my narrow rowhome is a combination of white and green, signifying that while the public portion of my service line is not lead (green), there is no information for the private side (white).
By clicking the circle, a pop-up box provided the following additional detail on the public side: “Type: Copper; Description: Service pipe replaced 20051006.” I decided to investigate a bit further, reaching out to my landlord, who confirmed that the public portion was replaced in 2005 during DC’s own “lead water crisis,” but had no additional information on the private side.
Despite the data gaps, the map allows DC residents to make their own informed decisions. Those who find a white, “no information,” circle can take steps to determine whether their home may have a lead pipe, through physical inspection or free testing provided by the city. Those with evidence of lead may pursue options to replace their lead pipes. Others may utilize temporary solutions, such as installing an NSF-approved filter on their water faucet and always using cold tap water for drinking and cooking.
But it doesn’t stop there. DC residents can also check out their favorite restaurants, children’s daycare, or office building and make informed decisions to avoid lead exposure. I, for one, have started to skip a particular drinking fountain at one of my weekend hangouts after finding out that it is serviced by a lead pipe.
EPA’s call for transparency
On July 6th, EPA directed letters to state officials calling for improved public transparency and implementation of the federal rule on lead in drinking water. To that end, EPA encouraged states to make information available to the public: “The EPA believes that posting of individual sampling results is important for public transparency and intends to work with states that are not yet posting individual sample results – to share lessons learned from states that are already doing so, and to urge all states to adopt this practice.” EPA highlighted the use of online searchable databases of lead service line data as an effective method of increasing transparency. These letters come in response to resistance from many states following similar EPA letters sent to U.S. governors, state commissioners, and tribal leaders earlier this year.
We applaud DC Water for making what data they have available through this comprehensive inventory map. Access to these data is likely to both empower individuals to make informed decisions to reduce their exposure and increase their trust in the public utility. We hope that other cities and municipalities will look to DC as a model, and follow suit.
- Why 10,000 spills from oil and gas development can’t be ignored
Oil and gas development produces massive amounts of air and water pollution that can have severe impacts on our communities and ecosystems. And data in a recent investigative article could help us understand more about where and how much oil, wastewater, and other fluid are spilled across the country.
According to an EnergyWire article by Pamela King and Mike Soraghan, in 2015 industry reported more than 10,000 cases of spills across the country. That amounts to 42 million gallons of harmful fluids – 12 million gallons more than previously reported.
Spills can occur for a variety of reasons, like leaky infrastructure and sloppy operations. And their impacts can be devastating – particularly when it comes to spills of produced water. Chemicals in this wastewater can severely contaminate water resources, while the high salt content can wreak havoc on local ecosystems. For example, wastewater releases in the 1920s that spread across thousands of acres in West Texas created a vegetation dead zone visible from space. Almost 100 years later, state officials are still working to restore the area.
In a 2015 draft assessment, the Environmental Protection Agency acknowledged the threat of these spills, although the details of the full analysis were undercut by the report’s broader finding that hydraulic fracturing activities were not found to have “widespread, systemic” impacts on drinking water. But the thousands of recorded spills tell another story.
For this, and other reasons, EDF is among the growing chorus of voices, including the agency’s own Science Advisory Board, calling on EPA to clarify this conclusion. Oftentimes national statistics don’t reflect the severity of local problems, and by stating that spills were not “widespread” or “systemic” the EPA spurred misleading headlines about the overall “safety” of oil and gas development. It created confusion that ultimately gave industry an excuse to run business as usual, despite the existence of these well-documented impacts.
Case in point: in the EnergyWire article, a spokesperson for an industry lobbying group said the data reinforces the idea that current efforts to manage oil and gas spills are effective. But 10,000 spills clearly indicate something isn’t working.
We can do better.
Spills are not a new problem. They have been a constant issue for this industry for over a century. And given the growth in spill-prevention technologies and practices, we should be capable of reducing spills to much lower levels.
We need policies that standardize leading practices and encourage additional advancements to ensure that the pits, tanks, trucks, and pipelines that hold and transport oil, gas, and wastewater are properly constructed, maintained and operated. And we need better monitoring, reporting and remediation requirements so that when accidents do happen, we can address them more quickly and efficiently.
EDF is pushing for preventative regulations in oil and gas regions. Some states – and industry players – have made headway toward reducing spill rates and minimizing spill impacts. But as this most recent data indicates, there is room for improvement.
Oil and gas spills may not happen all the time and they may not happen everywhere. But even if spills are not a “widespread, systemic” problem, these and other local impacts matter to the affected communities. EPA and the industry must acknowledge that these impacts are real, so we can make greater progress on minimizing risks to our health and our environment.
Photo source: American Geophysical Union
- EPA issues first decisions mandated under the new TSCA
Richard Denison, Ph.D., is a Lead Senior Scientist.
Today, EPA posted on its website risk determinations for four new chemicals it has reviewed under the new standards prescribed by the Lautenberg Act. While the premanufacture notices (PMNs) for these chemicals were received by EPA prior to the June 22 signing of the new TSCA, EPA has reviewed them in the context of the new requirements. (Unlike reviews of chemicals already in use, which may take some years to conduct, EPA reviews of new chemicals are generally to be completed within 90 days, which is why we’re already seeing these appear so soon after enactment.)
These decisions are notable in that they are the very first formal decisions EPA has made under the new law. Based on an admittedly quick review of the decisions, I’ll offer a few observations.
A number of positive aspects of EPA’s review and posting are apparent:
- For the first time, an affirmative safety decision has been made by EPA for these new chemicals, reflecting a central reform of new chemical reviews that is required under the new law. In each case released today, EPA’s determination is that the chemical “is not likely to present an unreasonable risk.” (I expect EPA chose relatively “easy” cases for this debut.)
- These decisions have been made public, as required under the new law; such disclosure was a rarity under the old law.
- Summaries of the bases for the decisions have also been made public, both in the table provided on the website (see above link) and in 3-page “determination documents” for each substance. These documents provide, in one place, clear descriptions of:
- intended, known and reasonably foreseen uses of the chemical;
- a summary of EPA’s findings;
- descriptions of the criteria used and results of its analysis or consideration of:
- environmental fate
- bioaccumulation potential
- human health hazard
- environmental hazard
- potential exposures
- potentially exposed or susceptible subpopulations
This, in my view, constitutes an unprecedented level of transparency for a program that has often felt like a “black box” in the past.
That’s all good news. But there are other concerning features of the reviews that bear mention:
Confidential information. Substantial amounts of information were claimed confidential – and kept so by EPA, including:
- the identities of the companies submitting the PMNs
- the identities of the substances
- the specific uses of the substances (with one exception)
This is not atypical with PMNs, but such confidential business information (CBI) claims are now subject to a range of new requirements under the Lautenberg Act. It is not clear to what extent EPA scrutinized these claims, and because the PMNs were submitted and the claims asserted prior to the signing of the new law, it is also not clear whether these particular claims would have been subject to the new requirements. Going forward, however, we will be seeking assurance that EPA is applying the new CBI requirements to applicable claims in PMNs and also reflecting them in its reviews and decision documents regarding those PMNs.
Estimated, not measured, data. Essentially all of the environmental fate, hazard and exposure information on which EPA based its decisions is predicted, not measured, and is derived using its suite of estimation models, structure-activity relationships or read-across. Use of such information is the traditional way in which EPA has conducted reviews in the past when companies fail to provide adequate safety data, or any at all, in their PMNs (true in a large majority of cases), and it may well not be adequate going forward as the basis for EPA to make the requisite affirmative safety finding.
While they can be useful, such estimation approaches and models have significant limitations, don’t work well for some classes of chemicals (e.g., certain heavily halogenated compounds), and aren’t available or reliable for some important health endpoints (especially chronic human health effects). Just one example: EPA’s persistence models typically only examine a chemical’s persistence in air, water, soil, and sediment, and its bioaccumulation model is based on uptake from water. Yet some chemicals (e.g., the “Teflon chemical” PFOA) that are “passed” by those models can nonetheless persist and build up in people’s blood.
In the absence of public knowledge of the identities of the chemicals, it’s difficult if not impossible to independently judge whether sole reliance on such information was adequate in these four specific reviews. And assuming these cases are relatively “easy ones,” they should not be viewed as representative, and the approach EPA deemed adequate here should not be viewed as precedent-setting.
A number of observers have called on companies to provide more safety information up front in their PMNs, in order to increase the likelihood of smooth reviews under the Lautenberg Act. We would echo that call and will be watching this issue closely going forward. In addition, the Lautenberg Act provides EPA with ready authority both to require testing in order to obtain adequate information on which to base its decisions, and to either block or condition commencement of manufacture pending receipt of that information.
Need for more than just summaries. As much as we welcome the greater transparency and extent of documentation of EPA’s review of these four cases, the determination documents EPA has provided are still only summaries, and they provide only EPA’s interpretation of the information and analysis it has conducted. The PMN review process routinely generates more detailed review documents that record EPA’s model inputs, outputs and assumptions. But in the past these have rarely been made public, and when they are available, they are heavily redacted due to CBI claims — in our view, overly so.
EDF urges EPA routinely to make its more detailed PMN review documents public, and to redact only information that is truly confidential.
Cursory consideration of exposure and exposed subpopulations. EPA’s reviews of exposure potential appear incomplete and insufficient, as does its identification of potentially exposed or susceptible subpopulations. (The law now includes an explicit requirement that EPA account for and mitigate against unreasonable risks to such subpopulations.) In all four cases, EPA noted that exposure potential was not even estimated due to predicted low bioaccumulation and hazard potential. Among other issues, this amplifies the concern we have over EPA’s heavy reliance on limited estimation models for bioaccumulation and hazard, when that in turn leads EPA not to further consider exposure potential.
In addition, in at least one case the basis of EPA’s identification of workers as the only potentially exposed subpopulation seems wholly inadequate. That chemical’s intended use is listed (generically) as “plastic additive.” This raises many questions, for example: which plastic(s), for what uses? Is there potential for the additive to migrate out of the plastic, which could lead to all kinds of potential exposures to a variety of subpopulations? What about releases in post-use waste management of such plastics?
Going forward, EDF urges EPA to do a more thorough review of the exposure side of the risk equation.
So, these first decisions by EPA under the Lautenberg Act are welcome and appear to represent real progress in a number of respects. We recognize the pressure EPA was under not to unduly extend its reviews of PMNs that had been filed prior to the law’s passage. Going forward, however, and in dealing with less “easy” cases, EPA needs to do more to ensure and communicate that its review is based on information sufficient to make the affirmative finding the new law requires: whether each new chemical is or is not likely to present an unreasonable risk.