Comments from Aquatic Peer Input Panel
Michael Rexrode, Senior Aquatic Biologist
U.S. Environmental Protection Agency
Office of Pesticide Programs
Environmental Fate and Effects Division
June 14, 1999
On this Page
- Charge to ECOFRAM Workshop Panel Members
- Is the draft report scientifically sound?
- Did the ECOFRAM workgroup address the "Charge to the Terrestrial and Aquatic Workgroups" identified in the background document, "Evaluating Ecological Risk: Developing FIFRA Probabilistic Tools and Processes"?
- What are the limitations for predicting risk using the approach described in the draft report?
- Taking into account your answers to the three questions above, what areas of the report need to be strengthened?
- At what point in the risk assessment process is the uncertainty level high enough to support the consideration of risk mitigation? What is the minimum level of technical information and scientific understanding that is necessary to evaluate whether risk mitigation would be necessary and/or effective?
- Other Comments
I appreciate the opportunity to provide comments on the ECOFRAM draft document and recognize the considerable effort and time that the committee expended in addressing the issues of aquatic risk assessment relative to pesticide exposure and effects. The Agency's current approach to aquatic effects analysis is based on a deterministic model that relies on the results of standard toxicological endpoints and environmental exposure values (computer-generated as well as actual residues). The ideas presented in the ECOFRAM document reflect approaches to reducing some of the uncertainties associated with this method and appear to provide a constructive basic framework for approaching a probabilistic risk assessment. My comments are as follows:
A document of this length needs a comprehensive executive summary that highlights key issues and recommendations surrounding the risk assessment and possible testing requirements. Advantages and weaknesses of the current risk assessment, as well as the proposed assessment process, should be presented clearly. The document appears to be a collection of thoughts and interests and needs more comprehensive integration and editing if it is to be useful to OPP. The document may be improved in several ways: some of the redundancy could be reduced (parts of Chapter 2 are repeated in Chapters 3 and 4); risk characterization should be a separate topic from the exposure and effects portions of the document; and a separate chapter on tools could cover time-to-event analysis, population models, analysis of species sensitivity distributions, analysis of time-varying exposure, microcosm/mesocosm studies, and field monitoring. An analysis of the proposed tools should also address the level of uncertainty inherent in each approach, its advantages and disadvantages, and its implementation cost (e.g., some of the population models were developed for terrestrial use; can they be converted to evaluating aquatic populations, and what degree of validation is necessary?).
The probabilistic theory of risk assessment has merit but may also result in misapplication if appropriate information is not available. I agree that a probabilistic approach can provide a better understanding of the degree of effects. However, in order to use this as a refinement to the current deterministic model, a more complete database is essential. The assumption that a probabilistic analysis for a freshwater ecosystem can be completed on data relevant to only 2-3 species (i.e., two fish and one daphnid) only increases the uncertainty. ECOFRAM's use of permethrin, a compound with a relatively large database, as its example for probabilistic assessment does not represent the real-life situation with which OPP is usually presented. The majority of registered pesticides do not have an extensive effects or exposure database, and newly registered chemicals usually have even less. The open literature may be helpful, but often the information is incomplete with regard to methodology and parameter selection (i.e., another source of uncertainty). If large databases are available (e.g., copper compounds), there is the resource-intensive task of evaluating each study for its adherence to a set of criteria in order to maintain scientific consistency in the review process. How does ECOFRAM propose to address data sets that have bimodal distributions and outliers? ECOFRAM should present a more comprehensive evaluation of the strengths and limitations of its probabilistic approach with several examples, because the current proposal outlined for Tiers I-II will undoubtedly result in a new set of uncertainties. In order to refine the current risk assessment process through a probabilistic approach, a more extensive database is essential at Tiers I-II.
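To make the small-sample concern concrete, the following sketch fits a log-normal species sensitivity distribution (SSD) to a three-species data set of the kind available at Tier I-II and estimates a 5th-percentile hazard concentration (HC5). The LC50 values and the normal-approximation error formula are hypothetical illustrations, not from the ECOFRAM report or any registration file.

```python
import math
from statistics import NormalDist, mean, stdev

# Hypothetical acute LC50 values (ug/L) for two fish and one daphnid --
# the minimal data set typically available at Tier I-II.
lc50 = [12.0, 45.0, 3.5]

# Fit a log-normal SSD: log-transform, then estimate mean and sd.
logs = [math.log10(x) for x in lc50]
mu, sigma = mean(logs), stdev(logs)

# HC5: the concentration expected to exceed the tolerance of 5% of species.
hc5 = 10 ** NormalDist(mu, sigma).inv_cdf(0.05)
print(f"HC5 estimate: {hc5:.2f} ug/L")

# A crude normal-approximation standard error on the 5th-percentile
# estimate shows how uncertain it is with n = 3.
z = NormalDist().inv_cdf(0.05)
se = sigma * math.sqrt(1 / len(lc50) + z**2 / (2 * (len(lc50) - 1)))
lo = 10 ** (mu + z * sigma - 1.96 * se)
hi = 10 ** (mu + z * sigma + 1.96 * se)
print(f"Rough 95% CI on HC5: {lo:.2f} to {hi:.1f} ug/L")
```

With three species the interval spans more than two orders of magnitude, which is the point: the probabilistic machinery runs, but the answer carries the full uncertainty of the tiny data set.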
A more comprehensive evaluation is needed regarding the amount of resources that may be needed by industry and EPA to implement Tier III studies. This increased complexity in study design will result in an exponential increase in rebuttals over the interpretation of results. From a pragmatic view, how can EPA change the registration process so that this does not result in years of debate? Currently the registrant can rebut and debate, or agree to do higher tier studies, while its product remains on the market. There is no incentive for the registrant to bring the issue to closure.
The discussion of the various population models appeared promising, but the presentation of this information was not as organized as I expected. The following structure is suggested:
- Description of model;
- Model validation;
- Potential uses of model;
- Limitations of model;
- Recommendations (feasibility and cost);
ECOFRAM suggests that OPP can "refine the analysis of potential effects through more complete use of results of Tier I". However, ECOFRAM states that the Tier II assessment will rely on the same data used in the deterministic risk assessment and that no new data will be required for this refinement. The assumption is that adequate data will be available and that the data will be in a form that can be used in a probabilistic assessment. I suggest that the core data requirements be changed to reflect this new assessment approach. In general, time-to-effect can be included in standard acute and chronic testing. A representative number of different species should be evaluated (e.g., aquatic invertebrates: shredders, deposit feeders, predators, etc.; fish: benthic, forage, etc.). OPP cannot extrapolate effects from one species (e.g., Daphnia) to several different species and expect to decrease the current level of uncertainty.
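As an illustration of how time-to-effect data could be used if mortality were recorded at each observation time of a standard 96-h acute test (rather than only at test end), the sketch below computes a Kaplan-Meier survival estimate. The mortality times are hypothetical.

```python
# Hypothetical 96-h acute test: hour of death for each fish;
# survivors are censored at 96 h (died = False).
fish = [(24, True), (48, True), (48, True), (72, True),
        (96, False), (96, False), (96, False), (96, False),
        (96, False), (96, False)]

# Kaplan-Meier product-limit estimate of survival over the test.
surv = 1.0
for t in sorted({t for t, died in fish if died}):
    deaths = sum(1 for tt, d in fish if d and tt == t)
    n_risk = sum(1 for tt, _ in fish if tt >= t)   # still alive just before t
    surv *= 1 - deaths / n_risk
    print(f"S({t} h) = {surv:.2f}")
```

A curve like this captures when organisms die, not just how many, which is exactly the temporal information the standard endpoint discards.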
In Chapter 4 ECOFRAM states that recovery of an aquatic system will occur because of reintroduction of species, especially aquatic invertebrates. In principle, I cannot disagree; however, the uncertainties associated with such a statement are not being addressed (i.e., the length of time for reintroduction, the diversity of the system, the indirect effects on other organisms, etc.). As USFWS (1994) put it: "It is ecologically indefensible and shortsighted to suggest recovery of a species without considering the other organisms that constitute the biological components of the ecosystem. Invertebrates are the glue that holds the aquatic ecosystem together." ECOFRAM does not consider the indirect effects of pesticide exposure on aquatic systems, only the direct effects. Simple models will not eliminate the uncertainties inherent in complex systems, and complex models must be validated before they can be useful to OPP. ECOFRAM must clearly identify all assumptions and uncertainties, and the confidence associated with any proposed population models.
Charge to ECOFRAM Workshop Panel Members
All reviewers are asked to address the following questions:
Is the draft report scientifically sound? If not, please explain and provide specific suggestions on how to improve the report to make it scientifically sound.
The document as a preliminary framework for developing a Probabilistic Risk Assessment appears to be scientifically sound. Portions of the document could be strengthened as previously noted.
Did the ECOFRAM workgroup address the "Charge to the Terrestrial and Aquatic Workgroups" identified in the background document, "Evaluating Ecological Risk: Developing FIFRA Probabilistic Tools and Processes"?
The charge to the workgroup was to develop a process and tools for predicting the magnitude and probabilities of adverse effects to non-target aquatic and terrestrial species from exposure to pesticides in the environment. Most of the methods addressed by the workgroup are standardized procedures. However, most of the higher tier tools (i.e., population models) are not standard and need validation before they can be applicable for OPP's use. Exposure models were discussed, but little discussion of model error was presented. A more comprehensive review of possible parameter error is also needed. It is not entirely clear whether the methods discussed will predict the magnitude and probabilities of adverse effects.
What are the limitations for predicting risk using the approach described in the draft report?
Current data requirements are inadequate to support a probabilistic risk assessment as presented by ECOFRAM. OPP defines risk by using acute and chronic data on 7 species of aquatic organisms relative to expected exposure to a particular pesticide. These toxicity points are expected to represent freshwater and marine/estuarine fish and invertebrates. However, these data may not provide a powerful enough analysis to estimate overall risk (SAP, 1996). There is a great need to better understand the functional relationship between the tools used to estimate effects, the toxicity and exposure estimates, and the actual effects under field conditions. These relationships should be the foundation for any model, deterministic or probabilistic. Therefore, if refinements are to be achieved in the current risk assessment process, these uncertainties need to be better defined. According to the SAP (1996), "in the absence of this research... the questions associated with present methodologies will persist even if more sophisticated methodologies are developed".
ECOFRAM is relying on exposure models in order to decrease the uncertainty that is inherent in the lower tier screening process. Refinement of the risk assessment is expected to be driven primarily through the use of "refined" exposure models. However, how reliable are these models? What are the levels of uncertainty inherent in the models? Is there a proposed program to validate these models? To discuss the precision of a prediction model, there should be a comprehensive evaluation of its structure and the parameters being utilized. Precision here means a standard of robustness in the model's predictions: the predictions are sufficiently accurate, and well-defined estimates of error for those predictions can be made. Evaluating various types of pesticides and use patterns through the systematic collection of field data is needed for a better understanding of the limitations of the proposed exposure models. Without this level of validation, the risk assessment process will be hampered and refinement will not be achieved.
Taking into account your answers to the three questions above, what areas of the report need to be strengthened?
In general, ECOFRAM presented only one method of evaluation. However, EPA needs more flexibility in conducting a comprehensive risk assessment and should have several methods to choose from relative to applicability, resource expenditure, and relative precision. Some decisions can be derived from direct interpretation of standard acute and chronic endpoints (standard guidelines are sufficient). Other decisions are evaluated subjectively, and the guidelines are merely a template constructed from the experience of scientists in this arena of work. However, consistency, documentation, planning, and objective biological common sense underlie the successful implementation of any method that evaluates effects to aquatic organisms. The success of a risk assessment does not start with choosing methods. On the contrary, the probability of failure increases as the investigator's thinking becomes method- rather than problem-oriented.
ECOFRAM expects the Agency to complete a Tier I and II risk assessment/characterization using a probabilistic distribution of acute toxicity values represented by 7 species. The possible effects on freshwater invertebrate populations would be estimated by evaluating population models on Daphnia, etc. This assumption is troubling (e.g., different toxicity slopes, lack of data). If the problem is evaluating risk to freshwater aquatic invertebrates, a selection process and planning regime must be implemented. In order to provide a broad ecological perspective, the following must be identified:
- Sensitive species;
- Important species in nutrient cycling or energy flow;
- Species from a group that uses common environmental resources (guild);
- Life history tables;
- Models that evaluate species interaction and dependency.
Relative to each problem formulation, the overall uncertainty surrounding particular parameters must be defined. For example: if we have a compound that is very toxic to 50% of the aquatic invertebrates 10% of the time, what are the expected uncertainties that should be addressed? What are the "keystone" species in this particular system? To what degree will this effect alter the population structure and dynamics of other species that are dependent upon the population at risk (i.e., condition indices for breeding fish, as well as survival and growth of young-of-the-year fingerlings)? How representative is the surrogate species that is being modeled? The slope of the toxicity curve can be very different from one aquatic group to another (i.e., invertebrates vs. fish; if regression is used to evaluate the toxicity distribution, a goodness-of-fit chi-square test must be performed). If we are concerned with reducing uncertainty through the probabilistic approach, better utilization of the data should be considered. This may include the following (Richard Lee, 1999, personal communication):
Use all available confidence intervals (CIs) for stochastic inference by transferring information from the CIs to a standard normal distribution for calculation of probabilities;
RQ screening could be expressed as a range (relative to CIs);
Statistical analysis of hypothesis testing (e.g., t-test, ANOVA) can be strengthened with power-of-test analysis (degree of confidence), alternative hypothesis testing, and normality and homoscedasticity testing. When running probit or regression analysis, consideration should be given to goodness of fit (chi-square test) and a positive regression response (t-test). The use of the moving average method is questionable if the confidence intervals are infinite.
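A minimal sketch of the first two suggestions above, using hypothetical LC50 and exposure values: the 95% CI on the LC50 is transferred to a normal distribution on the log scale, the risk quotient (RQ) is reported as a range rather than a point, and a probability is computed from the distribution.

```python
import math
from statistics import NormalDist

# Hypothetical acute values: LC50 with its 95% CI, and an estimated
# environmental concentration (EEC), all in ug/L.
lc50, lc50_lo, lc50_hi = 20.0, 12.0, 33.0
eec = 8.0

# RQ expressed as a range (point estimate plus CI-based bounds):
print(f"RQ point: {eec / lc50:.2f}; range: {eec / lc50_hi:.2f} to {eec / lc50_lo:.2f}")

# Transfer the CI to a normal distribution on the log10 scale
# (half-width of the CI / 1.96 approximates the standard deviation):
mu = math.log10(lc50)
sigma = (math.log10(lc50_hi) - math.log10(lc50_lo)) / (2 * 1.96)

# Probability that the true LC50 lies below the EEC:
p = NormalDist(mu, sigma).cdf(math.log10(eec))
print(f"P(LC50 < EEC) = {p:.4f}")
```

The RQ range carries the toxicity uncertainty into the screening step instead of discarding it with the point estimate.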
Organization of the document is a critical concern. Risk characterization is covered in a disjointed manner in Chapters 2, 3, and 4. Time-to-effects analysis should also be addressed; this may give a more complete picture of the temporal effects of exposure. However, testing protocols must be revised to accommodate this type of data collection (no discussion by ECOFRAM). Other suggestions include the following:
Effects to benthic species need to be better characterized through acute and chronic testing of fish (e.g. catfish, darters);
Acute and chronic sediment testing on representative invertebrates (e.g., Chironomus tentans, C. riparius, etc.);
Acute and chronic testing of major aquatic invertebrate guilds (e.g. shredders, deposit feeders, predators, etc.);
Acute and chronic testing of amphibians - probability that amphibians will be less sensitive than standard test organisms needs validation;
Dose/Response slopes are needed to characterize the hazard between chemicals or formulations;
How accurate are the current standard toxicity testing results? Should standard toxicity protocols be modified to better decrease the uncertainty surrounding a point estimate (e.g., three replicates vs. two replicates per dose would increase the power of the test)?
At what point in the risk assessment process is the uncertainty level high enough to support the consideration of risk mitigation? What is the minimum level of technical information and scientific understanding that is necessary to evaluate whether risk mitigation would be necessary and/or effective?
After Tier II, mitigation options can be proposed. However, the perceived level of risk will be the determining factor in deciding the degree of mitigation. ECOFRAM suggests the use of joint probability curves at this stage in deciding upon further actions. This can be a useful tool if there is a sufficient database and if a threshold level has been accurately defined. Refinements in the effects and exposure areas are only meaningful if they truly reduce uncertainty. The 10th-percentile argument can be an oversimplistic interpretation of the possible risk that does not account for "keystone" species, indirect effects on other species, etc.
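The joint-probability-curve idea can be sketched as follows, pairing a hypothetical log-normal exposure distribution with a hypothetical species sensitivity distribution; each point on the curve is the probability that exposure exceeds the concentration affecting a given fraction of species. All parameter values are illustrative assumptions.

```python
from statistics import NormalDist

exposure = NormalDist(mu=0.3, sigma=0.5)   # log10 EEC (ug/L), hypothetical
ssd = NormalDist(mu=1.3, sigma=0.4)        # log10 species sensitivity, hypothetical

curve = []
for frac in (0.05, 0.10, 0.25, 0.50):
    tox_pctl = ssd.inv_cdf(frac)           # log10 conc. affecting `frac` of species
    p_exceed = 1 - exposure.cdf(tox_pctl)  # P(exposure exceeds that concentration)
    curve.append((frac, p_exceed))
    print(f"{frac:.0%} of species affected: exceedance probability = {p_exceed:.3f}")
```

The curve is only as informative as the two distributions behind it, which is why a sufficient database and a defensible threshold matter.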
OPP must define a level of uncertainty that it is willing to accept. As a biologist, I think that ecological diversity and environmental integrity are significant endpoints. If species are inadvertently impacted, what is the probability that the stressed population(s) will be sustained? This is a complex question that needs a significant expenditure of resources and round-robin testing before an accurate picture is developed. Until this happens, the Agency should complete its risk assessments with appropriate safety factors.
p2-2 - 2-5 : A rationale for Probabilistic Assessment is not presented. There should be reference to the importance of adequately characterizing, quantitatively, the variability and uncertainty in fate, transport, exposure, and effects for ecological risk assessment. A secondary goal is to identify key sources of variability and uncertainty and to quantify the relative contribution of these sources to the overall variance and range of model results.
p2-5; line 22-23 : "...beginning with conservative assumptions and moving toward more realistic estimates." This statement can be enhanced by stating that a tiered regulatory process should be a mechanism for asking whether a quantitative analysis of uncertainty and variability will improve the risk assessment. Have the weaknesses and strengths of the definitive method been evaluated? Will a quantitative estimate of uncertainty improve the decision?
p2-6 - 2-7 : Tier I and Tier II screening and temporal/spatial risk assessment can be combined as a first assessment (EFED is currently doing this).
p2-9; line 6 : "The selection of Tier 3 options is based on expert judgement..." A statement should be included so that this does not appear arbitrary: expert judgement based on physical chemistry parameters, toxicity information, pesticide mode of action, pesticide use pattern, and an understanding of the key sources of variability and uncertainty.
p2-12; line 14-18 : Risk characterization and assessment must be conducted by OPP scientists (EPA is the regulatory agency) and not the registrant. The registrant may rebut the Agency's conclusions if it has adequate information, but the risk evaluation process is a task authorized through FIFRA for EPA to conduct.
p2-13 : table 2-1 : "additional data are needed to reduce uncertainty..." Registrant's option: if the registrant believes additional work is worthwhile, further risk assessment at a higher tier will be undertaken. Include the statement, "Product use abandoned or discontinued; registration process stops".
p2-15 : Throughout this document it is assumed that progressive tiering serves to decrease the mean estimated exposure. However, no validation of this assumption has been presented. I know of examples where field data corroborated the expected values generated from PRZM/EXAMS (phosmet). In addition, higher refinement (i.e., field residues) has also shown a pesticide to be a greater problem than originally noted (i.e., TBT). This chapter should show a more realistic balance.
p2-16 : While it is difficult to include all of the variables and their possible interactions which may be affected by the distribution of pesticide dosage, a general overview of the major uncertainty concerns should be included here.
p2-16; line 9-14 : After the initial risk assessment, ECOFRAM suggests that further refinement can be achieved exclusively through exposure modeling ("further refinement of effects data is not needed"). I disagree; refinement must occur on effects as well as exposure in order to develop a more accurate risk characterization. Simply driving down exposure values does not eliminate the uncertainty surrounding the possible effects on a variety of "keystone" species in the environment. The rough estimates from Tier I and II
p2-17 : EFED currently evaluates sediment/pore water exposure through PRZM/EXAMS. This is part of the deterministic point estimate.
p2-17 : Throughout the document there is mention that the Tier I assessment reflects conservative exposure. This statement is misleading and not necessarily correct. Tier I evaluation uses GENEEC and, as further refinement, PRZM-EXAMS. I have noted that the output from PRZM3-EXAMS can produce values that are comparable to field values (i.e., phosmet). Better wording: change "conservative exposure" to "computer generated".
p2-18 : Of particular importance is the need to be able to define the relative "severity" of the use scenario in terms of a probability of aquatic exposure. This is incomplete. "Scenario severity" relative to some ecological threshold is more appropriate.
p2-18; line 10-12 : There are several examples where "real world" residue values differ from EFED's computer-generated values by at least an order of magnitude.
p2-19 : freshwater species are not considered surrogates for estuarine species.
p4-5 : "Extrapolation from the standard test organisms to a wider range of species..." Apply a safety factor to the lowest LC50; or use empirical extrapolation coefficients from taxonomically related species; or measure toxicity to a selected variety of species, assume that the sensitivity of species follows some probability distribution, and estimate the distribution of sensitivity within a community of species.
*p4-5 : Water Quality Criteria development. Distribution analysis is a tool which could be used to determine a safe exposure level for a community based on data for a subset of species. This may be appropriate if you have a large database. This is not the case with new chemicals (7 data points at best).
p4-5 : Mesocosm studies are useful if variability is addressed (i.e., the power needs to be evaluated; is there adequate replication?).
p4-5 : Exposure as a peak or pulse - Yes, this is occurring, but compounds can and do tend to concentrate in the sediment, especially in backwater areas where many fish reside.
Time-varying or repeat exposure : pharmacokinetic models with dose-response... microcosm, mesocosm... Yes, this is useful, but what are the references?
*p4-8 : Functional redundancy. This is a dangerous statement, and generalizations should not be made. Many times the species that fill the available niche are less desirable as a functional component of that ecosystem (e.g., generalists), resulting in a decrease in diversity.
p4-8 : Risk management strategy to protect 90-95% of the population. We do not have empirical data that explain what will really occur when a certain % of a population is reduced. The majority of pesticides are lipophilic and persistent (e.g., organophosphates, synthetic pyrethroids). The document states that the effects from non-persistent pesticides are temporary. Again, we are generalizing. Does ECOFRAM have documentation to back up this statement? What about the wide range of pesticides that are persistent?
p4-8 : SWACOM program to address the question of temporary effects of pesticides on aquatic invertebrate populations. Need model validation etc.
p4-8 : "most aquatic invertebrates have short generation times and rapid rates of population growth, and are able to recover rapidly from population reduction." I have a big problem here. I don't think that the supporting assumptions are completely correct. ECOFRAM concludes that the functional role of the missing group could be transferred to other organisms that fill the existing niche. It is desirable to save 90% of the population, but have we protected the ecosystem? Perhaps we should evaluate the probability distribution about the LC5 and not the LC50.
The other statements from lines 20-28 are not always correct. A probabilistic assessment of each of these assertions should be developed before the statements can be used as a rationale for population effects.
Sedentary species escape exposure because of their nonuniform distribution - what is the likelihood of population reintroduction? (This is not a static system; other populations can be competing for the same niche.)
*Resistant stages in zooplankton and invertebrates? What is that probabilistic distribution? Aquatic insect populations renew at least annually? If the toxic exposure occurs at a time when a particular fish life stage is vulnerable and dependent upon this trophic level, there may be indirect effects (e.g., retarded growth).
*Define the natural fluctuation in abundance of aquatic invertebrate populations. What is the ecological significance of pesticide effects? Is this temporal? We must consider regional effects that take into account multiple applications and pulse dosing, as well as the use of several pesticides on a particular crop. Model validation is needed (SWACOM).
p4-9; line 22 : ? Explain.
p4-9 : There hasn't been that much emphasis placed on biomarkers. EPA doesn't use them to predict toxicity.
p4-66 - 4-69 : Mode of action information is paramount to identifying groups that are most sensitive vs. those that may be adaptable. Sensitivity distributions must be calculated separately. However, even though certain groups (i.e., invertebrates) may be more sensitive than another group (i.e., fish), we should not focus only on the group that shows acute effects to the exclusion of the higher tier group that may be indirectly affected through trophic level alterations. Refinement must address the interactive effects on populations that are at possible risk. Currently, I am concerned that the proposed probabilistic refinement, using the same limited data as that used in the Tier I risk characterization, does not refine our understanding of potential risk. In order to construct sensitivity distributions, several species within each respective group must be tested.
p4-71 : In grouping for sensitivity, a distinction between midwater and benthic organisms must be made, since sediments can serve as reservoirs for many organic compounds.
p4-80 : "Shorter exposure produces less effect... a 2-day exposure at 2 µg/L may cause the same effects as a 1-day exposure at 4 µg/L." This assumption is not correct because it does not take toxicokinetics into consideration: the description of the time course of disposition (absorption, distribution, biotransformation, and excretion) of a test substance in an organism. The toxic response produced by chemicals is critically influenced by the rates of absorption, distribution, biotransformation, and elimination.
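A minimal one-compartment uptake/elimination sketch illustrates the point: equal concentration-time products need not produce equal body burdens. The rate constants below are hypothetical, chosen only for illustration.

```python
# One-compartment toxicokinetic model with hypothetical rate constants,
# integrated with a simple Euler scheme.
ku = 10.0   # uptake rate constant (L/kg/day), assumed
ke = 2.0    # elimination rate constant (1/day), assumed
dt = 0.001  # Euler time step (day)

def peak_burden(conc, days):
    """Body burden at the end of a constant-concentration exposure."""
    body = 0.0
    for _ in range(round(days / dt)):
        body += (ku * conc - ke * body) * dt   # uptake minus elimination
    return body

b_long = peak_burden(2.0, 2.0)    # 2 days at 2 ug/L
b_short = peak_burden(4.0, 1.0)   # 1 day at 4 ug/L
print(f"2 d @ 2 ug/L -> {b_long:.1f} ug/kg; 1 d @ 4 ug/L -> {b_short:.1f} ug/kg")
```

Under these assumed rates, the shorter exposure at the higher concentration produces the larger peak body burden even though concentration × time is identical, which is exactly why the report's assumption fails without toxicokinetic justification.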