March 1998 SAP Meeting Final Report
A Set of Scientific Issues Being Considered by the Agency in Connection with Suggested Probabilistic Risk Assessment Methodology for Evaluating Pesticides That Exhibit a Common Mechanism of Action.
The Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) Scientific Advisory Panel (SAP) has completed its review of the set of scientific issues being considered by the Agency in connection with Suggested Probabilistic Risk Assessment Methodology for Evaluating Pesticides That Exhibit a Common Mechanism of Action. The review was conducted in an open meeting held in Arlington, Virginia on March 24, 1998. The meeting was chaired by Dr. Ernest E. McConnell (ToxPath, Inc.). Other Panel Members present were: Dr. Julian Andelman (University of Pittsburgh), Dr. Charles Capen (The Ohio State University), Dr. Janice Chambers (Mississippi State University), Dr. Amira Eldefrawi (University of Maryland), Dr. Dale Hattis (Clark University), Dr. Ernest Hodgson (North Carolina State University), Dr. Bruce Hope (Oregon Department of Environmental Quality), Dr. Ron Kendall (Texas Tech University), Dr. Charles Menzie (Menzie-Cura and Associates), Dr. Robert Moore (University of Wisconsin), Dr. Herb Needleman (University of Pittsburgh) [recused], Dr. B.K. Nelson (NIOSH), Dr. Chris Portier (NIEHS), Dr. J. Routt Reigart (Medical University of South Carolina), Dr. Howard Rockett (University of Pittsburgh), Dr. Lawrence Sirinek (Ohio Environmental Protection Agency), Dr. Mary Anna Thrall (Colorado State University), and Dr. John Wargo (Yale University).
Public Notice of the meeting was published in the Federal Register on February 11, 1998.
Suggested Probabilistic Risk Assessment Methodology for Evaluating Pesticides That Exhibit a Common Mechanism of Action
Oral statements were received from the following:
Dr. Charles Benbrook, Consumers Union.
Dr. George Oliver and Dr. Joel Mattson, Dow Agrosciences.
Dr. Barbara Peterson and Dr. Charles Breckenridge, The Alliance for Reasonable Regulation of Insecticides.
Dr. Rudy Richardson, University of Michigan.
Dr. David Wallinga, Natural Resources Defense Council.
Written statements were received from the following:
American Crop Protection Association.
The Alliance for Reasonable Regulation of Insecticides.
The Office of Pesticide Programs (OPP) has requested the FIFRA SAP to review the Environmental Working Group (EWG) paper Suggested Probabilistic Risk Assessment Methodology for Evaluating Pesticides with a Common Mechanism of Toxicity: Organophosphate Case Study. OPP is seeking comment and advice from the Panel on what aspects of data use and process may be appropriate for inclusion in its continuing risk assessment methodology development process. Questions are provided below to focus the Panel's attention on those areas of particular interest to OPP.
General Comments from SAP Members
The Panel's consensus identified both strengths and limitations in the probabilistic techniques presented; thus, the Panel encourages continued development, testing, and use of the probabilistic methodology. In addition, several overall points were raised during the Agency presentation, public comment, Panel general discussion, and responses to Agency questions, which the Panel wished to add as issues concerning this session.
- Risk assessors generally replace non-detects with a value of ½ the limit of quantification (LOQ) to: (a) account for the possibility that some contamination is nonetheless present despite a non-detect; and (b) provide a better estimate of the mean and standard deviation of a censored data set. Replacing non-detects with a value equal to ½ the LOQ provides a better estimate of the mean and standard deviation of normally distributed data than does setting a non-detect = 0, up to approximately 50% censoring.
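As a minimal illustration of this substitution rule, the sketch below (using entirely hypothetical residue values) shows how ½-LOQ substitution shifts the estimated mean of a censored data set upward relative to setting non-detects to zero:

```python
import numpy as np

# Hypothetical censored residue data (ppm); None marks a non-detect.
LOQ = 0.05
measured = [0.12, None, 0.30, None, 0.08, None, 0.21, 0.15]

def summarize(values, substitute):
    """Replace non-detects with `substitute`; return (mean, sample std)."""
    filled = np.array([v if v is not None else substitute for v in values])
    return filled.mean(), filled.std(ddof=1)

mean_half, sd_half = summarize(measured, LOQ / 2)  # 1/2-LOQ substitution
mean_zero, sd_zero = summarize(measured, 0.0)      # non-detect = 0

print(f"1/2-LOQ substitution: mean={mean_half:.4f}, sd={sd_half:.4f}")
print(f"zero substitution:    mean={mean_zero:.4f}, sd={sd_zero:.4f}")
```

For normally distributed data with moderate censoring, the ½-LOQ estimate tracks the true mean more closely than the zero-substitution estimate, which is biased low.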
- The assumptions used in the background document may not be as conservative as necessary. A balanced discussion of the ways this method and model might over- and/or underestimate risk is not provided. The Panel recommends that the document reflect the possibility of both over- and underestimates.
- Population at-risk estimates should be refined by identifying and constructing models for a variety of appropriate subpopulations. It is possible the residue levels would be the same as those obtained with a general model for all subpopulations but the transparency and robustness of the analysis would be much improved.
- The dominance of point values (as opposed to ranges of values or "+/-" notations) in the document almost suggests that a deterministic, rather than a probabilistic, risk assessment was performed. Data displayed in this manner convey a sense of certainty about the results that is inappropriate for a probabilistic risk assessment.
- The probabilistic model presented, like other models, should be as transparent as possible, including its coding and operation. This position coincides with the Agency's Monte Carlo guidance on model transparency and documentation.
Agency Questions Posed to the Scientific Advisory Panel
- Does the Panel agree that high-end consumption over a short term (e.g., monitored by USDA's one-day dietary recalls) adequately represents long-term consumption? If not (and high-end consumption over the short term is unlikely to be sustained over the long term), does the Panel believe that coupling a chronic toxicity value with a short-term consumption figure is inappropriate and leads to an overestimation of resulting (short-term) risks?
The Panel differed on whether high-end consumption over a short term adequately represents long-term consumption. Several Panel Members concluded that it is inappropriate to assume long-term stability (i.e., high within-person correlation) in long-term consumption of different foods. How much within-person correlation there is in the consumption of specific types of foods is an important empirical question that should be answered with specifically collected data. There is good reason to expect some degree of both short- and long-term correlation in individuals' food consumption habits, and in the specific residue levels that may be encountered in particular batches of produce that are repeatedly sampled. At least some preliminary information on short-term within-person correlations in consumption habits could be gleaned, not by treating the individual days' observations in the database as presented, but by calculating rank correlations or absolute consumption correlations for the same individual observed on adjacent days, or observed for three days running, if possible. Given rank correlations for consumption on adjacent days, these could be built directly into the Monte Carlo simulation models to see how the high tails of the distributions were reduced for averaging periods deemed relevant for different kinds of toxicological effects dependent on cholinesterase inhibition. Those averaging periods are nowhere near as long as a lifetime, but may be a few days to a few weeks, depending on the dynamics of reversal of cholinesterase inhibition at specific sites in the nervous system and for irreversible (e.g., organophosphate) and reversible (e.g., carbamate) inhibitors. A stronger analysis therefore depends on integrating this kind of biological information (to define the appropriate averaging time) with better statistical information on the stability of consumption and residue-level rankings over the relevant period.
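The adjacent-day rank-correlation calculation described above can be sketched as follows; the two-day consumption records for ten individuals are hypothetical values chosen purely for illustration:

```python
import numpy as np

# Hypothetical consumption records (g/day) of one food item for the
# same ten individuals observed on two adjacent survey days.
day1 = np.array([120, 5, 45, 200, 30, 80, 0, 150, 60, 95])
day2 = np.array([100, 10, 50, 180, 0, 90, 5, 160, 40, 110])

def rankdata(x):
    """Ranks 1..n (data assumed tie-free for this illustration)."""
    order = np.argsort(x, kind="stable")
    ranks = np.empty(len(x))
    ranks[order] = np.arange(1, len(x) + 1)
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    return np.corrcoef(rankdata(x), rankdata(y))[0, 1]

rho = spearman(day1, day2)
print(f"adjacent-day within-person rank correlation: {rho:.3f}")
```

A rank correlation estimated this way could then be imposed on the sampled person-days in a Monte Carlo simulation (e.g., via rank-order induction) to examine how multi-day averaging thins the high tail of the exposure distribution.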
Exposure profiles should be developed that capture more realistic longer-term estimates of exposure in other aspects. This can be accomplished by aggregating the available data in various ways including working with the two to four-day dietary studies. Use of microexposure modeling approaches would also be helpful for developing distributions of exposure in populations.
High end consumption patterns, combined with high end residue concentrations, are likely to cause the greatest hazard associated with specific pesticide uses. Estimating the frequency of these high end combinations, for the toxicologically appropriate averaging times (single-meal in some cases; several days to a few weeks in others), is critical for quantitative risk assessment purposes.
However, other Panel Members commented that the coupling of a chronic toxicity factor with short-term intake estimates does not pose a conceptual problem. A chronic pattern of childhood exposure to organophosphate residues appears to occur, and person-day intake distributions are suitable to estimate the distribution of chronic exposure. The short-term intake estimates are limited for all of the reasons raised above; however, no other data will provide more reliable estimates than the most current intake data sets (despite their failure to capture longitudinal variability). In addition, the Agency should not become too concerned about the short-term intake surveys, or "high end" consumption patterns. Instead, more important concerns include the representativeness of the survey, its suitability for estimating pesticide exposure, the sample size of susceptible groups, and the Agency's expert judgment that conditions in the future may be well-predicted by the behavior surveyed.
The Panel recommends that the Agency aggregate food intake across single-day eating occasions for individuals. These data may then become the foundation for both acute and chronic exposure assessments. Person-day intake values should be expressed as distributions within age groups, preferably in 3- or 4-month intervals between birth and age one, and then in annual groups between the ages of 1 and 18.
Existing food intake data sets do not capture longitudinal variations of intake within individuals, or among individuals across more than 3-6 days. Thus, the suitability of existing data sets for estimating long-term patterns of intake and exposure is quite limited. Variation in intake over time may result from changing tastes, new marketing techniques, new food processing technologies, packaging innovations and other variables - all difficult to forecast. Thus, the Agency is left with the need to rely on the best available evidence to predict future patterns of exposure, even if it fails to capture variability across time. The Agency should also carefully examine statistical techniques to better understand the relative contribution of "within-person" versus "between-person" variability to total variance. These approaches have been well-developed to examine the suitability of sampling designs for estimating nutrient intake. Certainly, the Agency should collaborate closely with USDA and FDA to produce the best current estimates of food intake for age groupings believed to be at special risk (e.g., children within the age classes described above). It is especially important to capture intake variance among children. However, this will require substantially larger sample sizes than are currently available so that regional, seasonal, and demographic variability may be better understood.
The Agency's goal should be to capture the variability of food intake and residue levels with the highest possible accuracy, and to combine these data to produce exposure distributions. Data sets and models should be structured to produce person-day exposure distributions that may be used for both acute and chronic estimates. However, the Panel realizes that person-day exposure distributions are not the only method of summarizing and analyzing data.
For chronic estimates, the Agency may wish to employ summary statistics to aggregate exposures across foods and chemicals, recognizing reasonable outer-bound daily caloric intake limits. The Agency's interpretation of chronic estimates should be tempered by knowledge of the failure of the intake estimates to capture longitudinal variability. This uncertainty provides a logic for assuming wider future variability than current data sets suggest, and may support the retention of the FQPA 10x safety factor.
The EWG use of reference doses (RfDs) caused the uncertainty factors associated with the individual chemicals to play an important role in determining individual children at risk, defined as those believed to exceed the RfD. Thus, as the uncertainty factor increased for any single chemical, its importance in contributing to risk of the mixture increased. The Panel suggests that the Agency consider alternative methods to estimate the probability of adverse health effects from exposure to mixtures of chemicals, such as organophosphates, that are believed to act via a similar mechanism of action. The Panel believes that it is important to further develop methods to account for the potential effects from exposure to mixtures that act via a common mechanism, and to account for uncertainty in the relative potencies of individual chemicals in a manner that does not bias the exposure estimate. This might be accomplished by conducting sensitivity analyses of those "at risk" (i.e. those exceeding the RfD). Rather than embedding these uncertainty factors in the exposure estimate, greater variability in uncertainty might be assumed in a probabilistic manner following the production of the exposure distribution.
The EWG use of residue data reflecting conditions where multiple chemical residues appeared on single food samples is innovative in that it accounts for individual exposure to mixtures at the "person-day" unit of analysis. This is not an endorsement of the method used by EWG to adjust for toxic equivalence; however, it demonstrates that the Agency should take care to identify the potential for individual servings that result in a child's exposure to a mixture of chemicals that may act via a similar mechanism.
- The Panel is asked to respond to the following questions concerning FDA and PDP monitoring data.
a) Recognizing that FDA and the USDA Pesticide Data Program (PDP) sample from two different populations (FDA samples crops from geographic locations and times in which agricultural commodities are more likely to have been treated, while PDP samples randomly throughout the year in proportion to distribution volume), is this correction process scientifically defensible and reasonable for regulatory purposes, or is attributing the entire difference in residue levels to at-home processing believed to be incorrect?
The majority of the Panel concluded that it is not possible to determine the validity of the reduction factors employed by EWG given the documentation provided; these reflect EWG's judgment. However, a Panel Member commented that the correction process would tend to overestimate the degree of reduction in residue levels attributable to home processing. This expectation, based on the analysis presented, indicates some potential for a significant problem and suggests that this procedure generates underestimates of the likely incidence of residue levels of concern (holding other aspects of the analysis constant).
The Agency should provide guidance regarding when and how residues should best be employed to reduce exposure estimates. Default assumptions and methods will be different for chronic exposure estimates than for acute exposure estimates. The guiding principle for acute exposure estimates should be to capture the potential for maximal exposures with accuracy, and to understand the factors that contribute to these conditions. If the Agency has credible data that it believes accurately capture the reduction or concentration effects of processing, washing, peeling, cooking etc., then it may choose to employ these data to reduce or heighten exposure estimates on a case by case basis. Since these data often may not be available, Monte Carlo techniques may be used to incorporate uncertainty in residue levels within exposure estimates.
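Where case-specific processing data are unavailable, the Monte Carlo treatment of residue reduction mentioned above can be sketched as follows; the lognormal field-residue distribution and the range of the processing factor are hypothetical assumptions chosen only to illustrate the mechanics:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # number of Monte Carlo draws

# Hypothetical inputs: a lognormal field residue distribution (ppm)
# and an uncertain processing factor (washing/peeling/cooking losses)
# drawn uniformly between 0.2 and 1.0.
field_residue = rng.lognormal(mean=np.log(0.1), sigma=0.8, size=n)
processing_factor = rng.uniform(0.2, 1.0, size=n)

# Propagate the uncertain reduction into the as-consumed residue.
as_consumed = field_residue * processing_factor

print(f"field residue, 95th pct: {np.percentile(field_residue, 95):.3f} ppm")
print(f"as-consumed, 95th pct:   {np.percentile(as_consumed, 95):.3f} ppm")
```

A sensitivity analysis, as the Agency's Monte Carlo guidance calls for, would then vary the processing-factor distribution to quantify how strongly it drives the upper tail of the exposure estimate.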
b) If this correction process is defensible and reasonable, should this process be routinely applied to adjust FDA or other monitoring data to an "as-consumed" basis?
As noted previously, the majority of the Panel could not determine whether the correction process is defensible and reasonable. A detailed response is provided in part A of this question.
c) Should these residue reduction factors be extended to be routinely applied to data from experimental field trials in which maximum application rates are applied and minimum pre-harvest intervals are observed or should the Agency continue to require that specific data be submitted which supports the assumed reductions?
In the absence of credible evidence, the Agency should assume that pesticides have been applied at the maximum allowable application rate, and should account for infrequent application rates higher than label restrictions allow and infrequent pre-harvest intervals shorter than required. Additional thought should be given to providing a reasonable definition of the term "infrequent". These conservative "default" assumptions should account for human error and judgment, which are not uncommon given the pesticide poisoning data collected and interpreted by the Agency. These assumptions should be set aside only in the face of credible, recent, and replicated studies.
A risk assessment should, to the extent practicable, consider "real-world" factors that either reduce or increase exposure to contaminants so as to produce the most reasonable and representative risk assessment. The appropriate use of reduction factors is consistent with this goal. The Agency should continue to calculate and utilize reduction factors as outlined in the background document, unless user-supplied or other data indicate that higher or lower reduction factors should be applied. If such information is available, variables relating to where such reduction takes place (during processing, at home, etc.) should be incorporated into the model. The extent to which these reduction factors influence the results of the probabilistic risk assessment should be quantitatively evaluated via sensitivity analysis, a general condition of the Agency's Monte Carlo guidance.
- For cholinesterase inhibition, is lack of knowledge of the variability within single units making up a composite sample likely to significantly impact the results and interpretation of the risk assessment? Would this be true for other types of endpoints?
Pesticide residues in foods generally follow distributions with heavy tails (like the lognormal distribution). Thus, blending portions of single food samples will normally yield a residue concentration less than that of the most contaminated piece. The only conditions under which this would be unlikely to occur would be if residues were uniformly distributed among the units sampled. Thus "compositing" will often result in an underestimation of residue levels on individual servings, unless care is taken to track the source of contamination following any detection in the composited sample. It is possible that compositing and blending have the effect of reducing residue levels beneath the limit of detection. Some commodities, such as fruit juice concentrates, are normally combinations of foods from different sources. For these foods, the assumption of mixing, simulated by compositing, may be appropriate.
The problem posed by using composite samples to estimate exposure may be generic, i.e., not endpoint specific. In other words, dilution of residues will result in an underestimation of exposure, and therefore risk, from single servings of "simple" foods (i.e., foods that are not blended from other foods). This is especially the case if the act of compositing pushes the concentration of residue beneath the limit of detection. This could have a significant effect on a risk estimate under a number of conditions (e.g., if a chemical has a high potency, or if the food is highly consumed). Acute endpoints, under the assumption that consumption of a single "hot" serving could induce a severe health response, would be missed using composite samples.
The variability in single meals (and with persistent resampling of the same batch of produce by consumers), is indeed likely to be understated by observations of the variability of residue levels among larger composites. This will particularly affect estimates of acute toxic risks, as the larger variability in single-meal residue levels will mean a larger proportion of cases where acute toxic thresholds for individuals are exceeded. There may well be a similar problem in interpreting information for teratogenesis assessments, as often it happens that developmental effects depend on exposures that occur within a very narrow window (i.e. hours or a few days) when specific differentiation, cell signaling or migration events are occurring.
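The dilution effect described above can be illustrated with a small simulation; the lognormal residue parameters and the composite size of ten units are hypothetical assumptions, not values drawn from FDA or PDP protocols:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical heavy-tailed (lognormal) residues (ppm) on the
# individual units that make up each composite sample.
units_per_composite = 10
n_composites = 50_000
unit_residues = rng.lognormal(mean=np.log(0.05), sigma=1.0,
                              size=(n_composites, units_per_composite))

composite = unit_residues.mean(axis=1)   # what monitoring would report
worst_unit = unit_residues.max(axis=1)   # what a single serving can carry

print(f"composite residue, 99th pct:  {np.percentile(composite, 99):.3f} ppm")
print(f"worst-unit residue, 99th pct: {np.percentile(worst_unit, 99):.3f} ppm")
```

Because the composite averages over units, its upper percentiles sit well below those of the most contaminated unit, which is the quantity relevant to single-serving acute risk.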
The Panel provided similar comments on composite samples for protecting against single-day exposures, as noted in their response to the session Policy for Review of Monte Carlo Analyses for Dietary and Residential Exposure Scenarios (as discussed at a session of this meeting of the SAP). If the Agency is protecting against single-day exposures, then it would be inappropriate to utilize composite samples for evaluating acute risks. The appropriate dose-response effect to be used in such cases is that related to one-day exposures. Alternatively, if toxicological effects are known to be manifested within a day and residence in the body is less than 24 hours, then acute exposure should be evaluated against acute dose-response effects. However, if the dose-response relationship being used is based on an extended exposure (many days or months), then the one-day exposure event may not be the relevant exposure scenario (although it is more likely to show risk and therefore be more "protective"). In such cases, a short-term (a few days or weeks) average exposure may be more relevant. If exposure on the order of days to weeks is more relevant for estimating exposure than are single-day exposures, then composite data (including monitoring data) are appropriate for use.
FOR THE CHAIRPERSON:
Certified as an accurate report of findings:
Paul I. Lewis
Designated Federal Official
FIFRA/Scientific Advisory Panel