1
High Throughput Testing-The NRC Vision, The
Challenge of Elucidating Real Changes in
Biological Systems, and the Reality of Low
Throughput Environmental Health Decision-Making
  • Dale Hattis
  • George Perkins Marsh Institute
  • Clark University

2
Outline
  • The Older NRC Vision Based on High-Throughput
    Testing and Safety-Type Risk Analysis
  • One Goal of This Talk is to Illuminate
    Alternatives
  • An Alternative Vision for Toxicology Based on
    Quantitative Modeling of Homeostatic Biological
    Systems, and Problems in Their Early Life Setup
    and Late Life Breakdown
  • Interactions of Toxicant Actions and Normal
    Background Chronic Pathological Processes
  • Variability/Uncertainty Analysis Transforming
    Current Risk Analyses Used for Standard Setting

3
The Older NRC Vision Based on High Throughput
Testing Results
  • Decades-long period for future development.
  • Ensemble of high-throughput assays to represent a
    large number (100) of toxicity pathways.
  • Well adapted to rapid screening of new chemicals
    and old chemicals with no prior testing.
  • Supports decisions on what concentrations of
    agents will sufficiently perturb specific
    pathways to be of concern.
  • Relate concentrations causing those perturbations
    to in vivo concentrations using
    physiologically-based pharmacokinetic modeling.
  • No quantitative assessment of health risks or
    benefits of exposure reductions for existing
    agents.

4
Traditional Toxicological Model Leading To
General Expectations of Thresholds in Dose
Response Relationships for Toxicants
  • Biological systems have layers on layers of
    homeostatic processes (think of a thermostat)
  • Any perturbation automatically gives rise to
    offsetting processes that, up to a point, keep
    the system functioning without long term damage.
  • After any perturbation that does not lead to
    serious effects, the system returns completely to
    the status quo before the perturbation.
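The thermostat picture above can be made concrete with a minimal numerical sketch (the capacity value is invented for illustration): a homeostatic layer offsets any perturbation up to its reserve capacity, so a lasting effect appears only above that threshold.

```python
def net_effect(perturbation, capacity=10.0):
    """Residual long-term effect after homeostatic compensation.
    The system offsets any perturbation up to its reserve capacity
    (the 'thermostat'); only the uncompensated excess causes damage.
    The capacity value is a hypothetical placeholder."""
    compensation = min(perturbation, capacity)
    return perturbation - compensation

# below capacity the system returns fully to the status quo ante
assert net_effect(5.0) == 0.0
# above capacity, damage grows with the uncompensated excess
assert net_effect(15.0) == 5.0
```

Layered homeostatic processes would simply chain several such compensating stages, each with its own capacity.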

5
Caveats--There might not be no-adverse-effect
thresholds for
  • Tasks that challenge the maximum capacities of
    the organism to perform (e.g., the 100-yard dash;
    perhaps learning to read)
  • Circumstances where some pathological process has
    already used up all the reserve capacity of the
    organism to respond to an additional challenge
    without additional damage (e.g. infarction causes
    heart muscle cell death that may be marginally
    worsened by incremental exposure to carbon
    monoxide)

6
Other Caveats
  • The ground state of the system is not a stable
    equilibrium, but a series of cyclic changes on
    different time scales (e.g. cell cycle, diurnal,
    monthly).
  • Sometimes continuous vs pulsatile patterns of
    change carry important signaling information in
    biological systems (e.g. growth hormone signaling
    for the sex-dependent pattern of P450 expression)
  • Therefore there must be resonance systems that
    respond to cyclic fluctuations of the right
    period.
  • (Think of the timing needed to push a child on a
    swing)
  • Therefore the effective dose may need to be
    modified by the periodicity to model dose
    response relationships.
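The swing analogy is the classic driven damped oscillator; a small sketch (natural frequency, damping, and force values are illustrative) shows why the response, and hence the effective dose of a pulsatile input, depends strongly on the driving period:

```python
import math

def steady_state_amplitude(drive_freq, natural_freq=1.0, damping=0.1, force=1.0):
    """Steady-state amplitude of a damped oscillator driven at `drive_freq`:
    F / sqrt((w0^2 - w^2)^2 + (2*zeta*w0*w)^2). The response peaks near
    the natural frequency, like well-timed pushes on a swing.
    All parameter values here are illustrative."""
    w0, w = natural_freq, drive_freq
    return force / math.sqrt((w0**2 - w**2)**2 + (2 * damping * w0 * w)**2)

# driving at the natural frequency produces a much larger response
# than driving well off-resonance with the same force
assert steady_state_amplitude(1.0) > 10 * steady_state_amplitude(3.0)
```

The biological point is the same: a pulsatile signal of the "right" period can produce effects that the identical time-averaged continuous dose would not.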

7
Potential Paradigm Change for Applications to
Therapeutics
  • Historical Paradigm--The Magic Bullet
  • Find a molecule that will kill the nasty bacteria
  • Find a spot in the brain that, if electrically
    stimulated, will control Parkinson's disease
    symptoms
  • New Paradigm--Understand and exploit natural
    resonances to enhance or damp system oscillations
  • Potential for pulsatile systems for drug release
  • Potential for sensor-based systems for drug
    release (e.g., smart pumps that release insulin
    in response to real time measurements of blood
    glucose)
  • Potential for timed or sensor-based electrical
    stimulation of target tissues (e.g., heart
    pacemakers)

8
Key Idea for Transforming Risk Assessment for
Traditional Toxic Effects
  • Quantitatively characterize each uncertainty
    (including those currently represented by
    uncertainty factors) by reducing it to an
    observable variability among putatively analogous
    cases.
  • Human interindividual variability--kinetic and
    dynamic
  • Variation in sensitivity between humans and test
    species
  • Adjustment for short- vs. longer periods of
    dosing and observation
  • Adjustment for database deficiencies (e.g.
    missing repro/developmental studies)
  • This general approach is not without
    difficulty--we need rules for making the analogies
    (defining the reference groups to derive
    uncertainty distributions for particular cases).
  • However it does provide a way forward for health
    scientists to learn to reason quantitatively from
    available evidence relevant to specific
    uncertainties.

9
Examples of Data Bases Assembled/Analyzed
10
Additional Data Bases Analyzed And/Or Assembled

Type of Projection | Parameters | Original Authors
Interspecies Sensitivity--Multi-Dose and Carcinogenesis | Human Maximum Tolerated Dose and Putative Animal Equivalents for 61 Anti-Cancer Agents | Price et al., 2001; Hattis et al., 2002
Pharmacokinetics in Pregnancy--Parameters Derived from PBPK Model Fits or Direct Observations | Fetal Growth, Placental/Fetal Transfer, Maternal Tissue Growth, Partition Coefficients | Hattis, 2004 Report to EPA
Adult/Early Life Stage Carcinogenic Animal Bioassay Sensitivity for 9 Mutagenic Agents + Ionizing Radiation | Cancer Transformations per ppm, per dose/(body weight)^0.75, or per rem ionizing radiation | EPA (2005) cancer guidelines; Hattis et al., 2004, 2005
11
The Straw Man Quantitative Probabilistic
Framework for Traditional Individual Threshold
Modes of Action
  • It is ultimately hopeless to try to fairly and
    accurately represent the compounding effects of
    multiple sources of uncertainty as a series of
    point estimate uncertainty factors.
  • Distributional treatments are possible in part by
    creating reference data sets to represent the
    prior experience in evaluating each type of
    uncertainty.

12
Interpretation of Dose Response Information for
Quantal Effects in Terms of a Lognormal
Distribution of Individual Threshold Doses
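(The slide's figure is not reproduced in this transcript.) The interpretation it illustrates can be stated compactly: if individual threshold doses are lognormally distributed, the fraction of a population responding at a given dose is a probit function of log dose. A minimal sketch, with illustrative parameters:

```python
import math

def fraction_responding(dose, ed50, log10_sigma):
    """Fraction of the population whose individual threshold dose lies
    below `dose`, assuming thresholds are lognormally distributed with
    median ED50 and a log10 standard deviation of `log10_sigma`."""
    z = (math.log10(dose) - math.log10(ed50)) / log10_sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# by construction, half the population responds at the ED50
assert abs(fraction_responding(10.0, ed50=10.0, log10_sigma=0.5) - 0.5) < 1e-12
```

This is why the ED50 is a natural point of departure: it is the best-estimated point on the lognormal curve, from which distributional adjustments extrapolate downward.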
13
Analytical Approach for Putative Individual
Threshold-Type Effects
  • Select Point of Departure (ED50), then define
    needed distributional adjustments
  • LOAEL to ED50 or NOAEL to ED50
  • Acute/chronic
  • Animal to human
  • Human variability, taking into account the
    organ/body system affected and the severity of
    the response
  • Incompleteness of the data base

14
Elements of the Straw Man Proposal--Tentatively
it is suggested that the RfD be the lower (more
restrictive) value of
  • (A) The daily dose rate that is expected (with
    95% confidence) to produce less than a 1/100,000
    excess incidence over background of a minimally
    adverse response in a standard general population
    of mixed ages and genders, or
  • (B) The daily dose rate that is expected (with
    95% confidence) to produce less than a 1/1,000
    excess incidence over background of a minimally
    adverse response in a definable sensitive
    subpopulation.
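Criterion (A) can be sketched as a Monte Carlo calculation. The lognormal spreads below are invented placeholders, not the published reference distributions; the point is the mechanics (multiply sampled adjustment distributions, take a lower percentile), not the numbers.

```python
import random

random.seed(1)  # reproducibility of the illustration

def straw_man_rfd_sketch(ed50_animal, trials=20000):
    """Hedged sketch of Straw Man criterion (A). Each adjustment is
    sampled as a lognormal uncertainty distribution whose spreads are
    ILLUSTRATIVE placeholders, not the published reference data sets."""
    z_1e5 = -4.265  # approximate standard-normal z at cumulative p = 1/100,000
    doses = []
    for _ in range(trials):
        interspecies = 10 ** random.gauss(0.0, 0.2)   # animal-to-human sensitivity
        duration     = 10 ** random.gauss(0.0, 0.15)  # short- vs. long-term dosing
        log10_sigma  = abs(random.gauss(0.5, 0.1))    # human interindividual spread
        ed50_human = ed50_animal / (interspecies * duration)
        # dose at which 1/100,000 of a lognormal population crosses its threshold
        doses.append(ed50_human * 10 ** (z_1e5 * log10_sigma))
    doses.sort()
    return doses[int(0.05 * trials)]  # lower 5th percentile = 95% confidence
```

With these placeholder spreads the result falls well below the animal ED50; with real reference data sets in place of the placeholders, the same machinery yields the proposed RfD.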

15
Results of Application of the Straw Man Analysis
to 18 Randomly-Selected RfDs from IRIS
16
Recent Results from Application of the Straw
Man Approach to Value of Information Testing of
the IPCS Data-Derived Uncertainty Factor Formulae
  • Split of PD/PK variability should be closer to
    5 x 2, rather than 3.1 x 3.1 as implied by the
    IPCS proposal (if one wishes the product to
    multiply out to the traditional 10-fold
    uncertainty factor assigned for interindividual
    variability).
  • Approximately equal protectiveness would be
    achieved by substitution of the following values
    for the interindividual variability factor of 10
    for RfDs with the following characteristics:

                      Quantal Endpoint   Continuous Endpoint
    Overall UF 100           17                  43
    Overall UF 1000          7.4                 19
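As a quick check on the factor arithmetic: both splits are constrained to multiply out to the traditional 10-fold interindividual factor; the disagreement is only over how that factor divides between pharmacodynamic and pharmacokinetic variability.

```python
# Both PD/PK splits multiply out to (approximately) the traditional
# 10-fold interindividual uncertainty factor; only the division between
# pharmacodynamic and pharmacokinetic variability differs.
ipcs_pd = ipcs_pk = 10 ** 0.5      # ~3.16 each (rounded to 3.1 on the slide)
assert abs(ipcs_pd * ipcs_pk - 10.0) < 1e-9
assert 5 * 2 == 10                 # the suggested PD-weighted split
```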

17
More Details of Our Analysis are Available in
  • Hattis, D., Baird, S., and Goble, R. "A Straw Man
    Proposal for a Quantitative Definition of the
    RfD," Drug and Chemical Toxicology, 25: 403-436
    (2002).
  • Hattis, D. and Lynch, M. K. "Empirically Observed
    Distributions of Pharmacokinetic and
    Pharmacodynamic Variability in Humans--Implications
    for the Derivation of Single Point Component
    Uncertainty Factors Providing Equivalent
    Protection as Existing RfDs." In Toxicokinetics
    in Risk Assessment, J. C. Lipscomb and E. V.
    Ohanian, eds., Informa Healthcare USA, Inc.,
    2007, pp. 69-93.
  • Detailed Data Bases and Distributional Analysis
    Spreadsheets: http://www2.clarku.edu/faculty/dhattis

18
Implications for Information Inputs to Risk
Management Decision-Making
  • Increasingly, cases such as airborne particles,
    ozone, and lead are forcing the recognition that
    even for non-cancer effects, some finite rates of
    adverse effects will remain after implementation
    of reasonably feasible control measures.
  • Societal reverence for life and health means
    doing the very best we can with available
    resources to reduce these effects.
  • This means that responsible social
    decision-making requires estimates of how many
    people are likely to get how much risk (for
    effects of specific degrees of severity) with
    what degree of confidence--in cases where highly
    resource-intensive protective measures are among
    the policy options.
  • The traditional multiple-single-point uncertainty
    factor system cannot yield estimates of health
    protection benefits that can be juxtaposed with
    the costs of health protection measures.

19
Alternative Vision 2--Multiple Directions for
Improvement for Toxicology and Risk Assessment
  • New Pharmacodynamic Taxonomies and Approaches to
    Quantification
  • Taxonomy based on what the agent is doing to the
    organism
  • Taxonomy based on what the organism is trying to
    accomplish and how agents can help screw it up
  • Quantitative Probabilistic Framework for
    Traditional Individual Threshold Modes of
    Action

20
Taxonomy Based on What Organisms Need to
Accomplish to Develop and Maintain Functioning,
and What Can Go Wrong
  • Establishment and Maintenance of Homeostatic
    Systems at Different Scales of Distance, Time,
    Biological Functions, Involving
  • Sensors of Key Parameters to Be Controlled
  • Criteria (E.g. Set Points) for Evaluating
    Desirability of Current State
  • Effector Systems That Act to Restore Desirable
    State With Graded Responses to Departures
    Detected by the Sensors
  • Some Examples of Perturbations
  • Hormonal Imprinting by Early-Life Exposure to
    Hormone Agonists
  • The Tax Theory of General Toxicant Influences
    on Fetal Growth, and Possible Consequences

21
Key Challenges for Biology in the 21st Century
  • How exactly are the set points set?
  • How does the system determine how, and how
    vigorously to respond to various degrees of
    departure from specific set points?
  • Could all this possibly be directly coded in the
    genome?
  • Or, more interestingly, does the genome somehow
    bring about a learning procedure where, during
    development, the system learns what values of
    set points and modes/degrees of response work
    best using some set of internal scoring system?
  • How exactly are the set points, etc., adjusted to
    meet the challenges of different circumstances
    (allostasis states--see Schulkin 2003,
    "Rethinking Homeostasis").

22
(No Transcript)
23
(No Transcript)
24
(No Transcript)
25
(No Transcript)
26
(No Transcript)
27
Taxonomy Built from the Fundamental Ways that
Chemicals Act to Perturb Biological Systems
  • For preclinical stages or subclinical levels of
    effect, is the basic action reversible or
    irreversible, given a subsequent period of no
    exposure?
  • Reversible (enzyme inhibition; receptor
    activation or inactivation)--Traditional Acute or
    Chronic Toxicity--the traditional
    toxicology/homeostatic-system-overwhelming
    framework (individual thresholds for response)
  • Irreversible (DNA mutation; destruction of
    alveolar septa; destruction of most neurons)

28
Subcategories for Nontraditional (Based on
Irreversible Changes) Modes of Action
  • How many irreversible steps are needed to produce
    clinically recognizable disease?
  • (Few--up to a dozen or so) -> "molecular
    biological" diseases--mutations, cancer via
    mutagenic mechanisms
  • (Many--generally thousands) -> chronic cumulative
    diseases (emphysema and other chronic lung
    diseases caused by cumulative loss of lung
    structures or scarring; Parkinson's and other
    chronic neurological diseases produced by
    cumulative losses of neurons)

29
Special Features of Chronic Cumulative Disease
Processes
  • Clinical consequences depend on the number of
    irreversible steps that have occurred in
    different people (often little detectable change
    until a large number of steps have occurred).
  • Effects occur as shifts in population
    distributions of function.
  • Thresholds for the causation of individual damage
    steps must be low enough that the disease
    progresses with time in the absence of exposures
    that cause acute symptoms.
  • Different kinds of biomarkers are needed for
  • Accumulated amount of damage/dysfunction (e.g.
    FEV1)
  • Today's addition to the cumulative store of
    damage (most powerful for epidemiology based on
    associations with short term measurements of
    exposure), e.g.
  • excretion of breakdown products of lung
    structural proteins
  • blood or urine levels of tissue-specific proteins
    usually found only inside specific types of cells
    (such as heart-specific creatine kinase for
    measurement of heart cell loss due to infarctions)
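The population-shift idea above can be illustrated with a toy simulation (all rates, thresholds, and population sizes below are invented for illustration): each person accrues irreversible damage steps at a background rate plus an exposure-related increment, and disease is recognized only after thousands of steps, so exposure shows up as a gradual shift in the population distribution rather than as an individual threshold.

```python
import random

random.seed(0)  # reproducibility of the illustration

def incidence(exposure_increment, years=70, background_rate=40.0,
              clinical_threshold=4000, people=2000):
    """Toy chronic-cumulative-disease model. Each person accrues
    irreversible damage steps yearly (normal approximation to a
    counting process) at a background rate plus an exposure-related
    increment; clinical disease appears only once thousands of steps
    have accumulated. All parameter values are illustrative."""
    cases = 0
    for _ in range(people):
        rate = background_rate + exposure_increment
        total = sum(max(0.0, random.gauss(rate, rate ** 0.5))
                    for _ in range(years))
        if total >= clinical_threshold:
            cases += 1
    return cases / people
```

Comparing `incidence(0.0)` with `incidence(20.0)` shows incidence rising smoothly with exposure: damage steps accumulate in everyone, and exposure shifts where the population distribution sits relative to the clinical threshold.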

30
Toward Risk Assessment Models for Chronic
Cumulative Pathological Processes
  • Describe the fundamental mechanism(s) that causes
    the accumulation of the individual damage events
    (especially the quantitative significance of
    various contributory factors). Key
    aid--biomarkers of the daily progress of damage
    (e.g. key enzyme released from a dying neuron of
    the specific type involved in the disease)
  • Quantitatively elucidate the ways in which
    specific environmental agents enhance the
    production of or prevent the repair of individual
    damage events
  • Describe the relationships between the numbers,
    types, and physical distribution of individual
    damage events and the loss of biological function
    or clinical illness. Key aid--biomarkers for the
    accumulation of past damage, e.g. FEV1.

31
Motivation to Move On
  • Younger generation of analysts will ultimately
    not tolerate older procedures that fail to
    provide a coherent way to use distributional
    information that is clearly relevant to the
    factual and policy issues.
  • Younger generation of analysts will have greater
    mathematical and computational facility,
    particularly as biology becomes quantitative
    systems biology with quantitative feedback
    modeling--increasingly an applied
    engineering-like discipline.
  • Legal processes will ultimately demand use of the
    best science as this becomes recognized in the
    technical community.
  • Newer information/communication tools will foster
    increasing habits and demands for democratic
    accountability; experts worldwide will
    increasingly be required to expose the bases of
    their policy recommendations--leaving less room
    for behind-the-scenes exercise of "old boy"
    safety factor judgments.

32
Contributions of High-Throughput Testing in
Different Decision Contexts
  • Preliminary evaluation of large numbers of new,
    and old but untested, environmental agents--good
    promise to be helpful.
  • Evaluation of contaminated sites (e.g.
    superfund)--support for decision-making based on
    complex mixtures is highly dubious.
  • Assistance to epidemiological research to locate
    contributors for human disease (e.g.
    asthma)--also doubtful.
  • Assessment of relative risks of different
    industrial processes--fanciful because of the
    need to guess the relative weights to be assigned
    to numerous dissimilar findings on short term
    tests without established quantitative
    connections to adverse health effects.
  • High profile choices of degrees of
    control/exposure reduction to be mandated for
    specific agents (slow throughput
    decision-making)--Likely to muddy the waters by
    raising difficult mode of action questions that
    can only be resolved by expensive and slow in
    vivo testing--e.g. using knockout mice. This
    is in fact the most likely near term
    contribution.
Provided by: daleha6
Learn more at: http://www.sra-ne.org