Transcript and Presenter's Notes

Title: Why Do Clinical Research?


1
Why Do Clinical Research?
  • Satisfaction of answering important questions
    which will improve the health of our patients
  • Status of researchers
  • Skill advancement
  • Professional advancement
  • Salary and job security

2
What is Research?
  • Research is the endeavor to discover new facts,
    procedures, methods, and techniques through a
    course of scientific study and critical
    investigation

3
Clinical Research
  • Clinical research involves working with human
    subjects to answer questions relevant to their
    well-being
  • Patient-oriented research is where the rubber
    meets the road!

4
How To Do Research
  • Start with defining the question
  • Write down a clear aim
  • Divide the problem into smaller, answerable
    questions

5
How To Do Research
  • Develop hypotheses
  • Decide what data are needed to test the
    hypotheses
  • Refine the above and check the line of thought

6
Good Research
  • CLEAR
  • Clarity is essential for both the problem and the
    answer
  • ACCURATE
  • Exactness and precision come from hard work and
    responsible effort
  • RELIABLE
  • If repeated, will the answer be the same?

7
Good Research
  • OBJECTIVE
  • The researcher exposes all possible prejudices at
    the outset of the study design and strives to
    overcome them
  • Will the research be untarnished by personal
    gain, biases, vested interests, etc.?

8
Researcher Qualities
  • Knowledgeable
  • Observant
  • Logical
  • Open-minded
  • Honest
  • Motivated
  • Independent
  • Flexible
  • Careful

9
Researcher Qualities
  • Curious
  • Inquisitive
  • Eager to learn
  • Skeptical
  • Perceptive
  • Persistent
  • Patient
  • Original
  • Creative

10
Getting Started
  • Learn your subject
  • Read, Read, Read
  • Start general and then focus
  • Begin with the problem

11
Getting Started
  • Formulate the problem as a research question
  • Reduce the question to a single unambiguous
    question that is well-defined and answerable

12
Stages in Creativity
  • SENSE
  • Realize the need for a study
  • PREPARE
  • Gather relevant information
  • INCUBATE
  • Think through the problem
  • ILLUMINATE
  • Imagine possible solutions
  • VERIFY
  • Evaluate the solutions you have generated

13
Hypothesis
  • A thesis is the position that you believe
    represents truth
  • A hypothesis is the foundation on which you build
    your thesis

14
Hypothesis
  • A hypothesis is a tentative construct to be
    proved or disproved according to the evidence
  • The hypothesis is sometimes expressed as a null
    hypothesis

15
A Good Hypothesis Should
  • Be testable
  • Convey the nature of the relationship being
    tested
  • State exactly what variables form this
    relationship
  • Reflect all variables of interest
  • Be formulated early on in the planning stage

16
Study Types
  • Will you test a hypothesis or describe a
    phenomenon?
  • Observational
  • Longitudinal
  • Cross-sectional
  • Randomized, double-blind, parallel-group,
    placebo-controlled trial

17
Epidemiology vs RCT
  • Epidemiology allows the study of the real world
    and the development of hypotheses regarding
    disease states
  • Randomized, controlled trials allow the rigorous
    testing of hypotheses in a well-characterized
    manner that is less real-world in nature

18
Study Design
  • Study Population
  • Age
  • Gender
  • Ethnicity/Race
  • Disease characteristics
  • Exclusions
  • Number
  • Stratification
  • Randomization

19
Human Subjects
  • The safety and rights of human subjects must be
    protected
  • Study Design
  • Institutional Review Board
  • Informed consent
  • Data Safety Monitoring/Medical Monitors

20
Key Questions
  • What is the main purpose of the trial?
  • What treatments will be used and how?
  • What is the participant risk?
  • What are the possible benefits?
  • How will patient safety be monitored?

21
Key Questions
  • Are there alternative treatments?
  • Who is sponsoring the trial?
  • What is the participant burden?
  • How long and where?
  • What do the participants have to do?
  • Will there be any discomfort even if there is no
    risk?

22
Methods
  • Define methods carefully
  • Decrease variability
  • Check reliability/reproducibility
  • Are you testing what you think you are testing?

23
Methods
  • Try to walk through the study and consider as
    many likely scenarios as possible.
  • Before the study starts, try to design for any
    variations in treatment or data collection that
    you think will occur

24
Operationalize Concepts
  • Specify how you will repeatably and reliably
    measure the variables you are using to answer the
    question
  • An operational definition specifies how your
    concepts will be observed and measured
  • This should allow your research to be reproduced

25
Data
  • Data are the facts you measure
  • They should be carefully recorded in an unbiased
    manner
  • They should be measured in a manner that
    minimizes random variation
  • They should be derived from the operational
    definitions you have developed

26
Data Validation
  • Do the data make sense?
  • Look critically at the data
  • Highest and lowest values
  • Data entry errors
  • Distribution: normal or skewed?
  • Check selected data entries with original data
    forms
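
A minimal sketch in Python of the kind of range checks described above; the
field names, example records, and plausible ranges are illustrative
assumptions, not part of the presentation:

    # Flag values outside plausible ranges so they can be checked
    # against the original data forms.
    records = [
        {"subject_id": 1, "age": 34, "systolic_bp": 118},
        {"subject_id": 2, "age": 290, "systolic_bp": 120},  # probable entry error
        {"subject_id": 3, "age": 41, "systolic_bp": 12},    # probable entry error
    ]
    plausible_ranges = {"age": (0, 110), "systolic_bp": (60, 250)}

    for rec in records:
        for field, (low, high) in plausible_ranges.items():
            if not low <= rec[field] <= high:
                print(f"Subject {rec['subject_id']}: {field}={rec[field]} "
                      f"is outside {low}-{high}; check the original form")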

27
Data Interpretation
  • Do not interpret/analyze data until after the
    study is completed
  • Do not unblind subjects until the study is
    completed, other than for safety reasons
  • Do not interpret/analyze data until the data have
    been validated and the data set is closed

28
Data Interpretation
  • Use the research question and hypotheses to guide
    analyses
  • Use a priori definitions for any sub-set analyses
  • Exploration of epidemiologic data sets is OK, but
    avoid data mining

29
Writing It Up
  • If you don't write it, then it didn't happen
  • Order of writing
  • Methods
  • Results
  • Introduction
  • Discussion
  • Abstract
  • Title

30
Writing It Up
  • After the first draft, new analyses will usually
    be suggested by the process of putting your ideas
    down on paper
  • Put the paper away for a few weeks and then read
    it again
  • Ask mentors and colleagues to read the paper at
    the first draft stage

31
Sending It In
  • When writing the paper, have the journal you will
    submit to in mind
  • Pick journals that will match your paper's topic
    and the quality and importance of your work
  • Aim high and, if needed, go low
  • Persist, Persist, Persist

32
Clinical Research
  • Drug Development

33
Drug Development
  • Preclinical/Laboratory Study
  • Cell culture in animal and human cells
  • Animal studies
  • Looking both at toxicity/carcinogenicity as well
    as effect, if relevant
  • Develop an Investigational New Drug (IND)
    application with the FDA

34
Phase I Studies
  • Assess drug safety and tolerability
  • Healthy volunteers, then those with target
    disease
  • Pharmacokinetics
  • Absorption
  • Metabolism
  • Excretion
  • Dose escalation
  • 70% of new drugs pass this phase

35
Phase II Studies
  • Assess drug efficacy
  • Usually randomized, controlled trials with
    smaller numbers, up to several hundred subjects
  • Test different therapeutic strategies
  • Use surrogate variables and are usually short
    term
  • Only about 1/3 get past Phase II

36
Phase III Studies
  • Large scale RCT to assess efficacy and safety of
    medication
  • Several hundred to thousands of patients enrolled
  • Classic randomized, placebo-controlled design
  • Long-term study design with real world outcome
    variables
  • Define package insert content and allow marketing

37
Study Size and Adverse Events
  • The size of the treatment group determines the
    likely frequency of adverse events (side effects)
    that can be detected
  • A good rule of thumb is that the smallest
    detectable adverse event rate is roughly three
    divided by the number of subjects (one event per
    N/3 subjects)
  • A study with 100 patients will only detect AEs
    that occur at a rate of about 1 in 33 (3%)
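
A quick sketch of this rule of thumb (the 3/N approximation) in Python; the
sample sizes are just examples:

    def smallest_detectable_ae_rate(n_subjects: int) -> float:
        # Rule of thumb: with N subjects you can detect adverse events
        # occurring at a rate of roughly one per N/3 subjects, i.e. 3/N.
        return 3 / n_subjects

    for n in (100, 300, 3000):
        rate = smallest_detectable_ae_rate(n)
        print(f"{n:5d} subjects -> detectable AE rate of about "
              f"1 in {1 / rate:.0f} ({rate:.1%})")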

38
Phase IV Studies
  • Compare drugs with other drugs on the market
  • Define broader target population
  • Monitor long-term efficacy and safety
  • Conduct health economics assessment and quality
    of life study

39
Reading Clinical Research
  • How to Approach RCT Reports

40
Reading Clinical Trials
  • "All that glitters is not gold" by Bengt and
    Curt Furberg
  • Just because a study is published in a journal
    does not mean that it represents truth
  • "Throwaways" and drug-company-sponsored
    newsletters have either no or limited peer review

41
Was the question stated A Priori?
  • Exploring data is acceptable to define
    hypotheses, but cannot definitively answer them
  • Primary outcomes and limited secondary outcomes
    should be carefully defined before study commences

42
Was the question stated A Priori?
  • Multiple hypothesis testing can lead to false
    association
  • P < 0.05 is subverted if there are 20 looks at
    the data
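
A small illustration, in Python, of why repeated looks subvert the 0.05
threshold; for simplicity it treats the looks as independent tests:

    # Chance of at least one false-positive "significant" result when the
    # same data are tested k times at alpha = 0.05.
    alpha = 0.05
    for k in (1, 5, 20):
        p_any_false_positive = 1 - (1 - alpha) ** k
        print(f"{k:2d} looks -> P(at least one false positive) = "
              f"{p_any_false_positive:.2f}")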

43
Is the question relevant?
  • Does the answer clarify whether the treatment
    will help patients to
  • Feel better
  • Live longer
  • Have fewer complications of illness
  • Are the endpoints real-world measures or merely
    surrogates?
  • How can one generalize the findings?

44
How is improvement quantified?
  • Are the outcomes relevant?
  • Do the measures used make sense?
  • Is the magnitude of the difference relevant to
    patient care?
  • Is the study over-powered?

45
Are the outcomes relevant?
  • Quality of life
  • Mortality
  • Health economics
  • Surrogate markers of clinical outcome
  • Surrogate biologic markers

46
How are adverse events measured?
  • Side effects are characterized as
  • Severe: Treatment must be stopped, or the patient
    is hospitalized, dies, develops cancer, or has a
    child with a congenital anomaly
  • Moderate: Dosage must be reduced; usually leads
    to discomfort, temporary disability, or reduction
    in functioning
  • Mild: No change in treatment; limited discomfort
    or dysfunction

47
How are adverse events measured?
  • AEs are characterized as to whether or not they
    are related to the medication
  • Definitely
  • Likely
  • Probably
  • Possibly
  • Not associated

48
Are the patients representative?
  • This is most problematic in pediatrics where we
    often have to extrapolate from adult studies
  • Gender, age, and race can all alter outcomes
  • Disease classification and severity can alter
    outcomes
  • High risk patients are usually excluded

49
Were the groups initially comparable?
  • Even in studies of 150-200 subjects, substantive
    imbalance can occur between treatment groups
  • Was stratification used to ensure balance?
  • Did the treatment group start out sicker so that
    they likely would improve more than the placebo
    group?

50
Excluded Subjects?
  • Intent to treat analyses should be reported
  • Two unacceptable reasons to exclude subjects are
  • Discovering after randomization that they do not
    meet entry criteria
  • Because they did not take the medication

51
Do you need a statistician to read the study?
  • In clinical trials, design should allow
    relatively straightforward presentation of
    results
  • Effect size and relevance are more important than
    P values

52
Do you need a statistician to read the study?
  • Consider the number of patients who would have to
    be treated to prevent one occurrence of the
    outcome, i.e., the number needed to treat (see
    the sketch after this list)
  • Subgroup analyses should be avoided unless
    defined a priori
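
A minimal sketch of the number-needed-to-treat calculation in Python; the
event rates are made-up illustrative values, not taken from any study:

    # NNT = 1 / absolute risk reduction: how many patients must be treated
    # to prevent one additional occurrence of the outcome.
    control_event_rate = 0.20  # e.g. 20% of control patients have the outcome
    treated_event_rate = 0.15  # e.g. 15% of treated patients have the outcome

    absolute_risk_reduction = control_event_rate - treated_event_rate
    nnt = 1 / absolute_risk_reduction
    print(f"Absolute risk reduction: {absolute_risk_reduction:.0%}")
    print(f"Number needed to treat:  {nnt:.0f} patients")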

53
Economic Analysis
  • "Of course our drug is more expensive, but we
    need to convince clinicians to use it more"
  • Does the medication reduce direct or indirect
    costs or both?

54
Economic Analysis
  • Be sensitive to the relationship between the
    authors and the sponsor
  • Be careful if "soft" assumptions are used
  • Beware of analyses based on the clinical trial
    setting and not the real world
  • Beware of indirect evidence based on surrogate
    markers