1
USING THE LITERATURE: Finding and evaluating
evidence and information
  • Dr Peter Orpin, UDRH
  • Ms Danielle Williams, MRI

2
Using the Literature
  • What question/puzzle/issue are you trying to find
    information on, or an answer to?
  • Searching the literature
  • Constructing a review of the literature
  • Assessing credibility
  • Writing a literature review
  • Using the evidence: is it statistically
    significant and is it clinically important?

3
PURPOSE: Form follows function
Be clear in your own mind what function your
literature review is intended to fulfil
  • Reviewing the Literature
  • Writing a Literature Review

4
Reviewing the Literature
  • Systematic - but no trite formulas
  • Wide-ranging/eclectic
  • Unpredictable (and frustrating)
  • Obtain or construct a bibliographic database -
    EndNote
  • Construct your review in writing as you go
  • Progressively focussed over time: the trick is
    to know when to stop fishing and start focussing
  • A continuous task: keep reading
  • Much of what has to be done won't make it into
    your literature review.

5
SET THE WIDER CONTEXT/THE FIELD OF KNOWLEDGE
SITUATE YOUR AREA OF INTEREST IN THAT CONTEXT
IDENTIFY THE GAPS AND PUZZLES: THE SPECIFIC
QUESTION TO ANSWER
CONSTRUCT YOUR DETAILED ARGUMENT
6
The way I approach it
  • Browse widely using titles and abstracts
  • Download full text only when I need to read the
    whole paper, i.e. I can't get what I need from
    the abstract
  • All references go into EndNote at download
  • Begin constructing the literature review from the
    first reference read; use Outline View to
    brainstorm the structure
  • Add references to the structure using notes and
    EndNote references
  • Further sort and construct my argument using
    thematic analysis

7
The Systematic Literature Review
  • Transparent and pre-determined search strategy
  • Well defined question
  • Defined set of keywords
  • Defined set of databases
  • Advantages
  • Transparent (rigour) and repeatable
  • Disadvantage
  • Limited, although the strategy may be a
    post-facto construction

8
An Example
  • Definitions
  • Young-old: 65-74; old-old: 75-84; oldest-old: 85+
    (Scott, 1997 #2753) - characteristics of ageing
    vary across these categories
  • Demography
  • The major driver of an ageing society is increasing
    survival rates at later ages, i.e. people in
    their 80s, 90s and even 100s are less likely to
    die; the paper posits genetic and non-genetic
    (environmental) interactions (Vaupel, 1998 #2875)
  • Engagement/Disengagement
  • Continuity is important in old friendships (Shea,
    1988 #2798)
  • Gender
  • Two-point cross-sectional study: higher levels
    of instrumental support were associated with greater
    onset of disabilities of daily living in men but
    not women, independent of baseline disability;
    possibly receiving instrumental support leads to
    loss of self-efficacy and self-image (Seeman,
    1996 #2760)
  • Social network support correlated more with
    psychological health in women and physical health
    in men (Seeman, 1996 #2760)

9
Evidence for What? Defining the Question
  • The Task (a program logic approach)
  • What do you ultimately want to do with the
    information/evidence you find?
  • What questions and sub-questions do you need to
    ask of the literature?

10
Some Searching Tips
  • Think outside the box: be adventurous, play - it
    only costs you time
  • Keywords: find the key that opens the door
  • Follow the trails
  • Range wide at first, narrow as you go
  • Learn to skim: sort by titles, then abstracts and
    only then full text
  • Keep a running record
  • Construct your argument as you go

11
Gathering the Evidence - Sources
  • Web Searching
  • Databases
  • Bibliographies
  • Systematic and Literature Reviews:
    http://www.ncbi.nlm.nih.gov/entrez/query/static/clinical.shtml#studycat
  • Library Browsing
  • Journals: electronic journals on the web
  • Networks
  • Professional
  • Interest
  • Conferences/Seminars
  • Listservs

12
Web Searching
  • Search Engines - Google
  • Google Scholar - http://scholar.google.com/
  • Live Search Academic -
    http://search.live.com/results.aspx?scope=academic&q=
  • Advanced Searching - CrossSearch
    http://www.utas.edu.au/library/info/crosssearch/crosssearch.html
  • AskNow - http://www.asknow.gov.au/index2.html
  • Deep/Invisible Web
  • Complete Planet
  • Health Portals
  • Professional Bodies
  • Government
  • NGOs
  • Universities/Research Centres
  • Libraries

13
Gathering the Evidence - Database Searching
  • Training resources
  • Keywords: thesaurus, exploding terms
  • Cited reference search: articles citing an
    author or a paper
  • Using Booleans (see the example search string at
    the end of this slide)
  • AND: all terms together within a record
  • OR: any of the terms within a record
  • NOT: exclude records containing that term
  • SAME: all terms within the same sentence
  • Using quotation marks to encompass phrases
  • Field tags: which fields of the record to search,
    e.g. Ti = title, Au = author
  • Extenders/Wildcards/Truncation: symbols that
    replace a range of letters
  • Used at the end or middle of a word, but not the
    beginning; e.g. * for 0-n characters or ? for 0-1
    characters (symbols vary by database)
  • Limiting searches dates, language, article type
  • Alerts
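
As an illustration of the Boolean operators, field tags and wildcards above,
here is a minimal sketch in Python that assembles a search string. The
keywords, field tag and wildcard symbols are invented for illustration, and
exact syntax varies between databases, so check each database's own help pages.

  # Minimal sketch: building a Boolean search string from the elements above.
  # Keywords, operators, field tags and wildcards are illustrative only;
  # real databases differ, so adapt to the database you are searching.
  population = '(aged OR elderly OR "older adult*")'   # OR: any synonym; * = 0-n characters
  topic = '("social support" OR "social network*")'    # quotation marks keep phrases together
  exclusion = 'NOT adolescen*'                          # NOT: drop records containing this term
  field_tag = 'TI'                                      # e.g. Ti/TI = search titles only

  query = f'{field_tag}=({population} AND {topic}) {exclusion}'
  print(query)   # paste or adapt the resulting string in the database's search box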

14
Meta-Analysis
  • Letting the experts do the work
  • Review Collaborations
  • Cochrane Collaboration
  • Campbell Collaboration
  • Best Practice/Treatment Guidelines
  • Government: OMNI, HealthInsite
  • NGOs
  • Professional Bodies
  • Journals
  • Review Articles: Handout 7.1, Comparing Reviews

15
Assessing Review Articles
  • Handout 7.1
  • A continuum of review types: Opinion Piece,
    Traditional Lit. Review, Summary/Appraisal of
    Selected Research, Systematic Reviews
  • Moving along the continuum: increasing scope/depth,
    system and transparency in the selection of
    literature, and decreasing potential for hidden bias
16
(No Transcript)
17
(No Transcript)
18
Standard Search
19
Standard Search - Medicine
20
Advanced Search
21
Thesaurus
22
Alerts
23
(No Transcript)
24
(No Transcript)
25
(No Transcript)
26
(No Transcript)
27
Government Sponsored Site
28
Academic Site
29
Journal Site
30
Out of left field!
31
Assessing Credibility: A couple of tips
  • Judge individual papers in the context of the
    literature overall, not in isolation - that's why
    you construct a running critical analysis
  • If you are not methodologically competent and/or
    confident, rely on source credibility and validity

32
Threats to Evidence Credibility
  • BIAS
  • Is there adequate bias control - in sampling,
    interpretation, attrition, reporting - through
    sound methodology?
  • UNSUPPORTED CAUSAL ASSUMPTIONS
  • Is there a causal link or only a correlation or
    covariance?
  • INAPPROPRIATE GENERALISATION
  • What degree of generalisation is justified by the
    sampling methodology?
  • ATTRIBUTING REAL STATUS TO CHANCE FINDINGS
  • Is the result STATISTICALLY SIGNIFICANT?
  • Inferential statistics: how confidently can the
    results from a sample be applied to the whole
    population from which it is drawn?

33
Evaluating Studies: How credible is the evidence?
  • How credible is the source?
  • What levels of evidence does it constitute?
  • What sample and how was it selected?
  • Does the sample support the generalisation or
    transferability claims?
  • What are the possible sources of bias, and are
    they controlled?
  • Reliability and validity of measures
  • Judging qualitative versus quantitative studies
  • Interpretation: are there causal inferences and
    are they supportable?

34
Getting Someone Else to Do the Work: Assessing
Review Articles
  • A continuum of review types: Opinion Piece,
    Traditional Lit. Review, Summary/Appraisal of
    Selected Research, Systematic Reviews (e.g. the
    Cochrane Collaboration, http://www.cochrane.org)
  • Moving along the continuum: increasing scope/depth,
    system and transparency in the selection of
    literature, and decreasing potential for hidden bias
35
1. Assessing Source Credibility
  • Publication Type
  • Journal Article: academic/professional
  • Conference proceedings: standing
  • Reports
  • Web Documents: publisher/sponsor
  • Newsletters/Mass media: links to original
    source
  • Publishing Organisation
  • Government
  • Academic
  • Professional Bodies
  • NGOs
  • Peer Review Process
  • Journal Rankings

36
2. Study Types
  • Observational: real world
  • Epidemiological: revealing patterns through
    counts
  • Descriptive
  • Analytical
  • Ecological/Correlational
  • Cross-sectional / Longitudinal
  • Case Control
  • Cohort
  • Experimental
  • Randomised controlled trials
  • Field and community trials

37
Observational Studies 1
  • Simple Descriptive: routine population data
    examined for patterns; prevalence surveys
  • Ecological/Correlational: comparing
  • The same population at different times
  • Different populations at the same time
  • Cross-sectional and/or Longitudinal: compare
    prevalence rates of the study phenomenon and the
    hypothesised causal variable

38
Observational Studies 2
  • Case Control: subjects selected by condition
    status
  • Cases: condition present
  • Controls: condition absent
  • Cohort Studies: a study comparing condition
    prevalence rates over time in two groups, one
    exposed to the hypothesised causative variable,
    one not
  • Case Reports/Case Series

39
Experimental/Intervention
  • Randomised Control Trial (RCT)
  • A variation on the prospective cohort
    (experimental/control) study, run under
    researcher-controlled conditions
  • Controls can be:
  • Unblinded
  • Cross-over
  • Blinded
  • Double blinded
  • Advantages
  • Good bias and variable control
  • Best for testing causative hypotheses
  • Disadvantages
  • Feasibility
  • Cost
  • Limits on generalisability

40
NHMRC Levels of Evidence
  • I - Systematic review of all relevant RCTs
  • II - At least one properly designed RCT
  • III-1 - One well-designed pseudo-randomised
    trial
  • III-2 - Non-randomised comparative, cohort, case
    control, or time series without control studies
  • IV - Case series, pre-test/post-test studies

41
Sampling and Generalisation
Sample: the units actually measured
Sampling Frame: the list from which the sample
is selected
Population: the group the sample seeks to
generalise to
42
4. Sampling 2
  • Random
  • Each individual element of the study population
    has exactly the same chance of being chosen
  • Allows the use of inferential statistics: an error
    estimate in generalising from the sample to the
    study population
  • Results can be applied only to the population
    within that frame
  • Simple random: all units numbered
  • Systematic random: a systematic approach to
    identifying study units

43
Sampling 3
  • Stratified/Structured Sampling
  • Elements drawn randomly from homogeneous
    sub-groups in proportion to their representation
    in the study population, e.g. age groupings,
    gender
  • Overlaid on simple random or systematic random
  • Relies on accurate data on subgroup distribution
  • Cluster Sampling
  • A multi-stage process used where an exhaustive
    list of all individual elements in the study
    population is unavailable
  • Stage 1: identify groups of elements (clusters)
  • Stage 2: randomly sample these groups
  • Stage 3: randomly sample elements within the
    groups
  • The list-and-sample process can work through
    multiple layers for very large populations (see
    the sketch below)
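
The random, systematic and stratified approaches above can be sketched in a
few lines of Python. The sampling frame, strata and sample sizes below are
invented purely for illustration.

  import random

  random.seed(1)                      # fixed seed so the illustration is reproducible
  population = list(range(1000))      # hypothetical numbered sampling frame

  # Simple random: every unit has the same chance of selection
  simple = random.sample(population, 50)

  # Systematic random: random start, then every k-th unit down the list
  k = len(population) // 50
  start = random.randrange(k)
  systematic = population[start::k]

  # Stratified: draw randomly within homogeneous sub-groups in proportion
  # to their share of the study population (two invented strata here)
  strata = {"group_a": population[:400], "group_b": population[400:]}
  stratified = []
  for members in strata.values():
      n = round(50 * len(members) / len(population))   # proportional allocation
      stratified.extend(random.sample(members, n))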

44
Sampling 4: Non-random
  • Limited generalisability/transferability
  • Not amenable to error and significance
    calculation unless shown to conform to the normal
    distribution curve
  • Purposive Sample
  • Non-random, based on the purpose of the study and
    the researcher's knowledge of the population
  • Convenience Sample
  • Non-random, based on available/convenient
    elements, e.g. snowball sampling

45
Sampling Issues
  • Power: is the sample size sufficient to avoid
    falsely rejecting a true finding, i.e. falsely
    upholding the null (no difference) hypothesis?
    (see the sketch at the end of this slide)
  • Sampling Frame: the list of elements from which
    the sample is chosen; is it
  • clearly defined and
  • appropriate for making findings about the study
    population?
  • Generalisability: do the sample size, selection
    method and composition support generalising the
    findings to the study population?
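
A rough sense of the power question can be sketched with the normal
approximation for comparing two proportions. The event rates, significance
level and power below are invented for illustration; a real study should use
purpose-built power software or statistical advice.

  import math

  # Rough sketch: subjects needed per group to detect a difference between
  # two proportions (normal approximation). All figures are illustrative.
  p1, p2 = 0.10, 0.15      # expected event rates in control and intervention groups
  z_alpha = 1.96           # two-sided significance level of 0.05
  z_beta = 0.8416          # 80% power, i.e. a 20% risk of a false negative

  n_per_group = ((z_alpha + z_beta) ** 2
                 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2)
  print(math.ceil(n_per_group))   # roughly 683 subjects in each group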

46
5. Bias and Controls
  • Bias: a quality of the measurement method which
    leads to misrepresentation of the measure in a
    particular direction
  • Selection bias: addressed by randomisation
  • Sample selection: do all elements have an equal
    chance of being selected?
  • Intervention: do all subjects have an equal
    chance of being allocated to a treatment or
    control group?
  • Detection bias: addressed by blinding
  • Is there a conscious or unconscious tendency to
    interpret findings in a way that supports a
    particular outcome or hypothesis?

47
Bias and Controls 2
  • Attrition bias: are results skewed because of
    bias in dropout?
  • Reporting bias: is there any bias in the way in
    which study outcomes are selected for reporting
    (successful or unusual)? Unique cases are
    interesting but not generalisable.
  • Controls
  • Matched to the intervention sample in all important
    variables apart from the intervention itself; this
    can be a bold assumption
  • Will account for unknown confounding variables if
    these are evenly distributed in intervention and
    control groups; best achieved by randomisation

48
6. Reliability of Measures
  • Are the results reproducible and the findings
    robust?
  • By the same researchers across time and study
    repeats?
  • By other researchers using the same methodology?
  • Using other methodologies and instruments
    (triangulation/crystallisation)?
  • Is it consistent across a range of conditions?

49
Validity
  • Theoretically (veracity): is it a true measure
    of the phenomenon under study?
  • Practically and generally: does it make logical
    sense in terms of what else we know?
  • Face validity: is it congruent with widely shared
    knowledge and understandings?
  • Predictive validity: does it accord with findings
    using related or dependent measures?
  • Construct validity (social science): does it fit
    into a known logical relationship between
    variables?
  • Content validity (social science): does it cover
    the range of possible meanings around a
    concept/variable?

50
7. Interpretation: Causal Inferences
  • Correlation does not equal causality
  • The Hume Problem: we can refute hypotheses, but
    while results can support hypotheses, they cannot
    prove them
  • Some tests:
  • Reliability: study quality
  • Temporality: does the imputed cause precede the
    effect?
  • Strength of relationship
  • Consistency: does it hold across studies and
    conditions?
  • Plausibility: does it make sense?
  • Dose-exposure response: does a change in the cause
    reliably result in an equivalent change in the
    effect?
  • Are there other possible explanations
    (confounding variables)?

51
Variables: Causal Relationships
(Diagram: antecedent, independent, intervening,
dependent and extraneous variables)
52
Causation: Some Pitfalls 1
  • Assuming two variables are causally linked simply
    because they co-vary
  • Assuming that the effect (change in dependent
    variable) is the direct result of the putative
    causal (independent) variable
  • Placebo/Hawthorne effect
  • Unexamined confounding variables: common
    antecedent or intervening variables
  • Serendipity

53
Causation: Some Pitfalls 2
  • Anecdotal versus probabilistic evidence
  • Anecdotal evidence: you cannot infer a general
    pattern from a small number of selective cases
  • Availability heuristic: biased convenience
    sampling
  • Representativeness heuristic: unconscious bias,
    relying on stereotyping/intuition instead of
    reliable evidence bases
  • Ecological Fallacy: trying to predict individual
    cases from general patterns and probabilistic
    evidence

54
Real Effect?
  • Could the effect be the result of:
  • BIAS?
  • Is there adequate bias control - in sampling,
    interpretation, attrition, reporting - through
    sound methodology?
  • CHANCE?
  • Is the result STATISTICALLY SIGNIFICANT?
  • Inferential statistics: how confidently can the
    results from a sample be applied to the whole
    population from which it is taken?

55
Statistical Significance
  • Mathematically calculated, based on the Standard
    Error: the level of confidence that the effect is
    real, i.e. not due to chance as the result of an
    inadequate or biased sample (see the sketch at the
    end of this slide)
  • Usually expressed as a percentage confidence (95%
    or 99%) that the effect is real
  • p<0.05: less than a 5% chance that it is a chance
    finding
  • p<0.001: less than a 0.1% chance that it is a
    chance finding
  • The higher the confidence limit:
  • The greater the certainty that the statistically
    significant effect is real
  • The greater the possibility of rejecting a true
    finding as chance
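
As a sketch of the standard error idea above, the following Python fragment
turns an invented sample result into an approximate 95% confidence statement;
the sample size and proportion are purely illustrative.

  import math

  # Sketch: generalising from an invented sample to its source population.
  n = 400                              # sample size
  p = 0.30                             # proportion observed in the sample

  se = math.sqrt(p * (1 - p) / n)      # standard error of the proportion
  margin = 1.96 * se                   # 1.96 standard errors either side gives ~95% confidence
  print(f"{p:.0%} +/- {margin:.1%}")   # 30% +/- 4.5%: the population value should fall
                                       # in this range about 95 times in 100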

56
The Normal Curve
57
Statistical Significance 2
  • Power
  • A measure of the likelihood of a false negative
    finding (erroneously accepting the null
    hypothesis) at a given confidence level
  • Generally calculated from the effect of the sample
    size and the likely effect size on the standard
    error
  • Statistical significance versus effect size
  • A small (even clinically irrelevant) effect size
    can still be highly statistically significant
    (a high level of confidence that an effect is real)
    because of the contributions of sample size,
    population variability and the chosen confidence
    limits (p value) to the calculation (see the
    sketch below)
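
The sample-size point can be made concrete with an invented example: a 0.6
percentage-point difference between two very large groups comes out as
statistically significant even though it may be clinically trivial. All
figures below are illustrative.

  import math

  # Sketch: a tiny effect becomes "statistically significant" with a large enough sample.
  n1 = n2 = 20_000             # two very large (invented) groups
  p1, p2 = 0.100, 0.106        # event rates: only a 0.6 percentage-point difference

  se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)   # standard error of the difference
  z = (p2 - p1) / se                                        # test statistic
  p_value = math.erfc(z / math.sqrt(2))                     # two-sided p from the normal curve
  print(f"z = {z:.2f}, p = {p_value:.3f}")                  # about z = 1.97, p = 0.048 (< 0.05)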

58
Judging Qualitative Studies
  • Sampling
  • Usually purposive or convenience - rarely
    randomised or controlled, although sometimes
    systematic or structured
  • Sample size and bias are less important than
    representativeness of the phenomenon under study
  • Statistical Power
  • Largely irrelevant: the focus is rich detail, not
    patterns
  • Methodological Integrity
  • Is the methodology systematic, and rigorously and
    transparently applied?

59
Judging Qualitative Studies cont.
  • Transparency
  • Is there sufficient detail to understand the
    basis on which decisions (methodological and
    interpretive) have been made?
  • Triangulation/Crystallisation
  • How well does it fit with the picture provided by
    other studies of the same or similar phenomena?
  • Peer Review
  • Has the study been subjected to critical academic
    appraisal before and after publication?
  • Face Validity
  • Do the findings appear to have logical,
    experiential and rhetorical veracity?

60
The Literature Review
  • THE PUBLIC PRODUCT of a review of the literature,
    addressed to an audience
  • Sometimes it may simply review the state of
    knowledge/evidence in a field, BUT for most
    purposes it:
  • MAKES A CASE/BUILDS AN ARGUMENT
  • What is your case/argument?
  • Why are you making it?
  • DEMONSTRATES A GRASP OF THE FIELD
  • BUT only enough to support the case/argument
  • IDENTIFIES THE KNOWLEDGE GAP(S): the justification
    for (your) research

61
SET THE WIDER CONTEXT/THE FIELD OF KNOWLEDGE
SITUATE YOUR AREA OF INTEREST IN THAT CONTEXT
IDENTIFY THE GAPS AND PUZZLES
SELECT YOUR PUZZLE
MAKE THE DETAILED ARGUMENT FOR YOUR PIECE OF
RESEARCH
62
Implications of the evidence for problem/practice?
  • A difference is only a difference when it makes a
    difference
  • Three Questions (NHMRC Guidelines for medical
    evidence assessment)
  • Is there a real effect? (Statistical
    significance)
  • Is the size of the effect clinically important?
  • Is the evidence relevant to (my) practice?

63
Assessing and applying Scientific Evidence (NHMRC
2000)
Evidence from a systematic review of the literature
Step 1: Assess the evidence - strength of evidence,
size of effect, relevance
  • How large was the effect?
  • Were appropriate and relevant outcomes measured?
  • Did the study design eliminate bias?
  • Was the effect clinically important?
  • How well were the studies done?
  • Is it statistically significant?

Prepare an evidence summary checklist
Step 2: Apply the evidence - transferability,
application to individuals
  • What are the beneficial and harmful effects for
    the patient?
  • What are the predicted absolute risk reductions?
    (see the sketch at the end of this slide)
  • Do these effects vary with different patient
    groups?
  • Do the benefits outweigh the harms?
  • Do they vary by baseline risk?

Adapted from NHMRC, How to use the evidence:
assessment and application of scientific evidence,
2000. http://www.nhmrc.gov.au/publications/_files/cp69.pdf
(accessed August 2005)
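
The absolute risk reduction question above can be illustrated with invented
figures: the same relative benefit gives very different absolute benefits,
and numbers needed to treat, depending on the patient's baseline risk.

  # Sketch with invented numbers: absolute risk reduction (ARR) and number
  # needed to treat (NNT) for the same relative effect at two baseline risks.
  relative_risk = 0.80                      # treatment cuts events by 20% in relative terms

  for baseline_risk in (0.10, 0.02):        # higher-risk and lower-risk patient groups
      treated_risk = baseline_risk * relative_risk
      arr = baseline_risk - treated_risk    # absolute risk reduction
      nnt = 1 / arr                         # number needed to treat
      print(f"baseline {baseline_risk:.0%}: ARR = {arr:.1%}, NNT = {nnt:.0f}")
  # baseline 10%: ARR = 2.0%, NNT = 50
  # baseline 2%: ARR = 0.4%, NNT = 250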
64
Is the effect size clinically relevant?
  • Factors in ranking clinical importance 2
  • Does the evidence:
  • Address the original question/problem?
  • Provide applicable outcomes?
  • How does it compare with present alternatives
    when the magnitude of the effect is weighed
    against:
  • Costs: social, economic, personal, convenience
  • Risks and losses
  • Logistics of implementation

65
Is the effect size clinically relevant?
  • Factors in ranking clinical importance 1
  • Where does the confidence limits range lie in
    respect of clinical importance and the null
    hypothesis? (see the sketch at the end of this
    slide)
  • Is the effect statistically significant?
(Diagram: the effect size confidence range shown
relative to the null hypothesis and the clinically
important threshold)
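
A minimal sketch of the idea behind the diagram, with invented numbers: the
confidence range for the effect is compared against both the null (no effect)
and the smallest clinically important effect.

  # Sketch with invented numbers: where the 95% confidence range for an effect
  # sits relative to "no effect" and to a clinically important threshold.
  ci_low, ci_high = 0.5, 5.5          # confidence range for the effect (e.g. % risk reduction)
  clinically_important = 2.0          # smallest effect considered worth acting on

  statistically_significant = ci_low > 0               # the whole range excludes "no effect"
  clearly_important = ci_low >= clinically_important   # even the low end is worth acting on
  print(statistically_significant)    # True: unlikely to be a chance finding
  print(clearly_important)            # False: the true effect may be too small to matter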
66
Is the evidence applicable to (your) practice?
  • Transferability/Generalisability: can you
    transfer or generalise the result to your
    situation?
  • Are your clients sufficiently similar to the
    study sample and/or study population?
  • Are you dealing with other variables or
    conditions not taken into account in the original
    study?
  • Peculiarities of client base?
  • Particularities of the practice environment
    (esp. resources)?

67
Is the evidence applicable to (your) practice?
  • Triangulation/Crystallisation
  • Meta-analysis and review processes rarely produce
    clear-cut answers
  • Triangulation is a spatial allusion: a better fix
    on a point in space by multiple measures from
    different perspectives
  • Crystallisation is more apposite: elucidating a
    complex issue by the incremental gathering and
    assembling of evidence

68
Evaluating the Evidence: A Checklist
  • A Checklist
  • What is the question/issue/problem?

  • Search History
  • Keywords Searched
  • Websites Searched (Health portals, government
    sites, NGO and professional sites)
  • Databases Searched
  • Journals Searched
  • Other Sources

69
References/Sources
  • 1. Victorian Health Promotion Foundation.
    Evidence-based practice in public health and
    health promotion: a two-day professional
    development course for managers and policy
    makers. Melbourne: VicHealth; 2004.
  • 2. Beaglehole R, Bonita R, Kjellstrom T. Basic
    Epidemiology. Geneva: World Health Organisation;
    1993.
  • 3. National Health and Medical Research Council.
    How to use the evidence: assessment and
    application of scientific evidence. Canberra:
    National Health and Medical Research Council;
    2000.