1
The Joys of Critiquing Quantitative and
Qualitative Research
2
The questions
  • Is the research worth putting into practice?
  • How do you know how good the research is?
  • Did the researcher ask the right questions and
    use the right techniques?
  • These are the three main questions to ask of any
    piece of published or unpublished research
    (Tidy, 2000)

3
Principles of Critiquing
  • Research critique, critical appraisal and critical
    evaluation are interchangeable terms
  • At its most basic, critiquing is making a value
    judgement on what is reported (Parahoo 1998)
  • Particular attention is paid to the aim of the
    research, the methodology and the findings
  • An unbiased and non-prejudiced consideration of
    these areas

4
  • Critiquing is not only about describing research
    but also about judging it
  • The quality of the research is closely tied to
    the kinds of decisions the researcher makes in
    conceptualizing, designing, and executing the
    study and in interpreting and communicating the
    study results (Polit and Hungler 1995)
  • Why is this a crucial insight into the skill of
    critiquing?

5
  • It is not uncommon for researchers who make
    different research decisions to arrive at
    different answers to the same question
  • Any examples?
  • Therefore, as a research consumer, you must ask
    some questions
  • What decisions did the researcher make?
  • Because of them - can I trust the process?
  • Why? What other approaches could have been used?
    Would they have yielded more trustworthy results?

6
  • Do the decisions the researcher has taken have any
    impact on the ability of the study to reveal the
    truth?
  • To critique properly, two things are required
  • 1. A set of structured criteria
  • 2. Skill

7
  • Structured Criteria
  • Title of the article
  • Abstract
  • Literature review
  • Methodology
  • Results
  • Discussion and interpretation
  • Recommendations

8
  • Allied to this, consider these recommendations
  • 1. Be sure to comment on the study's strengths as
    well as its weaknesses. Be balanced.
  • 2. Avoid vague generalisations - be specific
  • 3. Justify your criticism - why should things have
    been different?
  • 4. Be objective - don't be overly critical, e.g.
    if you don't like the topic or methodology

9
  • 5. Don't patronise, be sarcastic or be
    condescending. Be constructive - you might be a
    researcher someday!
  • 6. Practically, how might the researcher improve
    upon what has been done?
  • 7. Evaluate all aspects of the study - substance,
    method, interpretation, ethics and presentation
  • (Polit and Hungler 1995)

10
Consider the following questions:
  • Who did what to whom?
  • Why, how and when did they do it?
  • What was the background to the study?
  • What did they find?
  • Was it morally and ethically sound?
11
Title of the article
  • Here there is not really a right or wrong title -
    just a misleading or confusing one. (Parahoo
    1998)
  • So what should we look for?
  • Clarity
  • Accuracy
  • Elegance
  • Precision - phenomena under investigation and the
    population to which it refers

12
Abstract
  • Be fair - don't expect the article in miniature
  • Do expect
  • Short summary
  • The aim of the study
  • The design (inc. methods, sample and sampling)
  • The main findings

13
Literature Review
  • Polit and Hungler's nine-point guidelines
    (1995, p. 579)
  • 1. Does the review seem thorough? All or most of
    the major studies on the subject? Recent
    literature?
  • 2. Overdependence on secondary sources when
    primary sources could have been obtained?

14
  • 3. Overreliance on opinion articles and
    underreliance on research studies?
  • 4. Does the content of the review relate directly
    to the research problem or is it only
    peripherally related?
  • 5. Is the review merely a summary of past work or
    does it critically appraise and compare the
    contributions of key studies? Does it discuss
    weaknesses in existing studies and gaps in the
    literature?
  • 6. Does the review paraphrase adequately, or is
    it a string of quotations from the original
    sources?

15
  • 7. Does the review use appropriate language? Is
    it objective?
  • 8. Is the review well organized? Is the
    development of ideas clear? Does it lay the
    foundation for undertaking the study?
  • 9. Does the review conclude with a brief synopsis
    of the state-of-the-art of the literature on the
    topic?
  • Marley's one-point guideline
  • 1. Is it elegant?

16
Methodology
  • We first need to determine if the study is
    quantitative, qualitative or a mixture of both
  • To be covered later in the session

17
Methodology
  • Clear questions, objectives or hypotheses?
  • Adequate operational definitions?
  • Most appropriate design for the phenomena?
  • How was the data collected? Are the tools valid
    and reliable?
  • Was the instrument borrowed? If borrowed, has it
    been modified? What are its reliability and
    validity now?
18
Validity and reliability
  • Validity - does the measure accurately reflect the
    concept it was intended to measure? e.g. time
    spent in the library is not a valid measure of IQ
  • Judgements of validity draw on face, criterion,
    content, construct, internal and external validity
  • Reliability - the same data would be collected
    each time in repeated observations of the same
    measure - the stability of the measure (a short
    sketch follows below)
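To make the 'stability of measure' idea concrete, here is a
minimal Python sketch - illustrative only, with invented scores
rather than data from the slides or any cited study - that
estimates test-retest reliability as the correlation between two
administrations of the same instrument.

    import statistics

    # Invented scores from ten respondents measured twice with the same instrument.
    time_1 = [12, 15, 11, 14, 18, 16, 13, 17, 15, 14]
    time_2 = [13, 14, 11, 15, 17, 16, 12, 18, 14, 15]

    # Test-retest reliability as the Pearson correlation between the two
    # administrations: values near 1.0 suggest a stable (reliable) measure.
    r = statistics.correlation(time_1, time_2)  # statistics.correlation needs Python 3.10+
    print(f"Test-retest reliability (Pearson r) = {r:.2f}")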

19
Research design
  • The approach should be explained
  • Theoretical or conceptual frameworks explained
  • Aims and objectives clear and consistent with the
    purpose of research
  • Design linked to research questions/aims/objectives
    or hypotheses
  • Pilot should be conducted to test feasibility of
    instruments

20
Data analysis
  • Methods of analysis explained and appropriate for
    type of data
  • Appropriate statistical tests used and correctly
    performed
  • Tables/graphs labelled and understandable
  • Does the qualitative data identify themes, with
    narratives supporting the emergent themes?
  • How was the data validated? Is there any evidence
    of bias?

21
Qualitative Data
  • Has the researcher clearly described their intent?
  • Exactly how will the data be collected?
  • Sampling method - who was selected, and from what
    type of population? Response rates?
  • What steps were taken to ensure rigor -
    transferability, dependability, confirmability?
  • What influence does the researcher have on the
    interaction?
22
Ethical considerations
  • Worthwhile aims and justifiable research
  • Appropriate design
  • No coercion/inducements of subjects
  • Confidentiality protected
  • Informed consent, right to withdraw
  • Risks and discomforts acceptable

23
Results
  • Beware, researchers are often selective in the
    results they publish
  • Refer back to the research questions or
    hypotheses stated at the start. Are these
    specifically addressed here?
  • Do tables and data presentation make sense, or are
    they there to flatter or to confuse?
  • Are statistics consistent, meaningful and correct?

24
  • Results are often expressed numerically
  • NNT (number needed to treat) - e.g. if you have to
    treat 1,000 people to achieve one cure, would you
    consider the treatment worthwhile? It depends on
    context
  • OR (odds ratio)
  • Confidence interval - a statement of how confident
    we are that a result lies between two points; 95%
    is usually acceptable (a worked sketch follows
    below)
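As a worked illustration of these quantities - a sketch only,
using an invented 2x2 outcome table rather than data from any
study discussed here - the snippet below computes an NNT, an odds
ratio and a 95% confidence interval for the difference in cure
rates using the normal approximation.

    import math

    # Invented 2x2 outcome table: rows = treatment/control, columns = cured/not cured.
    cured_treat, not_cured_treat = 30, 70   # 100 people treated
    cured_ctrl, not_cured_ctrl = 20, 80     # 100 people in the control group

    n_treat = cured_treat + not_cured_treat
    n_ctrl = cured_ctrl + not_cured_ctrl
    p_treat = cured_treat / n_treat         # cure rate with treatment
    p_ctrl = cured_ctrl / n_ctrl            # cure rate without treatment

    # NNT: how many patients must be treated to achieve one extra cure.
    arr = p_treat - p_ctrl                  # absolute risk reduction
    nnt = 1 / arr
    print(f"NNT = {nnt:.0f}")

    # OR: odds of cure with treatment relative to the odds without it.
    odds_ratio = (cured_treat / not_cured_treat) / (cured_ctrl / not_cured_ctrl)
    print(f"OR = {odds_ratio:.2f}")

    # 95% confidence interval for the difference in cure rates
    # (normal approximation; 1.96 is the usual 95% multiplier).
    se = math.sqrt(p_treat * (1 - p_treat) / n_treat + p_ctrl * (1 - p_ctrl) / n_ctrl)
    low, high = arr - 1.96 * se, arr + 1.96 * se
    print(f"Risk difference = {arr:.2f}, 95% CI ({low:.2f}, {high:.2f})")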

25
  • For qualitative data
  • What efforts have been made to promote rigor,
    authenticity and objectivity in data analysis?
  • e.g. giving transcripts back to the subjects or
    asking other researchers/experts in the field to
    read them; declaring bias; keeping a running
    record of procedures; preserving the data
  • For more specific information refer to Polit and
    Hungler (1995), pages 585-586

26
Discussion and Interpretation
  • Presentation of results without later
    explanation or interpretation is not only
    worthless, but also incompetent and perhaps even
    deceitful.
  • Again refer back to the research questions or
    hypotheses - is there an intimate link?
  • Is the discussion balanced or selective, perhaps
    to show up positive aspects of the study?
  • Are limitations acknowledged?

27
  • If new treatments are suggested, is there any
    discussion of other issues - it may be quicker and
    cheaper, but is it also more painful, etc.?
  • Are the new findings linked with others in the
    general pool from which the study comes?
  • Is the discussion systematic? Are there any gaps?
  • Is the author's view of the data defensible?
  • Do you agree with the author?
  • Why?

28
Recommendations
  • Again, is there consistency or are the
    recommendations divorced from the rest of the
    study?
  • Are the recommendations for practice, education,
    policy and management appropriate in relation to
    those limitations?

29
The Quantitative - Qualitative Debate
30
Traditionally
  • Quantitative - quantity, numbers
  • Qualitative - quality, description
  • Such a view is unhelpful and too simplistic
  • Allied to this, the terms used in the research
    process when speaking about these two areas often
    only confuse and dishearten

31
Quantitative methods
  • Typically involves selection of samples, allocation
    to groups, introduction of planned changes,
    measurement of variables, control of variables and
    possibly hypothesis testing, or
    quasi-experimentation, which involves selection of
    natural groups e.g. a school class (a sketch of
    random allocation follows below)
  • Uses quantitative analysis - the numerical
    representation and manipulation of observations
    for the purpose of describing and explaining the
    phenomena that those observations reflect
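To illustrate just the 'allocation to groups' step mentioned
above, here is a minimal Python sketch; the participant
identifiers and group labels are invented for the example and are
not taken from the slides.

    import random

    # Invented identifiers for twenty recruited participants.
    participants = [f"P{i:02d}" for i in range(1, 21)]

    # Simple randomised allocation: shuffle the sample, then split it in half.
    random.seed(42)  # fixed seed only so the example is repeatable
    random.shuffle(participants)
    half = len(participants) // 2
    groups = {"intervention": participants[:half], "control": participants[half:]}

    for name, members in groups.items():
        print(name, members)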

32
  • This approach is in contrast with Qualitative
    research.
  • Common assumptions here are that phenomena are
    studied from the perspective of the respondent
    and in the natural setting.
  • Inherent in many qualitative approaches, especially
    from nurse phenomenologists, is the suggestion that
    phenomena cannot be studied objectively and that
    qualitative research offers a more complete
    description of the phenomena than does the
    objective observation of its signs and symptoms.

33
Horses for courses
  • The methods employed must be appropriate for the
    questions to be answered
  • Ethical and practical considerations also matter -
    e.g. in studying child-rearing practices, could you
    really experiment, e.g. withholding privileges v.
    physical punishment?

34
Methods of data collection
  • Selecting methods is not a value-free decision.
  • Why?
  • Choices reflect the beliefs of the researcher
    about the phenomena under investigation and the
    most effective way of getting at its truth

35
What do the following methods say about phenomena?
  • Questionnaires
  • Observation schedules
  • Rating scales
  • Structured interviews
  • Un-structured interviews
  • Experiments

So when you read research try to look very
closely at the methodology to determine what the
researcher views the phenomena to be and to
determine the research paradigm from which the
study comes
36
Techniques of data analysis
  • Quantitative analysis - descriptive statistics,
    inferential statistics, probability, chance,
    degrees
  • Qualitative analysis - making sense, finding
    structure, themes, categories, interpretation
    (a sketch of theme tallying follows below)
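For the qualitative side, here is a minimal Python sketch of
grouping coded interview excerpts under emergent themes and
tallying them; the excerpts and theme names are invented for the
example.

    from collections import Counter, defaultdict

    # Invented (excerpt, theme) pairs assigned during coding of interview data.
    coded_segments = [
        ("I never knew who to ask for help", "support"),
        ("The ward was always short-staffed", "workload"),
        ("My mentor made time for me", "support"),
        ("We were constantly rushed", "workload"),
        ("Nobody explained the paperwork", "communication"),
    ]

    # Group the supporting narratives under each emergent theme.
    by_theme = defaultdict(list)
    for excerpt, theme in coded_segments:
        by_theme[theme].append(excerpt)

    # Count how often each theme occurs across the coded segments.
    counts = Counter(theme for _, theme in coded_segments)
    for theme, n in counts.most_common():
        print(f"{theme} ({n} segments): {by_theme[theme]}")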

37
Common Statistical Tests
38
  • Descriptive Statistics
  • Frequency
  • Central tendency
  • Dispersion
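A minimal Python sketch of these three kinds of descriptive
summary, using only the standard library and invented
questionnaire scores:

    import statistics
    from collections import Counter

    scores = [3, 5, 4, 4, 2, 5, 3, 4, 5, 1]  # invented questionnaire scores

    # Frequency: how often each value occurs.
    print("Frequencies:", Counter(scores))

    # Central tendency: where the middle of the data lies.
    print("Mean:", statistics.mean(scores))
    print("Median:", statistics.median(scores))
    print("Mode:", statistics.mode(scores))

    # Dispersion: how spread out the values are.
    print("Range:", max(scores) - min(scores))
    print("Standard deviation:", round(statistics.stdev(scores), 2))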

39
  • Inferential Statistics
  • Here the desire is to generalise the findings from
    samples to equivalent populations - that's all
    there is to it! (a short sketch follows below)
  • ANOVA
  • T-tests
  • Correlations
  • Chi-square
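As a hedged illustration of how such tests are often run in
practice - a sketch using SciPy with invented numbers, not an
analysis of any study discussed in these slides:

    from scipy import stats

    # Invented scores for three independent groups.
    group_a = [5.1, 4.8, 5.5, 5.0, 4.9, 5.3]
    group_b = [4.2, 4.5, 4.1, 4.6, 4.3, 4.4]
    group_c = [4.7, 4.9, 5.0, 4.8, 5.1, 4.6]

    # Independent-samples t-test: do the means of two groups differ?
    t, p = stats.ttest_ind(group_a, group_b)
    print(f"t-test: t = {t:.2f}, p = {p:.3f}")

    # One-way ANOVA: do the means of three or more groups differ?
    f, p = stats.f_oneway(group_a, group_b, group_c)
    print(f"ANOVA: F = {f:.2f}, p = {p:.3f}")

    # Pearson correlation between two invented paired variables.
    hours_study = [2, 4, 6, 8, 10, 12]
    exam_score = [52, 58, 61, 70, 74, 79]
    r, p = stats.pearsonr(hours_study, exam_score)
    print(f"correlation: r = {r:.2f}, p = {p:.3f}")

    # Chi-square test of independence on an invented 2x2 table.
    chi2, p, dof, expected = stats.chi2_contingency([[30, 70], [20, 80]])
    print(f"chi-square: chi2 = {chi2:.2f}, p = {p:.3f}, df = {dof}")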

40
Questions to ask after reading a research report
  • Are you sure the results are valid and reliable?
  • Are there other studies which support or
    contradict the findings?
  • Have you looked at relevant research conducted
    since the report you have been reading was
    completed?

41
  • How practical will it be to implement findings
    locally?
  • Costs/benefits - hidden costs too e.g. training,
    staff anxiety, timeliness
  • Why should there be a change? Will it improve
    patient care or service delivery?
  • May involve audit of existing practice and pilot
    of new practice

42
It is very important that you are able to separate
the wheat from the chaff, even if you never grow
wheat on your own.