NSR 338: Research in Nursing Dennis Ondrejka, Ph.D., R.N. - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: NSR 338: Research in Nursing Dennis Ondrejka, Ph.D., R.N.


1
NSR 338: Research in Nursing
Dennis Ondrejka, Ph.D., R.N.
  • 702-833-3909 office
  • d.ondrejka@denverschoolofnursing.org

2
  • Introductions

3
What is Research?
  • Process of searching for new knowledge about
    phenomena
  • Validates and refines existing knowledge (Burns & Grove, 2007)
  • Systematic process of inquiry or study
  • Builds new knowledge through the dissemination of
    findings

4
EBP Definition
  • "the integration of current best evidence with clinical expertise and patient values" (Sackett et al., 2000)
  • "a framework for clinical practice that incorporates the best available scientific evidence with the expertise of the clinician and the patient's preferences and values to make decisions about health care" (Levin, 2006)

5
Evidence-Based Practice Qs.
  • Population
  • Intervention
  • Comparison
  • Outcome
  • Time (new)

6
What is Evidence Based Nursing Practice?
7
Why Research???
  • Description
  • To identify and understand the nature of nursing
    phenomena
  • Explanation
  • Clarifies the relationship among phenomena, and
    why certain events occur

8
Why Research???
  • Prediction
  • This allows us to estimate the probability of a
    specific outcome in a given situation
  • Control
  • If we can predict, the next goal would be to
    control or manipulate the situation to produce
    the desired outcome.

9
Ways We Acquire Knowledge
  • Tradition
  • Authority
  • Borrowing
  • Trial and error
  • Personal experience
  • Role-modeling and mentoring
  • Intuition
  • Reasoning
  • Inductive
  • Deductive
  • Rational
  • Unstructured
  • Research
  • Quantitative
  • Qualitative
  • Mixed / Other

10
Research Process
  • Research is a systematic, diligent inquiry that is necessary to address:
  • What needs to be known: what is the question, hypothesis, or interest area
  • What research methods are needed to examine this question or phenomenon
  • What meaning can be extracted from the study through data analysis to build our knowledge base of that subject
  • Generate outcomes and disseminate new knowledge

11
Study Components
  • Need an area of concern worthy of additional
    knowledge (Ch. 1)
  • Introduction, Background of the study, Study
    significance, Question specified, Definition of
    terms, Overview
  • Literature Review (Ch. 2)
  • Know what has been published on the subject to
    this point
  • Develop your Conceptual Framework
  • Many Ch. 1 issues need literature support

12
Study Components page 2
  • Projects end with Ch. 3, Project Strategy and
    Outcomes Expected or Attained
  • Methodology, Design, Approach, Rigors (Ch.3)
  • Clarifies for the reader the type of design being
    used and why. This too has literature supporting
    why it is being used
  • Analysis of Data (Ch. 4)
  • Explores an approach to examining the data and
    how it addresses the question(s) being asked
  • Tools used to help analyze the data

13
Study Components-3
  • Data discussion and implications (Ch. 5)
  • Continue to examine the literature after the data are collected to see if new perspectives can be addressed
  • How the meaning of these results relates to the conceptual framework (CF)
  • Implications of the results for those who need this information
  • Weaknesses, limitations, and implications for further study

14
Positivistic versus Naturalistic Inquiry
  • This is a 100-year-old debate
  • It is often correlated with research methodology
  • It is a philosophy about the way we think about research on human phenomena
  • The two can be integrated within a methodology, but philosophically they are very different
  • Is the foundation for how we design research

15
Positivistic Inquiry vs. Naturalistic Inquiry (Constructivism)
  • Positivistic Inquiry (Quantitative)
  • Solomon Design: four-group design (pretest-treat-post test; pretest-no treat-post test; no pre-no treat-post test; random group assignment)
  • Quasi-Experimental: validated tools; two of three experimental controls
  • Descriptive
  • Experimental Design: random sample; control group; a treatment given
  • Triangulated / Blended Designs: use quantitative and qualitative methods
  • Naturalistic Inquiry / Constructivism (Qualitative)
  • Post-modern: research self; novel sounding; lacks theory
  • Grounded Theory: theory building; Basic Social Process
  • Phenomenology: descriptive; interpretive; hermeneutic
  • Ethnography: living in the experience; cultural immersion
  • Case Study: single-double cases; in-depth analysis; comparative analysis
  • Action Research: adequate time commitment; collaborative effort; openness to change; quality of data collection and analysis; impact on one's practice
16
Positivistic Inquiry vs. Naturalistic Inquiry (Constructivism): Scientific Rigors by Design
  • Positivistic designs (Quantitative): Solomon Design; Quasi-Experimental; Descriptive; Experimental Design
  • Triangulated / Blended Designs
  • Naturalistic designs (Qualitative): Post-modern; Grounded Theory (Constant Comparative Analysis); Phenomenology; Ethnography; Case Study
  • Positivistic rigors: validity; reliability (internal-external); conceptual framework developed; statistical inference; generalizability; temporality; selection and bias; measurement validity/reliability; controlling confounders; appropriate study design for the questions
  • Naturalistic rigors: descriptive vividness; methodological congruence; analytical preciseness; theoretical connectedness; heuristic relevance; trustworthiness, credibility, and auditability; confirmability and transferability; stylistic and personal relevance
17
Sample Size by Design: Positivistic Inquiry vs. Naturalistic Inquiry (Constructivism)
  • Positivistic (Quantitative): Solomon Design - power analysis; Quasi-Experimental - >40; Descriptive - 10-1000; Experimental Design - power analysis
  • Triangulated / Blended Designs: 20-40
  • Naturalistic (Qualitative): Post-modern - 1; Grounded Theory - 10 to saturation; Phenomenology - 10-30; Ethnography - 1-12; Case Study - 1-2; Action Research - ? to 100
18
Assumptions of Positivistic Thinking page 1
  • Reality is singular, tangible, and can be
    dissected
  • The researcher and those being studied are
    independent
  • Time and context-free generalizations are
    possible
  • Inquiry is value-free

(Diagram of positivistic thinking: singular reality, value free, independent variables, generalizable)
19
Assumptions of Positivistic Thinking
  • There are real causes or at least high
    probability of a relationship.
  • We believe we can have independent and dependent
    variables as separate entities
  • Validity of a design is very critical to results

(Diagram of positivistic thinking: singular reality, value free, cause and effect, validity, independent variables, generalizable)
20
Assumptions of Positivistic Thinking page 3
  • Reliability is based on how the design is
    reproducible
  • Generalizability is related to good internal
    validity and reliability with comparable samples
  • Hypothesis testing

(Diagram of positivistic thinking: value-free, reliability, hypothesis testing, singular reality, cause and effect, validity, generalizable, independent variable)
21
Assumptions of Naturalistic Inquiry
  • Realities are multiple, pluralistic, and holistic
  • The researcher cannot really be separated from those being studied, and relationships are explained
  • Hypotheses are time and context bound; they are only working statements

(Diagram of naturalistic inquiry: multiple realities, hypothesis is a focus area, researcher and subject connected)
22
Assumptions of Naturalistic Inquiry
  • All entities are in a state of mutual
    simultaneous shaping
  • Inquiry is value-bound
  • Validity is designed into the process
  • Reliability and generalizability are not concepts of value within this thinking

(Diagram of naturalistic inquiry: multiple realities, inquiry is value bound, hypothesis is a focus area, researcher and subject connected, thick description)
23
Differences in Scientific Rigor: Positivistic vs. Naturalistic
  • Positivistic rigors
  • Validity
  • Internal and external reliability
  • Hypothesis testing
  • Statistical inferences
  • Independent and dependent variables
  • Variable controls
  • Generalizability
  • Naturalistic rigors
  • Descriptive vividness
  • Methodological congruence
  • Analytical preciseness
  • Theoretical connectedness
  • Heuristic relevance
  • Others

24
Data Collection Differences: Positivistic vs. Naturalistic
  • Positivistic
  • Tools: surveys, questionnaires
  • Objective assessment and identification
  • Measure the dependent variable
  • Convert to numeric symbols
  • Apply statistical inferences to numbers
  • Large sample sizes help with confidence levels
  • Naturalistic
  • Tool: it is the investigator, by interview, focus groups, observation
  • Data are subjective and objective; they are collected, not measured
  • Themes or clusters are identified and data are sorted in a theme analysis
  • The themes are supported by participants or experts

25
Differences in Results: Positivistic vs. Naturalistic
  • Naturalistic
  • The exploration and description of a phenomenon
  • Identification of linkages, relationships, or interpretations based on theory connections
  • Results are themes, clusters of ideas, or theory constructs
  • Positivistic
  • Statistical significance for pre-post treatment
  • Statistical correlations and relationships identified
  • Probability of errors and confidence identified
  • Causal relationships

26
Positivistic Discussion of Results
  • 250 nurses were surveyed with an 80% response rate, or N = 200. Questions were rated on a 5-point Likert scale. Question 1 had a mean of 4.2 with a S.D. of 0.5, suggesting the nurses had favorable opinions about continuing education. Compared to a 1994 survey asking the same question, there was a statistically significant difference: the earlier responses were less favorable (mean = 3.1, S.D. = 0.7, p < .05).
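As a rough, hedged illustration only, the sketch below computes a two-sample comparison from summary statistics like the ones quoted above. The 4.2 / 0.5 / 200 values come from this slide; the 1994 sample size of 180 is an invented placeholder, since the slide does not report it.

```python
# Hedged sketch: Welch's t-test computed from summary statistics alone.
# The 1994 sample size (nobs2=180) is an assumed placeholder for illustration.
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=4.2, std1=0.5, nobs1=200,   # current survey (favorable opinions)
    mean2=3.1, std2=0.7, nobs2=180,   # 1994 survey (assumed N)
    equal_var=False,                  # Welch's test: no equal-variance assumption
)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < .05 would support the reported difference
```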

27
Naturalistic Description
  • I sat in the classroom as a peripheral member
    staying as unobtrusive as possible. The
    instructor came out from behind her desk, sitting
    on the edge as she opened with a question that
    brought all eyes in the room to meet her own
    eyes. She paused - looked at the eyes of the
    students.
  • The instructor displayed immediacy from the
    moment she started the class.

28
Importance of Knowing Positivistic
versus Naturalistic Inquiry
  • If you believe in the holistic view of human
    interaction and phenomenon, you will never
    satisfy the positivistic requirements.
  • If you want to texturize hard data findings, you
    need another scientific rigor such as
    naturalistic data.
  • Even the current science of quantum physics and
    chaos theory requires a revised thinking - an
    inquiry that addresses the subjective.

29
Ethics and Research
  • Starts with the study purpose, design, methods of
    measurement, and subjects
  • Guidelines for all of these
  • It is still a concern today
  • More recent ethical issues are
  • Fabrication of a study
  • Falsification or forging of data
  • Dishonest manipulation of the design or methods
  • Plagiarism
  • 50 of the top 50 research institutions in the US have been investigated for research fraud

30
Ethical Problems in History (http://helix.nih.gov:8001/ohsr/mpa/45cfr46.php3)
  • Nazi medical experiments (1933-1945)
  • Tuskegee syphilis study by the USPHS (1932-1972)
  • Willowbrook study (1950-1970): hepatitis study
  • Jewish Chronic Disease Hospital study with live cancer (CA) cells in the 1960s

31
Ethical Problems in History
  • University / Atomic Energy / Government experiments
  • 18 men and women were injected with plutonium to determine body distribution (at the time said to be terminal), 1945-47
  • 20 subjects ages 63-83 were given doses of radioactive radium and thorium, injected or oral, 1961-65
  • 64 male inmates at Washington State Prison received testicular radiation to determine the smallest dose that makes someone sterile, 1963-70
  • 125 intellectually disabled residents were fed radioactive iron and calcium to see if a diet rich in cereal would block the digestion of those two minerals, 1946-56

32
Nuremberg Code-1949
  • Voluntary consent
  • Must yield fruitful results for society
  • Anticipated results justify the type of
    experiment
  • Avoids all unnecessary physical-mental injury
  • Cannot do studies with a known risk of injury or death unless the experimenting physician is also a subject
  • Risk does not outweigh the humanitarian benefit
  • Proper precautions to prevent injury, disability, or death
  • Conducted by qualified persons
  • Subjects can always stop the study
  • Researcher must always be ready to stop the study
    (risk)

33
Declaration of Helsinki-1964-84
  • Differentiated therapeutic vs. nontherapeutic
    research
  • Clinical vs. Basic
  • Greater care to protect subjects in
    nontherapeutic research
  • There must be a strong, independent justification for exposing a healthy volunteer to substantial risk
  • The investigator is to protect the health and
    life of research subjects

34
The Belmont Report: Three Ethical Principles
  • Principle of respect for persons
  • Right to self determination and freedom to
    participate or not
  • Principle of Beneficence
  • Do no harm to others
  • Principle of Justice
  • Treat everyone fairly without discrimination
  • Led to USDHHS Code on Ethics
  • Title 45, Part 46 (45 CFR 46)
  • Office of Human Subjects Research (OHSR) within
    NIH
  • http://helix.nih.gov:8001/ohsr

35
Institutional Review Board
  • IRB review process: 4-6 weeks
  • Consent forms (voluntary subjects)
  • Disclosure forms
  • Confidentiality
  • Compensation disclosure
  • Ethics documented in the research
  • Accountability to rules, regulations, and legal
    entities

36
The Literature Review
  • Primary Sources
  • Secondary Sources
  • Theoretical literature
  • Empirical literature
  • Integrative reviews (Evidence Based Research)
  • www.best4health.com/
  • www.cochrane.org/
  • www.guideline.gov
  • http://www.cebm.utoronto.ca/resources/websites.htm
  • www.ahcpr.gov/clinic/
  • http://www.crd.york.ac.uk/crdweb/

37
Definition of a Literature Review
  • A systematic and explicit approach to the identification, retrieval, and bibliographical management of independent studies; locating information; synthesizing; and developing guidelines

38
Purposes of the Lit. Review
  • Facilitate development of the Conceptual
    Framework by summarizing knowledge
  • Clarify the research topic
  • Clarify the research problem
  • Verify the significance of the research problem
  • Specify the purpose of the study
  • Describe relevant studies or theories
  • Develop definitions of major variables
  • Select a research design, data measurement, data collection and analysis, and interpretation of findings

39
Literature Searches
  • EBSCOhost with CINAHL: http://search.ebscohost.com
  • Log in: DSN
  • Password: evidence

40
Research Process
  • Research is a systematic, diligent inquiry that is necessary to address:
  • What needs to be known: what is the question, hypothesis, or interest area
  • What research methods are needed to examine this question or phenomenon
  • What meaning can be extracted from the study through data analysis to build our knowledge base of that subject
  • Generate outcomes and disseminate new knowledge

41
Understanding Research Design
  • Can have confusing terms
  • Research Methodology
  • The entire process from question to analysis
  • Research Design
  • Clearly defined structures within which the study
    is implemented
  • Is a large blueprint, but must be tailored to the
    study and then mapped out in detail

42
Numbers and Use of Numbers
  • Nominal (qualitative)
  • A named category given a number for convenience, e.g., males are 1 and females are 2
  • Ordinal (qualitative)
  • A scale that is subjective but shows a direction, e.g., pain scale, cancer staging
  • Interval (quantitative)
  • Numbers where the interval between them is meaningful, e.g., a temperature
  • Ratio (quantitative)
  • Numbers where the ratio of one to another has meaning, e.g., a pulse or heart rate
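A minimal sketch (not from the slides) of how the level of measurement constrains the summaries you can report; the variable names and values below are invented for illustration.

```python
import pandas as pd

# Invented patient data illustrating the four levels of measurement
df = pd.DataFrame({
    "sex":        [1, 2, 2, 1, 2],                 # nominal: 1 = male, 2 = female (labels only)
    "pain_score": [3, 7, 5, 2, 8],                 # ordinal: order matters, distances do not
    "temp_c":     [36.8, 38.1, 37.2, 36.5, 39.0],  # interval: differences are meaningful
    "pulse":      [72, 95, 80, 66, 110],           # ratio: true zero, ratios are meaningful
})

print(df["sex"].value_counts())   # nominal  -> counts / proportions only
print(df["pain_score"].median())  # ordinal  -> median, percentiles
print(df["temp_c"].mean())        # interval -> means and standard deviations
print(df["pulse"].mean() / 60)    # ratio    -> ratios such as beats per second are valid
```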

43
Quantitative Designs
  • What are the four types of
  • Quantitative Designs?

44
Quantitative Designs
  • Experimental
  • Quasi-experimental
  • Descriptive
  • Correlational
  • Aim to describe, compare, and predict in order to
    understand or control phenomena

45
Quantitative Designs
  • What characterizes true Experimental Research
    Designs?

46
True Experimental Research Designs
  • Are characterized by
  • Random assignment of subjects to groups
  • Comparison of treatment group(s) with a
  • Control or business as usual group

47
True Experimental Research Designs (cont.)
  • Also characterized by
  • Strict control of extraneous variables
  • to obtain true representation of cause
  • and effect
  • Note: use causality language with caution (there is always a P-value)
  • Example: smoking and cancer

48
Randomized Controlled Clinical Trials (RCT)
  • True Experimental Design
  • Large N (number of subjects)
  • Draw subjects from a reference population
  • Randomly assign subjects to the treatment/experimental or control group
  • Examine for baseline equivalence
  • Multiple sites used for generalizability
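A minimal sketch, under invented data, of the random-assignment and baseline-equivalence steps described above; the subject IDs and ages are placeholders.

```python
import random
from statistics import mean

# Invented reference population: (subject_id, baseline_age)
subjects = [(i, random.randint(40, 80)) for i in range(1, 101)]

random.shuffle(subjects)           # randomize the order of the drawn subjects
treatment = subjects[:50]          # random assignment: first half to treatment
control   = subjects[50:]          # second half to control ("business as usual")

# Crude baseline-equivalence check: group means should be similar after randomization
print("treatment mean age:", mean(age for _, age in treatment))
print("control mean age:  ", mean(age for _, age in control))
```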

49
Quasi-Experimental Research Designs
  • Are characterized by
  • Treatment or intervention
  • Comparison of treatment group(s) with a
  • control or business as usual group
  • Non-equivalence of groups: not randomly assigned; group assignment often evolves naturally (convenience sampling)
  • Example: patients on one unit compared to patients on another

50
Quasi-Experimental Research Designs (cont.)
  • Also are characterized by
  • Aiming to represent cause and effect in
    situations where less control over variables
    exists
  • Most frequently used design in nursing

51
Correlational Designs
  • Descriptive correlational designs
  • Used to describe variables and to examine
    relationships between or among variables
  • Predictive correlational designs
  • Used to predict value of one variable based on
    values obtained for another variable
  • Independent variable used to predict the dependent variable (regression; see the sketch after this list)
  • Model-testing design
  • Looks at relationships among a number of variables
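A minimal sketch of a predictive correlational analysis, regressing an invented dependent variable (anxiety score) on an invented independent variable (hours of sleep); both variables are placeholders, not study data.

```python
from scipy.stats import linregress

# Invented data: hours of sleep (independent) and anxiety score (dependent)
sleep_hours   = [4, 5, 6, 6, 7, 8, 8, 9]
anxiety_score = [9, 8, 7, 6, 5, 4, 4, 2]

result = linregress(sleep_hours, anxiety_score)
print(f"slope = {result.slope:.2f}, r = {result.rvalue:.2f}, p = {result.pvalue:.4f}")

# Predict the dependent variable for a new value of the independent variable
predicted = result.intercept + result.slope * 7.5
print(f"predicted anxiety at 7.5 h of sleep: {predicted:.1f}")
```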

52
Correlational Designs
  • Descriptive correlational designs
  • Used to describe variables and to examine
    relationships between or among variables
  • Predictive correlational designs
  • Used to predict value of one variable based on
    values obtained for another variable
  • Independent variable used to predict Dependent
    variable

53
Quantitative Design Concerns
  • Primary purpose (check question)
  • Is there a treatment (intervention)
  • Will the treatment be controlled
  • Is there a control (untreated) group
  • Is there a pre or post test (or both)
  • Is sample random
  • Will sample be a single group or divided into
    several groups

54
Quantitative Design Concerns-2
  • How many groups will there be
  • What is the size of each group
  • Will groups be randomly assigned
  • Will there be repeated measurements over time or
    will the data be collected cross-sectionally at
    one or two points in time
  • Have extraneous variables been identified and
    controlled for
  • What strategies are being used to compare
    variables or groups

55
Research Question Considerations
  • Ethics
  • Significance
  • Motivation
  • Qualifications
  • Feasibility

56
Hypotheses and Research Qs
  • Hypotheses: intelligent guesses about predicted relationships
  • Problem statement: what the issue/concern/problem is and why it should be addressed
  • Research questions: the burning question

57
What are Criteria for Hypotheses?
  • Declarative
  • Written in present tense
  • Include population
  • Identify variables
  • Reflect the problem/concern
  • Are empirically testable

58
Types of Hypotheses Simple Complex
  • Simple
  • One Independent Variable (IV) and one Dependent
    Variable (DV)
  • Complex
  • Two or more IVs, two or more DVs, or
  • both, being investigated at same time

59
Types of hypotheses Directional Non-directional
  • Directional propose relationship and direction
    (e.g., Treatment group will have/be greater than,
    more than, higher than )
  • Non-directional propose relationship but not
    direction

60
Name that Hypothesis 1
  • Average length of gestation is shorter for
    infants of mothers who use cocaine than among
    mothers who use alcohol during the last six
    months of pregnancy.
  • Population? IV? DV?
  • Simple or complex? Directional?

61
Name that Hypothesis 2
  • The greater the degree of sleep deprivation, the
    higher the anxiety levels of intensive care unit
    patients.
  • Population? IV? DV?
  • Simple or complex?
  • Directional? Non-directional?

62
Name that Hypothesis 3
  • The total weight loss of overweight elementary students who follow a reduced-calorie diet and exercise 20 minutes four times a week will be greater than that of students who do not follow a reduced-calorie diet and do not exercise 20 minutes four times a week.
  • Population? IV? DV?
  • Simple or complex?

63
Name that Hypothesis 4
  • The degree of stress reported by flight-for-life
    nurses is greater than the degree of stress
    reported by ICU nurses.
  • Population? IV? DV?
  • Simple or complex?

64
Name that Hypothesis 5
  • More domestic violence and levels of anger are
    reported by veterans who served in the military
    in Iraq compared to those in the military who
    served in Afghanistan.
  • Population? IV? DV?
  • Simple or complex?

65
Sample of Research Topic Q.
  • Topic Adolescent sexuality
  • Problem statement (e.g., pregnancy rates in the US are much higher than in most Western countries)
  • Research Q
  • Will high school adolescent males report higher
    levels of comfort with their own sexuality than
    will females?
  • Hypothesis
  • Adolescent males in grades 9-12 will report higher levels of comfort with their own sexuality than will females.

66
Quantitative Design Concerns
  • Primary purpose (check question)
  • Is there a treatment (intervention)
  • Will the treatment be controlled
  • Is there a control group (untreated)
  • Is there a pre or post test (or both)
  • Is the sample a random sample
  • Will the sample be a single group or divided into
    several groups

67
Quantitative Design Concerns-2
  • How many groups will there be
  • What is the size of each group
  • Will groups be randomly assigned
  • Will there be repeated measurements
  • Will the data be collected cross-sectionally or
    over time
  • Have extraneous variables been identified and
    controlled for
  • What strategies are being used for comparison of
    variables or groups

68
Study Validity
  • Definition: an examination of the approximate truth or falsity of the propositions
  • Statistical Validity
  • Internal Validity
  • Construct Validity
  • External Validity
  • (Cook and Campbell, 1979)

69
Statistical Validity Errors
  • Violate assumptions about the data
  • Nominal, ordinal, interval, ratio data
  • Type I and Type II errors
  • Need for power analysis (see the sketch after this list)
  • Predicts the necessary N value
  • Inappropriate use of certain statistics for the
    various types of data
  • Random irrelevancies in setting
  • Random heterogeneity of respondents
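A minimal sketch of a power analysis for a two-group comparison; the effect size, alpha, and power values below are conventional assumptions, not figures from the course.

```python
from statsmodels.stats.power import TTestIndPower

# Assumptions for illustration: medium effect size (Cohen's d = 0.5),
# alpha = .05 (Type I error risk), power = .80 (1 - Type II error risk)
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"required N per group: {n_per_group:.0f}")   # roughly 64 per group under these assumptions
```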

70
Statistical Conclusion Validity: Type I and Type II Errors
  • Reality: there is no difference. Accepting the null hypothesis is the wanted decision; rejecting it is a Type I error, often caused by "fishing".
  • Reality: there is a difference. Rejecting the null hypothesis is the wanted decision; accepting it is a Type II error, often caused by a low N value.

71
Internal Validity
  • Definition
  • It is the extent to which the effects detected
    in the study are a true reflection of reality
    rather than the result of extraneous variables
  • The independent variable did have an impact
    on the dependent variable

72
Threats to Internal Validity
  • History: natural events over time impacting the subjects
  • Maturation: a person's growth in any area impacting his/her response
  • Testing: effect caused by subjects remembering previous testing
  • Instrumentation: reliability of the treatment and measures
  • Selection process (randomization)
  • Mortality threat
  • Interaction with subjects
  • No equalization of treatment

73
External Validity
  • Definition
  • To provide development of the design that allows
    it to be generalized beyond the sample used in
    the study.
  • The most serious threat is that findings can only be said to apply to the group being studied

74
Threats to External Validity
  • Small N
  • No randomization when it is needed
  • Special probability sampling
  • Simple (numeric randomization)
  • Stratified (2 or more strata of the pop)
  • Clustered (random groups vs individuals)
  • Systematic (every nth person)
  • Single vs. replicated

75
Factors Influencing Sample Size
  • Effect Size
  • The degree to which the phenomenon is present in
    the population or to which the null hypothesis is
    false.
  • It is hard to detect an effect from an
    intervention if the sample is small
  • Type of study conducted
  • Case study, phenomenology, experimental,
    Descriptive

76
Factors Influencing Sample Size
  • The number of variables
  • This requires a power analysis to determine the
    necessary N
  • Measurement Sensitivity
  • The ability of the measurement to find what it
    thinks it is finding.
  • Data Analysis Techniques
  • The various statistics can impact the number of
    subjects needed.

77
Types of Probability Sampling
  • Simple Random Sampling
  • Stratified Random Sampling
  • Cluster Sampling
  • Systematic Sampling
  • Random Assignment to Groups
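A minimal sketch contrasting three of these probability sampling approaches over an invented sampling frame of 100 subject IDs; the frame and strata are placeholders.

```python
import random

frame = list(range(1, 101))          # invented sampling frame of 100 subject IDs
n = 10                               # desired sample size

simple_random = random.sample(frame, n)      # simple random sampling

k = len(frame) // n                          # systematic sampling: every k-th person
start = random.randrange(k)
systematic = frame[start::k]

# Stratified sampling: sample within each stratum (equal allocation for simplicity)
strata = {"unit_A": frame[:60], "unit_B": frame[60:]}
stratified = {name: random.sample(members, n // 2) for name, members in strata.items()}

print(simple_random)
print(systematic)
print(stratified)
```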

78
Types of Nonprobability Sampling
  • Convenience (Accidental) Sampling
  • Quota Sampling
  • Purposive Sampling
  • Network Sampling
  • Theoretical Sampling

79
Theoretical Sampling
(Diagram: Purposive Sampling (Non-Randomized), Theoretical Sampling, Convenience Sampling)
80
Problem Statements-Questions
  • The problem dictates the design
  • What is experience of police officers who were
    wounded in the line of duty related to their
    ability to return to work?
  • What are the unique features of hospitals that have NPs conducting all surgical admission assessments?
  • There is (is no) statistically significant difference in iatrogenic diseases between nurse-to-patient ratios of 1:5 vs. 1:8 on general medical units.
  • Does the birthing center philosophy show a relationship to the type of care provided and, if so, what is the relationship?

81
Problem Revisions
  • I am curious about the standardized treatment protocols for circumcision of a newborn.
  • NEXT REVISION
  • NEXT REVISION
  • NEXT REVISION
  • NEXT REVISION

82
Dependent-Independent Variables
  • The intervention = independent variable
  • The impact or outcome = dependent variable
  • Find the independent and dependent variables:
  • Jan is studying the effectiveness of using a Multiple Intelligence course on nurse managers for increasing managerial proficiency
  • Jean is wondering if adding a course on Conflict Resolution and Power will change nurses' ability to deal with day-to-day conflict on units.
  • Jay believes that adding contract grading and a
    student selected project will increase student
    learning

83
Dependent-Independent Variables continued
  • John has been questioning if self-esteem in
    teenagers is impacted by grouping students
    according to gender.
  • Judy believes that the taillight not working is caused by either the bulb or the connection.

84
Caution Areas
  • You see what you look for
  • You look for what you know
  • Appropriate statistical strategies for certain
    types of numbers

85
Dealing With Data
  • Developing Data Collection Forms
  • Planning Data Collection Process
  • Planning the Organization of Data
  • Planning Data Analysis
  • Planning Interpretation Communication of
    Findings
  • Evaluation of the Plan

86
Data Collection Tasks
  • Recruiting Subjects
  • Maintaining Consistency
  • Maintaining Controls
  • Protecting Study Integrity
  • Problem-Solving

87
Data Collection Problems
  • People Problems
  • Researcher Problems
  • Institutional Problems
  • Event Problems
  • Measurement Validity
  • Measurement Reliability

88
Computer Support for Data
  • Data Input
  • Data Storage
  • Data Retrieval
  • Statistical Analysis

89
Numbers and Use of Numbers
  • Nominal (qualitative)
  • A named category given a number for convenience, e.g., males are 1 and females are 2
  • Ordinal (qualitative)
  • A scale that is subjective but shows a direction, e.g., pain scale, cancer staging
  • Interval (quantitative)
  • Numbers where the interval between them is meaningful, e.g., a temperature
  • Ratio (quantitative)
  • Numbers where the ratio of one to another has meaning, e.g., a pulse or heart rate

90
Bivariate Data Analysis Independent Groups
  • Nominal Data
  • Chi-squared (two or more samples)
  • Phi (two samples)
  • Cramer's V (two samples)
  • Contingency Coefficient (two samples)
  • Lambda (two samples)

91
Bivariate Data Analysis Independent Groups
  • Ordinal Data
  • Mann-Whitney U
  • Kolmogorov-Smirnov (two-sample test)
  • Wald-Wolfowitz Run Test
  • Spearman Rank-Order Correlation
  • Kendall's Tau
  • Kruskal-Wallis One-Way Analysis of Variance by Rank (three or more samples)

92
Bivariate Data Analysis Independent Groups
  • Interval or Ratio Data
  • t Test for independent samples
  • Pearson's Correlation
  • Analysis of Variance (Two or more samples) ANOVA
  • Simple Regression
  • Multiple Regression Analysis (two or more samples)
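A minimal sketch showing how the level of measurement selects among these independent-groups tests; the data below are invented, and each call is a standard SciPy routine.

```python
import numpy as np
from scipy.stats import chi2_contingency, mannwhitneyu, ttest_ind

# Nominal data: 2x2 contingency table of counts (e.g., unit by complication yes/no) -> chi-squared
table = np.array([[12, 30], [20, 25]])
chi2, p_nominal, dof, expected = chi2_contingency(table)

# Ordinal data: two independent groups of pain scores -> Mann-Whitney U
group_a = [2, 3, 3, 5, 6, 7]
group_b = [4, 5, 6, 7, 8, 8]
u_stat, p_ordinal = mannwhitneyu(group_a, group_b)

# Interval/ratio data: two independent groups of pulse rates -> t test for independent samples
pulses_a = [70, 72, 68, 75, 71]
pulses_b = [78, 80, 76, 83, 79]
t_stat, p_interval = ttest_ind(pulses_a, pulses_b)

print(p_nominal, p_ordinal, p_interval)
```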

93
Bivariate Data Analysis Dependent Groups
  • Nominal Data
  • McNemar Test
  • Cochran Q Test (three or more samples)
  • Ordinal Data
  • Sign Test
  • Wilcoxon Matched-pairs, Signed-Ranks
  • Friedman Two-Way Analysis of Variance by Ranks
    (for three or more samples)

94
Bivariate Data Analysis Dependent Groups
  • Interval or Ratio Data
  • t Test for Related Samples
  • Analysis of Covariance (for three or more
    samples) ANCOVA
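A minimal sketch of the dependent-groups (paired) counterparts, using invented pre/post scores for the same subjects.

```python
from scipy.stats import wilcoxon, ttest_rel

pre  = [6, 7, 5, 8, 6, 7, 9, 5]   # invented pre-treatment scores
post = [4, 6, 5, 6, 5, 5, 7, 4]   # the same subjects after treatment

# Ordinal paired data -> Wilcoxon matched-pairs, signed-ranks test
w_stat, p_ordinal = wilcoxon(pre, post)

# Interval/ratio paired data -> t test for related samples
t_stat, p_interval = ttest_rel(pre, post)

print(p_ordinal, p_interval)
```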

95
Multivariate Data Analysis
  • Interval or Ratio Data
  • Multiple Regression Analysis
  • Factorial Analysis of Variance
  • Analysis of Covariance
  • Factor Analysis
  • Discriminant Analysis
  • Canonical Correlation
  • Structural Equation Modeling
  • Time-Series Analysis

96
Types of Measurement Validity
  • Content-related Validity
  • Validity from Factor Analysis
  • Validity from Contrasting Groups
  • Validity from Examining Convergence
  • Validity from Examining Divergence
  • Validity from Discriminant Analysis
  • Successive Verification of Validity

97
Physiological Measures Reliability and Validity
  • Accuracy
  • measurement that has the most precise identifiers
    for the level of measurement sought
  • Selectivity
  • the ability to identify only what is actually intended to be measured; sometimes called specificity
  • Precision
  • the amount of reproducibility in measurement
  • Sensitivity
  • The amount of a changed parameter that can be
    detected
  • Sources of Error

98
Working with Descriptive Data: A Toolkit for Health Care Professionals
  • Correlational Descriptive
  • Predictive Descriptive
  • Model Testing Descriptive

99
Overview
  • Background
  • Types and Uses of Tools
  • Examples
  • Questions/Answers

100
Statistics vs. Tools
  • Inferential Statistic Analysis
  • Statistics (regression, correlation, etc.)
  • Descriptive Statistic Analysis
  • Tools to display information

101
Tools for Planning
  • Critical Paths
  • Force Field Analysis
  • Descriptive Methods

102
Critical Path Process
  1. Select the process
  2. Define the process
  3. Form a team
  4. Create the critical path
  5. Make the path a working document

103
Critical Pathway for Complaints of Chest Pain in
ED
104
Force Field Analysis
  • Driving Forces (support efforts)
  • Restraining Forces (conflict with efforts)

105
Force Field Analysis
Driving Issues for Moving Minimum Grade at DSN
From 72 to 74
  • Driving Forces (support efforts)
  • Comparable to other schools
  • Recent drop in NCLEX rates
  • Faculty requests
  • Restraining Forces (conflict with efforts)
  • Significant change in policy
  • More students would fail
  • DSN had 90-94% NCLEX pass rates with the 72 minimum

106
Descriptive Statistics as Selection Grids (also
called Prioritization Matrices)
  1. Start with list of options
  2. Choose criteria scoring system
  3. Draw the grid
  4. Judge each option against the criteria and write in the scores
  5. Use completed grid to evaluate findings
  6. Determine whether new criteria are necessary
  7. Select the best option
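A minimal sketch of a selection grid: invented options scored 1-5 against weighted criteria, with the weighted totals used to judge the best option. The options, criteria, and weights are placeholders.

```python
# Invented options and criteria; scores run from 1 (poor) to 5 (excellent)
criteria_weights = {"cost": 0.3, "feasibility": 0.3, "patient_impact": 0.4}
scores = {
    "new staffing model": {"cost": 2, "feasibility": 4, "patient_impact": 5},
    "education program":  {"cost": 4, "feasibility": 5, "patient_impact": 3},
    "equipment purchase": {"cost": 1, "feasibility": 3, "patient_impact": 4},
}

# Weighted total for each option = sum of (criterion weight x score)
totals = {
    option: sum(criteria_weights[c] * s for c, s in row.items())
    for option, row in scores.items()
}
best = max(totals, key=totals.get)
print(totals, "-> best option:", best)
```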

107
Indicators
  • Quantitative measures
  • Related to one or more dimensions of performance
  • Help provide data that (when analyzed) give
    information about quality
  • Direct attention to potential problems

108
Types of Indicators
  • Sentinel-event indicators
  • Serious injury or death indicator
  • Aggregate-data indicators
  • Rating for med errors and patient complaints
  • Continuous-variable indicators
  • Number of new bed sores per day
  • Rate-based indicators
  • Infections per 1000 patient days
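A minimal worked example of a rate-based indicator with invented monthly counts: the infection count is divided by patient days and scaled to a per-1,000 rate.

```python
# Invented monthly data for one unit
infections   = 4
patient_days = 1320

rate_per_1000 = infections / patient_days * 1000   # rate-based indicator
print(f"{rate_per_1000:.2f} infections per 1,000 patient days")   # about 3.03 here
```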

109
Run Charts
  • Probably most familiar/used tool
  • Used to identify trends/patterns in a process
    over time
  • Helps track if target level has been
    attained/maintained
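A minimal sketch of a run chart with matplotlib, plotting invented quarterly bed-sore counts against an invented target line.

```python
import matplotlib.pyplot as plt

quarters = ["Q1", "Q2", "Q3", "Q4"]
new_bed_sores = [14, 11, 9, 7]          # invented quarterly counts for a single unit
target = 8                              # invented target level

plt.plot(quarters, new_bed_sores, marker="o", label="Unit X")   # trend over time
plt.axhline(target, linestyle="--", label="target")             # has the target been attained?
plt.ylabel("New bed sores per quarter")
plt.title("Run chart: new bed sores, Unit X")
plt.legend()
plt.show()
```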

110
Run Chart Trend Chart Used for Self Comparison
Quarterly report of new bed sores for Unit X 2008
111
Comparison Run Charts Trend Charts-(Dangerous
because these are not ratio numbers)
Quarterly report of new bed sores for Units A,
B, X for 2008
112
Histograms
  • Bar charts that display
  • Patterns of variation
  • The way measurement data are distributed
  • Snapshot in time
  • May be more complex to establish consult
    statistics textbook if needed

113
Comparison Run Charts Trend Charts-(Dangerous
because these are not ratio numbers)
Quarterly report of new bed sores for Units A,
B, X for 2008
114
Comparison Run Charts Trend Charts for Delta
Hospital
Quarterly report of new bed sores per 1000
patient days for Units A, B, X for 2008.
115
Control Chart
This is the control chart for infections from I.V.s on Unit X, with 3 cases per 1,000 patient days as the standard for 2008.
(Chart: monthly rates for Jan-Dec plotted against the 0.003 standard line, between a 0.005 maximum and a 0.000 minimum.)
116
Pie Charts
  • Descriptive data
  • Shows a distribution by category
  • Compared to the Whole

117
Pie chart: distribution of new bed sores for hospitalized patients at Delta Hospital.
A total of 140 new bed sores were reported in 2008 (segment values visible: 36, 43, 37).
118
Scatter Diagrams
  • Graphs that show statistical correlation between
    2 variables
  • Used when group wants to
  • Test a theory
  • Analyze raw data
  • Monitor an action taken
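A minimal sketch of a scatter diagram and its correlation coefficient, using invented minimum-passing-grade and NCLEX pass-rate pairs (placeholders, not DSN data).

```python
import matplotlib.pyplot as plt
from scipy.stats import pearsonr

min_grade  = [72, 72, 74, 74, 76, 76]     # invented minimum program passing grades
nclex_rate = [88, 90, 91, 93, 94, 96]     # invented NCLEX pass rates (%)

r, p = pearsonr(min_grade, nclex_rate)    # statistical correlation between the 2 variables
plt.scatter(min_grade, nclex_rate)
plt.xlabel("Minimum program passing grade")
plt.ylabel("NCLEX pass rate (%)")
plt.title(f"Scatter diagram (r = {r:.2f}, p = {p:.3f})")
plt.show()
```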

119
Scatter Diagram Process
(Chart: minimum program passing grade (72, 74, 76) plotted against NCLEX pass rates up to 100.)
120
Surveys
Surveys carry risks of their own. Also know which Likert scale you are using and why (1-4, 1-5, and 1-10 are the most common).
121
Naturalistic Inquiry: Qualitative Research Methods
122
Observational Measurement
  • Unstructured
  • Structured
  • Category Systems
  • Checklists
  • Rating Scales
  • Emic (from within)
  • Etic (from external view point)

123
Phenomenology Research The Lived Experience
  • Phenomenology is a science whose purpose is to
    describe the appearance of things as a lived
    experience.
  • It allows nursing to interpret the nature of
    consciousness in the world.
  • It can be descriptive or interpretive
    (hermeneutic).
  • It is a philosophy, a method, and an inductive logic strategy

124
Different Types of Phenomenology
  • Phenomenology of Essences
  • Experiment with relationships
  • Coding by categories
  • Using free imaginative variation
  • Phenomenology of Appearances
  • Focuses on the phenomenon as it unfolds and takes shape
  • Sense of dynamic adventure with the world
  • Reductive Phenomenology
  • Is a constant work of the self related to bias,
    etc
  • Interpretive (Hermeneutic) Phenomenology
  • To interpret the phenomena being observed

125
Design Characteristics
  • Purposive samples of 7-20 usually going for
    saturation.
  • Instrument is the researcher
  • Data collection is by interviews of groups or individuals, taped and transcribed verbatim, along with field notes.
  • Data collection is directly tied to analysis, which eventually is coded or structured into themes.

126
Five Steps of the Method
  • Shared Experience is presented
  • Transform the lived experience into an experience
    the subject would agree with
  • Code the data
  • Put it into written form and create confirmation
    of the data texts.
  • Create a complete integration of all of these for
    a research document

127
The Rigor of Trustworthiness
  • Trustworthy questions
  • Trustworthy approach
  • Trustworthy in analysis
  • Trustworthy and authenticity of data

128
Other Research Rigors
  • Descriptive Vividness
  • Methodological Congruence
  • Theoretical Connectedness
  • Analytical Preciseness
  • Heuristic Relevance

129
Defining Naturalistic Rigor Standards 1 and 2
  • Descriptive vividness
  • narratives are texturized, thick, and full of
    details
  • the writer shows connections and level of
    membership
  • Methodological congruence
  • details of exactly how the data is gathered with
    ethical rigor. Does the method match the design?

130
Defining Naturalistic Rigor Standards 3, 4 and 5
  • Analytical preciseness
  • the data is transformed across several levels of
    abstraction
  • moving raw data to clusters, interpretations, or
    theory
  • Theoretical connectedness
  • ensuring the theoretical schema is clear and
    related to the data being collected and a lens
    for analysis
  • Heuristic relevance
  • readers must recognize the phenomenon as
    applicable, meaningful, recognizable

131
Unique Features of Phenomenology
  • Most of the literature review is conducted at the
    end of the data collection. It is believed the CF
    biases the data collection and analysis.
  • Like Grounded Theory but without a BSP or bias
    already in mind.
  • It is conducted by gathering interview data from
    others.
  • It is never quantitative, but some would prefer
    to try and keep it objective.

132
Ethnography Research
  • Defined as "learning from people" (Spradley)

133
Four Types of Ethnography
  • Classical
  • Years in the field, constantly observing and making sense of actions. Includes description and behavior. Attempts to describe everything about the culture.
  • Systematic
  • Defines the structure of a culture.
  • Interpretive (hermeneutic)
  • To study the culture through inference and
    analysis looking for why behaviors exist.
  • Critical
  • Relies on critical theory. Power differentials,
    who gains and who loses, what supports the status
    quo.

134
Historical Roots
  • Early 1900s had several introductions
  • Herodotus wrote about travel in Persia
  • Malinowski's study of the Trobriand Islanders
  • Hans Staden wrote about his captivity among the wild tribes of Eastern Brazil
  • The School of Sociology in Chicago, where the city was a laboratory with all the immigrants (dancers, muggers, case studies)

135
Observation Methods
  • Emic
  • From within the research itself as a member or
    participant of some type.
  • Etic
  • From the outside looking in like a camera. It can
    be a peripheral issue or external observer
    member.

136
Fundamental Constructs
  • Is usually etic on the outside like a camera
  • Sometimes they are emic, on the inside as one
    of the actors (more in sociology)
  • Researcher is the instrument
  • Fieldwork is where the work occurs
  • Focus is on culture
  • Involves cultural immersion
  • There is a tension and reflexivity between the
    researcher as a member or researcher as researcher

137
Stages of Ethnography
  • Participant observation (gain access, rapport,
    trust)
  • Descriptive observation (9 dimensions: space, actors, activities, objects, acts, events, time, goals, and feelings)
  • Ethnographic record (field notes, verbatim, old
    records, amalgamate the information)
  • Domain analysis
  • Focused observation (what is now critical)

138
Stages in Ethnography-2
  • Taxonomic analyzing (categorize)
  • Componential analysis (components of the selected
    areas)
  • Discover cultural themes
  • Take a cultural inventory
  • Write up the ethnography

139
Rigors for Ethnography
  • Plausibility
  • It is very easy to accept as truth
  • Credibility
  • Not exactly self evident, so you look at sources
    of evidence
  • Thick Description
  • Writing in such detail as to know exactly what is
    going on.
  • We could use the Five Standards also

140
Sources of Errors
  • Personal reactivity
  • False inferences
  • Gaps in writing, remembering, and interpreting

141
Grounded Theory Research
  • Started by Glaser and Strauss in 1967
  • Used extensively in nursing research
  • Takes into account the concepts of George Herbert Mead (1934) regarding symbolic interaction theory: how we give meaning to situations, words, objects, and symbols
  • Is very individualistic in meaning
  • Most often used to study areas in which little previous research exists

142
Steps in Grounded Theory are conducted
simultaneously
  • Observation
  • Collection of data
  • Organization of data
  • Review of additional literature
  • Forming theory from the data
  • Using Constant Comparative Analysis

143
Data Collection Methods Have qualitative and
quantitative properties
  • Interviews (one on one, groups)
  • Observation
  • Records (retrospective analysis)
  • Surveys (quantitative)
  • Questionnaires (could be quantitative)
  • Demographic data

144
Constructs of Grounded Theory
  • Conceptual framework comes from the data rather
    than the literature review
  • There is always an overriding social issue being addressed, called the Basic Social Process (BSP)
  • The researcher focuses on dominant processes rather than describing the setting or unit
  • You compare all data with all other data

145
Constructs of Grounded Theory
  • You may change data collection methods in mid
    stream to be more appropriate to what has already
    been discovered
  • The researcher carries out most of the normally sequential tasks all at the same time

146
Constant Comparative Analysis
  • Get data, look at it, look at the literature,
    look at previous data, go get more data, look at
    more literature, look at all the data, etc.
  • Revise the question, collection method, and keep
    collecting data, look at literature, compare to
    old data, etc.

147
Sampling Methods
  • Called Theoretical Sampling
  • Based on the current question
  • Add new groups to the sample based on what it is
    you have learned (may need more men in the
    sample, or more people over the age of 70, etc.)
  • The sample being used moves as the theory
    develops

148
Theoretical Sampling
(Diagram: Purposive Sampling (Non-Randomized), Theoretical Sampling, Convenience Sampling)
149
Coding the data
  • Look for positive AND negative cases related to
    your social process
  • Step One: read, describe, and interpret
  • Step Two: constant comparison and clustering
  • Step Three: reduce it to a BSP

150
Conducting Grounded Theory
  • Be aware of the social life of the participants
  • Make fewer assumptions in the beginning
  • Sensitize to the literature; bracket if needed
  • Layers of reality are explored, assess your own
    energy to go further
  • Spend enough time with participants and data
  • Be observant to how the participants are doing
  • Learn the symbols being used to create this
    reality
  • Sample across time

151
Case Studies from Stake (2000) and Yin (1994)
  • These are OBJECT or METHOD issues
  • Object Has to do with what you want to study not
    an approach to how to study it
  • Method Can be quantitative or qualitative method
    (analytically, vs. holistically)
  • Questions are aimed at "how" or "why" (rarely "what")
  • Single or multiple cases, usually 1 or 2

152
Case Studies-Advanced (Stake, 2000)
  • Intrinsic Study
  • Single case with in depth and complete
    understanding
  • Instrumental Study
  • Single case, but exploring various factors of
    this case to relate them to theory or to a
    question-more a breadth issue
  • Collective Study (Complex)
  • Examining several studies as either comparison or
    progressive support for theory or a premise

153
Case Studies-Examples (Stake, 2000)
  • Intrinsic Study
  • The Education of Henry Adams: An Autobiography
  • The Swedish School System
  • Instrumental Study
  • Campus Response to a Student Gunman
  • A Nonreader Becomes a Reader: A Case Study of Literacy
  • Collective Study (Complex)
  • Teachers' Work
  • The Dark Side of Organizations: Mistake, Misconduct, and Disaster

154
Purpose of Case Studies
  • Seeks the unique features (particular) while also
    describing the common by describing
  • The nature of the case
  • The case's history and background
  • The physical setting
  • Other contexts (economics, political, legal,
    aesthetic issues)
  • Other cases through which this case is recognized
  • Through the informants by which the case is known
  • Examine changes across time (multiple cases)
  • Same group or a different group

155
Case Study Rigor
  • Yin (1994) treats this as a positivistic
    activity, therefore
  • Construct, Internal, and external validity
  • Reliability
  • This is not just a pilot study for quasi- or full
    experimental designs. It is different.
  • Stake (2000) treats it more naturalistic
  • Thick description is key
  • Auditability (can it be followed by the reader)

156
Observational Measurement
  • Unstructured
  • Structured
  • Category Systems
  • Checklists
  • Rating Scales
  • Emic (from within)
  • Etic (from external view point)

157
Interviews
  • Unstructured
  • Structured
  • Describing interview questions
  • Pretesting the interview protocol
  • Training interviewers
  • Preparing for an interview
  • Probing
  • Recording interview data
  • Coding methods

158
Triangulation and Blended Designs
  • First used by Campbell and Fiske in 1959.
  • Denzin in 1989 identified four different types.
  • Data Triangulation
  • Investigator triangulation
  • Theoretical triangulation
  • Methodological Triangulation
  • Kimchi, Polivka, and Stevenson (1991) have
    suggested a fifth type
  • Multiple Triangulation

159
Data Triangulation
  • Collection of data from multiple sources
  • Intent is to obtain diverse views of the same
    phenomenon. (Longitudinal is different and is
    looking for change)
  • Validate data by seeing if it occurs from
    different sources

160
Investigator Triangulation
  • Two or more investigators with different research
    backgrounds examining the same phenomenon
  • Clarifies disciplinary bias
  • Adds to validity of data

161
Theoretical Triangulation
  • Using all the theoretical interpretations that
    could conceivably be applied to a given area
  • Each view is critically examined for utility and
    power
  • Increases confidence in the hypothesis
  • Can lead to even greater T. F. beliefs

162
Methodological Triangulation
  • The use of two or more research methods in a
    single study
  • Design level
  • Data collection level
  • Two major types
  • Within-method (all are one philosophy)
  • Across-method (across philosophies)

163
Pros and Cons of Triangulation
  • Very trendy in the 90s
  • Can be used with smaller N
  • Combined methods may just be the rise of a new
    method
  • There are philosophical risks
  • Complex designs and therefore complex analysis

164
Action Research
  • A systematic investigation conducted by
    practitioners involving the use of scientific
    techniques in order to improve their performance.

165
"The beautiful thing about learning is that nobody can take it away from you." -- B.B. King, US blues musician