Setting Pass Points - PowerPoint PPT Presentation

Transcript and Presenter's Notes



1
Setting Pass Points
  • The Puzzle and the Practice

Presented by Shelley Langan, Manager, Assessment
Services
2
Session Overview
  • This seminar will focus on the value and
    importance of setting defensible, job-related
    pass points on employment tests

3
Session Objectives
  • Provide an understanding of the major issues and
    trends to consider when setting and using pass
    points
  • The significance of pass points in employment
    testing
  • Legal and professional standards for
    setting pass points
  • Recognized methods of setting pass points

4
Session Objectives (cont.)
  • Objectives continued
  • The use of test statistics in setting pass
    points
  • The differences between norm-referenced and
    criterion-referenced pass points
  • Determining and using MAC (minimal acceptable
    competence) levels as the basis for pass
    points

5
Significance of Pass Points
  • Pass points are versatile
  • Are thresholds and points of selection
  • Can signify competency
  • Can identify who is successful and who continues
    in the testing process
  • Can help in determining who is considered for
    selection/appointment
  • Can strengthen and add validity to the testing
    process

6
Significance of Pass Points (cont.)
  • Questions -
  • Is it important how pass points are set?
  • Will our pass points ever come under scrutiny?
  • Could we defend our pass points?
  • How?
  • On what basis?

7
Significance of Pass Points (cont.)
  • Answers -
  • Is it important how pass points are set?
  • Yes, yes, yes! And why?
  • Will our pass points ever come under scrutiny?
  • Maybe. Do we know in advance?
  • Do we know, when we're setting them, that they
    will/may be questioned?
  • Could we defend our pass points?
  • To be defensible, pass points should be
    job-related and based on a rationale of job
    competency

8
Legal and Professional Considerations
  • Professional standards
  • Trends of the courts

9
Professional Standards
  • Uniform Guidelines on Employee Selection
    Procedures
  • The Standards for Educational and Psychological
    Testing
  • The Principles for the Validation and Use of
    Personnel Selection Procedures

10
Professional Standards (cont.)
  • When cutoff scores are used, they should be set
    so as to be reasonable and consistent with normal
    expectations of acceptable proficiency within the
    work force
  • The process by which cut scores are determined
    should be clearly documented and defensible
  • There is no single method for establishing
    cutoff scores

11
Professional Standards (cont.)
  • Cutoff scores may be set as high or as low as
    needed to meet requirements of the
    organization
  • Cutoff scores should be based on professional
    judgment and a rationale to meet organizational
    need(s)
  • If use of cutoff score eliminates candidates,
    the rationale for the cutoff score should be
    documented

12
Legal and Professional Considerations
  • Professional standards
  • Trends of the courts

13
Trends of the Courts
  • Board of Regents of the University of the State
    of New York v. Tomanio (1980)
  • Examinations are a permissible method of
    determining qualifications, and lines must be
    drawn somewhere.
  • Justice Stevens

14
Trends of the Courts (cont.)
  • Tendency to uphold pass point if selection
    procedure is valid
  • Have acknowledged the rationale that pass points
    be based on job requirements rather than
    incumbent performance
  • Have accepted pass points set higher than
    incumbent performance levels if difference
    based on requirements of the job

15
Trends of the Courts (cont.)
  • Have established that pass points should
    differentiate between those who can do the job
    and those who cannot
  • Have noted that arbitrarily low pass points
    tend to destroy the credibility of the testing
    process

16
Methods of Setting Pass Points
  • Norm-referenced methods
  • Criterion-referenced methods

17
Methods of Setting Pass Points
  • Norm-referenced methods
  • Based on candidate group performance on the
    test and test statistics
  • May or may not be indicative of competency
  • Consider
  • Test statistics
  • Size of candidate group
  • Number of vacancies
  • Adverse impact

18
Methods of Setting Pass Points (cont.)
  • Criterion-referenced methods
  • Based on an established standard
  • Use a MAC level as the starting point in
    conjunction with test statistics and selection
    need
  • Consider
  • Size of candidate group
  • Number of vacancies
  • Adverse Impact
  • Typically related to job proficiency and
    qualification requirements

19
MAC Level
  • MAC (Minimal Acceptable Competence) represents,
    or is intended to represent, the level of
    performance on the test/selection procedure
    indicative of minimal competency
  • Bare minimum: the bottom of the qualified barrel
  • Not best or most qualified

20
MAC Level for Written Tests
  • Widely recognized methods for establishing
    MAC levels
  • Angoff (the most widely used method)
  • Modified Angoff
  • Nedelsky
  • Ebel
  • Jaeger
  • Bookmark

21
MAC/Pass Point Setting Methods
  • Nedelsky Method
  • Obvious distracters eliminated
  • Results in modified chance score for each item
  • Ebel Method
  • Items categorized by relevancy and difficulty
  • Item categories evaluated based on probable MAC
    candidate performance
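The Nedelsky method above can be sketched numerically: once SMEs eliminate the distractors a minimally competent candidate would recognize as wrong, each item's modified chance score is one divided by the number of options remaining, and the MAC level is the sum across items. The per-item option counts below are hypothetical illustration values.

```python
# Nedelsky method sketch: per-item counts of answer options remaining
# after SMEs eliminate the obvious distractors (hypothetical values).
options_remaining = [2, 3, 2, 4, 2]

# Modified chance score for each item: 1 / (options remaining)
item_chance_scores = [1 / n for n in options_remaining]

# MAC level in raw points is the sum of the item chance scores
mac_level = sum(item_chance_scores)
print(f"MAC level: {mac_level:.2f} raw points")
```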

22
MAC/Pass Point Setting Methods (cont.)
  • Jaeger Method
  • SME judgments combined with actual SME test
    performance data
  • Bookmark Method
  • SME judgments combined with actual candidate
    test performance data

23
Pass Point Setting Methods (cont.)
  • Angoff Method
  • Item performance probability determined
  • Item probabilities summed

24
Angoff Method
  • View 1

25
Angoff Method
  • View 2: Indicate item probabilities

26
Angoff Method
  • View 3: Sum the item probabilities

Result: expected performance level = 3.51 (raw
pts.)
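The Angoff computation in Views 1-3 can be sketched as follows. The per-item probabilities are hypothetical SME estimates, chosen here so the sum reproduces the 3.51 raw-point result above.

```python
# Angoff method sketch: each probability is an SME's estimate that a
# minimally competent (MAC) candidate answers that item correctly.
# These five values are hypothetical illustration data.
item_probabilities = [0.90, 0.75, 0.60, 0.70, 0.56]

# The MAC level in raw points is simply the sum of the item probabilities
mac_raw_points = sum(item_probabilities)
print(f"Expected performance level: {mac_raw_points:.2f} raw points")
```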
27
Pass Point Setting Methods (cont.)
  • Modified Angoff Method
  • Item necessity and difficulty levels determined
  • Item performance probability determined
  • Results calculated
  • ----------
  • Combination of the Angoff and Ebel methods

28
Modified Angoff Method
  • View 1: Item Ratings
  • Item Necessity Rating
  • To what extent is the behavior (the knowledge,
    skill, or ability) measured by this item
    necessary for job performance?
  • 3 = Essential or Critical
  • 2 = Important
  • 1 = Useful
  • 0 = Not Necessary

29
Modified Angoff Method
  • View 1a: Item Ratings
  • Item Difficulty Rating
  • How difficult is this item compared to the
    difficulty of the behavior required on the job?
  • 5 = Considerably Harder
  • 4 = Slightly Harder
  • 3 = Appropriate
  • 2 = Slightly Easier
  • 1 = Considerably Easier

30
Modified Angoff Method
  • View 2: SME Rating Process
  • Step 1: Rate the item's necessity
  • Step 2: Rate the item's difficulty
  • Step 3: Determine the probability of success on
    the item by the MAC candidate
  • Step 4: Move on to the next item and
    determine ratings

31
Modified Angoff Method
  • View 3: Rating the Items

32
Modified Angoff Method
  • View 4: SME Rating Data

33
Modified Angoff Method
  • View 5: Computing the Results
  • Calculate the ND values; enter in ND column
  • Multiply ND value by probability; enter
    in MAC column
  • Sum ND column (41)
  • Sum MAC column (28.30)
  • Divide MAC sum by ND sum (28.30 ÷ 41 = 69.02%)

Result: expected performance level = 69.02%
34
Modified Angoff Method
  • View 6: Expressing MAC in Raw Points
  • Multiply the MAC level (% value) by the number of
    items in the test
  • 69.02% × 5 items = 3.45 points
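The calculations in Views 5 and 6 can be sketched as follows. The ND values and MAC-candidate probabilities below are hypothetical illustration data, chosen so the totals reproduce the 41, 28.30, 69.02%, and 3.45-point figures above.

```python
# Modified Angoff sketch. Per-item ND values (from the necessity and
# difficulty ratings) and per-item MAC-candidate success probabilities
# are hypothetical illustration values for a 5-item test.
nd_values = [9, 8, 10, 6, 8]
probabilities = [0.80, 0.70, 0.65, 0.60, 0.675]

# MAC column: each ND value multiplied by its item's probability
mac_column = [nd * p for nd, p in zip(nd_values, probabilities)]

nd_sum = sum(nd_values)              # 41
mac_sum = sum(mac_column)            # 28.30
mac_level = mac_sum / nd_sum * 100   # 69.02 (percentage)

# Express the MAC level in raw points for the 5-item test
raw_points = mac_level / 100 * len(nd_values)  # 3.45 points
print(f"MAC level: {mac_level:.2f}% ({raw_points:.2f} raw points)")
```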

35
Modified Angoff Method
  • Exercise
  • Calculating an SME's MAC data

36
Collecting MAC Data
  • Identify SMEs
  • Number
  • Level
  • Diversity
  • Define MAC and a MAC candidate
  • Explain/discuss MAC method to be used
  • Allow SMEs to provide ratings
  • Calculate each SME's results
  • Analyze data
  • Establish MAC level

37
Collecting MAC Data (cont.)
  • Hints and Tips
  • Can collect data before and/or after test
    administration
  • Can collect in a group setting or
    individually from SMEs
  • Can use process to identify items to discard as
    final step in test construction process
  • Treat each test item as independent variable in
    process

38
Analyzing MAC Data
  • Compute each SME's results
  • Calculate group mean
  • Check for outliers (high/low ratings)
  • Evaluate SME subgroup data (incumbents vs.
    supervisors)
  • Establish final result: the MAC level!
  • ----------
  • Let's do this
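The analysis steps above (group mean, outlier check) can be sketched as follows. The per-SME MAC levels and the 1.5-standard-deviation outlier threshold are hypothetical illustration choices; in practice, the threshold for flagging an SME's ratings for review is a judgment call.

```python
# Sketch of analyzing MAC data across SMEs (hypothetical percentages).
from statistics import mean, stdev

sme_mac_levels = [69.0, 72.5, 66.0, 70.5, 85.0, 68.5]

group_mean = mean(sme_mac_levels)
group_sd = stdev(sme_mac_levels)

# Flag SMEs whose MAC level falls more than 1.5 standard deviations
# from the group mean for closer review (threshold is a judgment call)
outliers = [x for x in sme_mac_levels
            if abs(x - group_mean) > 1.5 * group_sd]

print(f"Group mean: {group_mean:.2f}%, SD: {group_sd:.2f}")
print(f"Ratings to review: {outliers}")
```

With these illustration values, the high rating of 85.0 would be flagged for review before the group result is finalized.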

39
Modified Angoff Method
  • Exercise
  • Calculating a MAC level based on multiple SMEs'
    MAC data

40
Pass Point Setting Methodology
  • Norm-referenced
  • Criterion-referenced
  • Content-based
  • Utilize MAC level
  • Based on SME judgments

41
Setting a Pass Point
  • Norm-referenced
  • Based on candidate group performance on the
    test
  • With little (or no) knowledge of the caliber
    (i.e., job-related qualifications) of the
    candidates
  • Not based on a rationale of expected test
    performance
  • Will not necessarily result in a pass point that
    represents the threshold between those candidates
    who can do the job and those who cannot

42
Setting a Pass Point
  • Criterion-referenced
  • Based on MAC level
  • Based on a rationale of expected test
    performance
  • Designed to result in a pass point that
    represents the threshold between those candidates
    who can do the job and those who cannot

43
Setting a Pass Point
  • Criterion-referenced (cont.)
  • Start at the MAC level in the score distribution
  • Consider test statistics
  • Validity evidence/documentation for test
  • Reliability
  • Standard error of measurement
  • Mean Score
  • Standard deviation
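One common way the statistics above enter the decision can be sketched numerically: compute the standard error of measurement (SEM) from the test's reliability and standard deviation, then consider lowering the pass point from the MAC level by one SEM so that candidates whose true score is at MAC are not failed by chance error alone. All numbers below are hypothetical, and the one-SEM adjustment is one judgment call among several, not a required rule.

```python
# Sketch: adjusting a criterion-referenced pass point for measurement
# error. Reliability, SD, and MAC score are hypothetical test statistics.
import math

mac_raw_score = 34.5   # MAC level expressed in raw points
reliability = 0.88     # e.g., KR-20 or coefficient alpha
sd = 6.0               # standard deviation of test scores

# Standard error of measurement: SEM = SD * sqrt(1 - reliability)
sem = sd * math.sqrt(1 - reliability)

# One common judgment: set the pass point one SEM below the MAC level
pass_point = round(mac_raw_score - sem)
print(f"SEM: {sem:.2f}, pass point: {pass_point}")
```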

44
Setting a Pass Point
  • Criterion-referenced (cont.)
  • Other considerations
  • Size of candidate group
  • Number of vacancies
  • Adverse impact
  • Previous pass points (from prior administrations
    of the test)
  • Determine best/most rational point to set as the
    pass point

45
Exercises
  • Let's set some pass points

46
Pre-Defined Pass Points
  • Result from the use of anchored rating scales
  • Usually Likert-type rating scales
  • 5-point scale
  • 7-point scale
  • 9-point scale
  • Scales anchored on the basis of demonstrated
    competency
  • Well Qualified / Superior / Excellent
  • Qualified / Sufficient / Acceptable
  • Not Qualified / Unacceptable / Not Ready
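A pre-defined pass point on an anchored scale can be sketched as follows. The anchor labels come from the list above; the numeric scale values, the choice of 3 as the passing anchor, and the candidate ratings are hypothetical illustration data.

```python
# Pre-defined pass point sketch for a 5-point anchored rating scale.
# Anchor labels from the slide; scale values and ratings are hypothetical.
scale_anchors = {
    5: "Well Qualified / Superior / Excellent",
    3: "Qualified / Sufficient / Acceptable",
    1: "Not Qualified / Unacceptable / Not Ready",
}

# The pass point is pre-defined at the anchor representing minimal
# acceptable competence, not derived from candidate performance
pass_point = 3
candidate_ratings = [4, 2, 5, 3, 1]

passers = [r for r in candidate_ratings if r >= pass_point]
print(f"Ratings at or above the pre-defined pass point: {passers}")
```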

47
Pre-Defined Pass Points (cont.)
  • Usually stand as defined
  • Can be adjusted a point or two if circumstances
    warrant
  • Should not be adjusted without a rationale for
    doing so
  • Should maintain integrity of MAC definition
    (i.e., the pre-defined passing threshold)

48
Other Types of Pass Points
  • Used with point-method scoring
  • As with performance tests or experience
    ratings
  • Work with SMEs to determine MAC equivalent
    point total
  • Use as a starting point for setting pass point
  • Consider test stats., selection need, number of
    candidates, type of test instrument, adverse
    impact