1
ISO PC230
ISO/PC230 2007-05-30 N 0022
  • London Meeting 25/5/07
  • Marise Born, Anders Sjøberg, Lutz Hornke, Dave
    Bartram,

2
We invite you to
  • Review and comment on the initial set of slides
    (4 through 15). Use the Add Notes section on
    each slide to add your comments.
  • The remaining slides, the normative section,
    are also reproduced in an Excel sheet. Please add
    comments regarding each point to the Excel sheet
    and also provide cross-references to relevant
    standards or guidelines.
  • Please provide a short ID code for the standards
    you are cross-referencing, e.g. DIN33 for DIN
    33430, EFTU for EFPA Test User Guidelines.
  • Finally, please review the Word document
    containing the references. Add references to any
    relevant guidelines or standards that are missing
    and make corrections where required to any
    incorrect references. Please use Track Changes.

3
Scope agreed in Berlin 9/3/07 with suggested
clarification in red
  • Title: Procedures and methods to assess people in
    work and organizational settings
  • Scope: This standard contains requirements and
    recommendations for procedures and methods used
    to assess people in work and organizational
    settings.
  • They refer to
  • the selection, integration, implementation and
    evaluation of assessment procedures and methods
  • the interpretation of the assessment results and
    subsequent judgment reports
  • the requirements of the qualification of all
    individuals taking an active part in the
    assessment process
  • fairness and ethical principles in the process
  • personnel decisions to be made such as
    recruitment, selection, development, succession
    planning and reassignment.

4
The initial set of slides (4 through 15) propose
an outline framework for the standards
5
Preamble: Two key issues
  • Who are the audiences for the standard?
  • Those involved in assessment of personnel in or
    for organizations were identified as the prime
    end users, together with others involved in the
    contracting of assessment services in work and
    organizational settings, both clients and
    contractors.
  • What is the function of the standard?
  • The standard should focus on service delivery and
    provide practical guidance for both clients and
    contractors regarding the nature and quality of
    the service the former should expect to receive.

6
Vision
  • A single set of standards for evidence-based
    assessment with global applicability that will
    enable
  • people to understand, develop and achieve their
    full potential at work.
  • organisations to become more effective through
    making best use of the potential of all their
    people.

7
Mission
  • To empower assessment users (internal and
    external contractors), those who are assessed
    (test takers, assessment centre candidates) and
    providers (developers, publishers) by
  • Increasing the quality of assessment in HR
  • Encouraging an evidence-based approach to
    assessment.
  • Encouraging the development and use of better
    assessment instruments
  • Ensuring equity and fairness in the use of
    assessment in and by organizations.
  • Enabling the stakeholders to realise the
    potential benefits of good assessment practice.

8
Stakeholders
  • The end users: people who make ultimate use of
    the information collected during the assessment.
    This can include those making hiring decisions,
    such as line managers.
  • The people being assessed
  • Intermediaries
  • Policy makers (HR, Unions, external policy etc)
  • Internal organizational contractors (e.g. HR
    department assessors)
  • External contractors (e.g. assessment
    consultancy)
  • Those given delegated authority to carry out some
    assessment role or function (e.g. a test
    administrator)
  • Distributors of assessment procedures
  • Developers of assessment procedures

9
The standards focus on assessment of personnel in
work and organizational settings
  • They cover all stages of the employment life
    cycle, which is variously described in terms of
  • Recruitment and selection; career/vocational
    guidance, mid-life career change, re-integration;
    personal development and coaching; promotion and
    succession planning; outplacement and retirement
    planning
  • Talent acquisition, talent management and talent
    engagement
  • They cover assessment from the viewpoint of the
    needs/wants of the organization and the
    needs/wants of the individual.

10
Coverage includes assessment at individual, group
and organizational levels.
  • Assessment levels
  • Individual assessments for selection,
    development, performance appraisal etc
  • Large scale assessment for guidance, recruitment
    and selection
  • Group assessments as in assessment centres and
    development centres
  • Individual assessment, as for interviews, career
    guidance, in-depth senior executive coaching or
    personal development.
  • Team assessments - e.g. for improving team
    climate and performance
  • Organizational level assessments e.g. for
    managing culture change in mergers and
    acquisitions, employee satisfaction survey
    studies etc.
  • It includes both the measurement of people and of
    the fit within and between levels, for actual
    and preferred characteristics of people,
    jobs/roles, teams and organizations.
  • The measurement of fit involves the measurement
    of the characteristics of roles, teams and
    organizations, either perceived or preferred, as
    well as the measurement of the characteristics of
    people.
  • Fit also involves issues of how to measure and
    represent the degree of fit or misfit (one
    possible approach is sketched below).
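
As a concrete illustration of the fit-measurement
issue above, here is a minimal Python sketch. It
assumes that the actual characteristics of a person
and the preferred characteristics of a role are
rated on the same numeric dimensions; the indices
shown (mean absolute difference for level fit,
profile correlation for shape fit) are illustrative
choices, not methods prescribed by ISO/PC230, and
the example scores are invented.

# Minimal sketch of one way to quantify person-role fit, assuming the
# person's actual characteristics and the role's preferred characteristics
# are rated on the same numeric dimensions (e.g. 1-5 scales).
# The index choices are illustrative, not prescribed by ISO/PC230.
from statistics import mean

def fit_indices(actual, preferred):
    """Return simple level- and shape-based fit indices for two profiles."""
    if len(actual) != len(preferred) or not actual:
        raise ValueError("profiles must be non-empty and of equal length")
    # Level fit: mean absolute difference across dimensions (0 = perfect fit).
    level_misfit = mean(abs(a - p) for a, p in zip(actual, preferred))
    # Shape fit: Pearson correlation between the two profiles.
    ma, mp = mean(actual), mean(preferred)
    cov = sum((a - ma) * (p - mp) for a, p in zip(actual, preferred))
    var_a = sum((a - ma) ** 2 for a in actual)
    var_p = sum((p - mp) ** 2 for p in preferred)
    shape_fit = cov / (var_a * var_p) ** 0.5 if var_a and var_p else float("nan")
    return {"mean_abs_diff": level_misfit, "profile_correlation": shape_fit}

# Example: a candidate's actual scores vs. a role's preferred profile.
print(fit_indices(actual=[4, 3, 5, 2], preferred=[5, 3, 4, 2]))

In practice the dimensions, the rating scale and the
choice of fit index would need to be agreed as part
of the assessment specification.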

11
Assessment should be an evidence-based process.
  • This is a planned and structured process which
    has outcomes or consequences that can be measured
    and used to evaluate the outcome and the utility
    of the assessment.
  • Assessment could include measures of performance
    (varying from sales performance figures to
    ability test scores), self-report and
    other-report ratings and scaleable judgemental
    data (e.g. assessor or supervisor performance
    ratings etc).
  • It involves three stages
  • Pre-assessment contracting
  • Providing a clear rationale
  • Identifying what needs to be assessed and how,
    choosing the criteria for evaluating success, and
    having a clear expectation of the utility of the
    process (the organization needs to have
    understood what the possible outcomes are and
    what the costs/risks are)
  • Agreeing a contract
  • Assessment delivery, i.e. carrying out the
    assessments.
  • Post-assessment review and evaluation
  • Reviewing the delivery
  • Evaluating the outcome, the consequences and the
    utility of the assessment process.

12
Two key concepts here
  • Evidence-based
  • Focus on outcomes and consequences

13
What is an evidence base?
  • An evidence-based approach to assessment focuses
    on a broad view of validity as the key to quality
    of delivery.
  • The evidence base can be defined as the quality
    and relevance of the evidence (e.g. criterion or
    construct validity, case studies) supporting the
    use of an assessment procedure in a particular
    situation or for a particular use (a simple
    criterion-related validity computation is
    sketched after this list).
  • This evidence provides the basis for the
    inferences that can be made from the assessment
    data and for the decisions that can be supported
    by it.
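
To make the notion of a validity coefficient in the
evidence base concrete, here is a minimal Python
illustration. It assumes criterion-related validity
is summarised as the correlation between assessment
scores and a later job-performance criterion; the
data are invented, and a single coefficient is only
one element of an evidence base.

# Illustrative computation of a criterion-related validity coefficient:
# the correlation between assessment scores and a later performance
# criterion. All values below are invented purely for the example.
from statistics import correlation  # requires Python 3.10+

assessment_scores = [52, 61, 47, 70, 58, 65, 49, 74]        # e.g. ability test scores
job_performance = [3.1, 3.8, 2.9, 4.2, 3.5, 3.9, 3.0, 4.4]  # e.g. supervisor ratings

validity = correlation(assessment_scores, job_performance)
print(f"Estimated criterion-related validity: r = {validity:.2f}")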

14
Anticipating outcomes and consequences
  • Alternative possible outcomes and consequences of
    an assessment process should be considered in
    advance, and the risks and utilities associated
    with them.
  • Unwanted consequences or impacts, as well as the
    intended ones, should be considered, for example
    alienation of the workforce through the use of an
    assessment implemented for personal development.
  • Some information relevant to this analysis should
    be contained in assessment documentation, but
    applications in specific situations should be
    evaluated on a case-by-case basis.
  • The measures that are needed to evaluate the
    effectiveness of the assessment (stage 3) should
    be planned in advance.

15
Quality of assessment should be considered in
relation to six general criteria. These together
form the basis for equitable assessment.
  • Coverage of the range or scope of the
    assessments: the assessments should cover the
    scope of the characteristics that need to be
    assessed.
  • Relevance: measures used should be valid. They
    measure what they claim to measure and enable
    relevant inferences to be made.
  • Accuracy: measures should have known levels of
    precision (i.e. psychometric reliability; a
    reliability sketch follows this list).
  • Freedom from bias: they should not introduce
    irrelevant systematic sources of variance (e.g.
    impact of demographic differences in language
    skill on results of a personality assessment).
  • Acceptability: they should be seen as
    appropriate, fair and reasonable by those
    involved in their use.
  • Practicality: they should be fit for purpose in
    terms of cost, usability, etc.
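
As a concrete illustration of the accuracy
criterion, here is a minimal Python sketch of one
common precision estimate, Cronbach's alpha for
internal consistency. The formula is standard, but
the item responses are invented, and alpha is only
one of several reliability estimates an assessment
manual might report (test-retest and inter-rater
reliability are others).

# Sketch of one common reliability (precision) estimate: Cronbach's alpha
# for internal consistency. Rows are respondents, columns are items;
# the responses below are invented purely for illustration.
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: list of respondents, each a list of item responses."""
    k = len(item_scores[0])              # number of items
    columns = list(zip(*item_scores))    # per-item response columns
    sum_item_variances = sum(pvariance(col) for col in columns)
    total_score_variance = pvariance([sum(row) for row in item_scores])
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    return (k / (k - 1)) * (1 - sum_item_variances / total_score_variance)

responses = [
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
]
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")  # about 0.92 here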

16
The following slides outline the areas requiring
normative statements in the standards
17
Assessment shall involve three stages
  • Pre-assessment contracting
  • Assessment delivery
  • Post-assessment review and evaluation
  • Each stage should be considered in terms of the
    six quality criteria: scope, relevance, accuracy,
    freedom from bias, acceptability and
    practicality.

18
1a Pre-Assessment Contracting
  • The contracting process should be agreed between
    client and contractor
  • A needs analysis should be carried out covering
  • Identification of assessment needs including
    demographic issues (e.g. range of languages to
    be covered, levels of literacy or numeracy,
    ethnic mix, gender issues, areas of disability)
  • Identification of areas requiring post-contract
    requirements analysis. That is, where the details
    require significant contractor effort (e.g.
    carrying out a job analysis or competency
    modelling prior to assessment)
  • Specification of outcome criteria, to be jointly
    defined by the client and contractor, with
    criterion measurement procedures to be agreed.
  • Other stakeholder interests should be identified
    and relevant stakeholders consulted regarding
    issues of practicality and acceptability.
  • Ethical issues should be considered regarding
  • Competences of client and contractor staff
  • Conflicts of interest
  • Consent: defining informed consent parameters
  • Consequences: intended and unintended
  • Legal constraints and local restrictions on
    practice shall be considered, including
  • Data protection
  • Equal opportunity legislation

19
1b Pre-Assessment Contracting
  • Client and contractor shall reach agreement
    regarding
  • Deliverables
  • Time scales and milestones
  • Resources required on both client and contractor
    sides, including levels of expertise of personnel
  • Costs
  • Key person identification. Who has the particular
    skill sets that need to be used, and what will
    happen if these people are not available?
  • Escalation procedures. How will the client raise
    issues and what will they do if they are not
    satisfied with the answer?
  • Risk management. What are the risks in the
    project and what can be planned as mitigation?
  • The contractor should manage client expectations
    regarding conduct, delivery, reporting and
    outcomes
  • In terms of outcomes, benefits and utility
  • Restrictions on the usability of the data by the
    client for purposes other than those contracted
  • The contract shall include
  • A description of the assessment model: single
    stage, multi-stage, etc.
  • Who does what (client and contractor roles and
    responsibilities)
  • How data will be integrated and used

20
2a. Assessment delivery - planning
  • Assessment delivery should include the following
  • Create detailed assessment specification
    (detailed requirements analysis, which may
    include job or situation analysis, competency
    requirement profiling, examination of historical
    data etc)
  • The range of demographics in the
    applicant/candidate pool should be considered
  • Managing exceptions: procedures for dealing with
    people having language problems or disabilities
    that might affect assessment, and provision for
    managing technology breakdowns, etc.
  • The choice of assessment methods and procedures
    should be finalised and agreed with the client
  • Choosing appropriate instruments and procedures
  • Choosing appropriate personnel
  • People, materials and other logistics should be
    prepared
  • Resourcing in terms of people and materials
  • Ensuring competence of people and training of
    interviewers, assessors, raters, or
    administrators, as required.
  • Planning the assessment events and locations

21
2b. Assessment delivery - communication
  • Procedures should be communicated to people who
    are to be assessed, including
  • Support and grievance procedures
  • Gaining informed consent, the terms and
    conditions under which people are being assessed
    and the psychological contract between them and
    the client.
  • Exception handling: telling people what to do if
    they have language issues or disabilities, or
    need re-assessments, etc.
  • Other relevant stakeholders (e.g. line managers,
    HR staff) should be communicated with, regarding
    the procedures to be followed, and the nature of
    the candidate-client contract and other
    matters.

22
2c. Assessment delivery - execution
  • Stages that should be followed in carrying out
    the assessment:
  • Conduct the assessments, e.g. applications,
    biographical data, tests (ability, achievement,
    etc.), inventories, interviews, group exercises,
    simulations and work samples
  • Process data and interpret results (see slide 23)
  • Report and provide feedback (see slide 24)
  • Materials, data, and documentation should be
    filed in compliance with legal and contractual
    requirements throughout the whole process.
  • Documentation of procedures should be sufficient
    to deal with appeals, grievances, complaints etc.

23
2d. Assessment delivery - process data and
interpret results
  • Procedures shall be in place to ensure that
    assessment data are accurately processed (e.g.
    see EFPA test user standards, performance
    criteria 3.3)
  • Interpretation of results should
  • Be consistent with the information available
    about the assessment instrument or procedure
    documented in technical and user manuals and
    other scientific literature
  • Take account of the context of the assessment and
    the assessment need as described in the contract.
  • Where more than one assessment method or
    procedure has been carried out, a final overall
    interpretation of the data should be generated
    for each case (e.g. person, team or
    organization).

24
2e. Assessment delivery - report and provide
feedback
  • The people who make ultimate use of the
    information collected during the assessment
    should be provided with reports of the assessment
    that support that use.
  • The results should be reported in a form that is
    meaningful to the end user and will not result in
    over- or under-utilisation of the data. End users
    can include people, such as line managers, who
    make hiring decisions or have responsibility for
    staff development.
  • Feedback of results to the people who have been
    assessed should be provided if agreed in the
    assessment contract and if notified to them in
    the consent process.
  • Feedback of results should be accurate and
    relevant and be provided to recipients in a
    meaningful form
  • Feedback of results should be made with due
    consideration of the possible impact they might
    have.

25
3 Assessment Process - Review and follow-up
  • The client and the contractor should have a
    post-assessment review
  • What went well, what did not
  • Lessons learned
  • Specifically, client and contractor should
    review, in relation to Equitable Assessment
    criteria
  • Quality of assessment instruments and procedures
  • Quality of performance of assessors, raters and
    other assessment delivery personnel.
  • Procedure for integration of assessment data for
    use in decision process
  • Appropriateness of weight given to assessment
    data in decision process
  • Level of understanding of results by end users
    and implications for improving quality of
    reports.
  • Clients should be encouraged to carry out a
    longer-term follow-up
  • Review of criterion measures
  • Review of outcomes.
  • Review of consequences and impacts of the
    assessment intervention
  • Clients should be encouraged to participate in
    studies involving collection of data for future
    research, instrument revision, norming, etc

26
Guidelines considered
  • See separate document for list of references.

27
Glossary to be included
28
Notes
29
EA Criterion 5: Acceptability and Justice
  • Justice must not only be done, it must also be
    seen to be done
  • Gilliland's notions of Procedural Justice (the
    process was fair) and Distributive Justice (the
    outcome was fair).
  • Research on applicant reactions needs to show
    that applicant reactions matter (Anderson et al.,
    2004).
  • Perceptions of equity are determined largely by
    the quality of the information candidates are
    given about what they are being asked to do, why,
    and how their results are to be used.
  • Acceptability has more to do with how tests are
    used than with whether they are used.

30
EA Criterion 6: Practicality
  • How do we trade off costs and practicality issues
    against relevance and fairness?
  • Low-cost simple sift vs. high-cost assessment
    centre.
  • Can we measure the gain in equity in relation to
    cost and set a standard for what it is
    reasonable to expect people to pay?
  • E.g. use of the 4/5ths rule for defining adverse
    impact (a worked example follows this list)
  • We have to keep in mind the importance of
    balancing all the considerations and not be
    driven by short-term factors and cost alone.
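
As a concrete illustration of the 4/5ths rule
mentioned above, here is a minimal Python sketch. It
assumes adverse impact is screened by comparing each
group's selection rate with that of the group with
the highest selection rate; the group names and
counts are invented, and failing the screen would
normally prompt further investigation rather than a
final judgement.

# Worked example of the four-fifths (80%) rule for screening adverse impact:
# each group's selection rate should be at least 80% of the selection rate
# of the most-selected group. All counts are invented for illustration.

def adverse_impact_ratios(groups):
    """groups: dict mapping group name -> (number_selected, number_of_applicants)."""
    rates = {name: selected / applicants
             for name, (selected, applicants) in groups.items()}
    highest_rate = max(rates.values())
    return {name: (rate, rate / highest_rate, rate / highest_rate >= 0.8)
            for name, rate in rates.items()}

example = {"group_a": (48, 120),  # 40.0% selection rate
           "group_b": (21, 80)}   # 26.25% selection rate
for name, (rate, ratio, passes) in adverse_impact_ratios(example).items():
    print(f"{name}: selection rate {rate:.1%}, "
          f"impact ratio {ratio:.2f}, passes 4/5ths rule: {passes}")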

31
Equity is a two-way street
  • Test takers have responsibilities as well as
    rights.
  • ITC Guidelines on Test Use address this
  • Not only do job applicants, candidates etc expect
    their assessors to treat them equitably, they are
    also expected
  • To be honest in their disclosure
  • Not to cheat the justice system
  • SLAs and Honesty contracts provide an explicit
    psychological contract between assessor and
    candidate in recruitment and selection testing.

32
Example of an Honesty Contract
  • I understand that the results of this assessment
    may be used in the process of deciding whether my
    application will be progressed further.
  • I understand that if it is progressed, I may be
    required to complete a similar assessment again
    under supervised conditions.
  • I confirm that I will endeavour to complete the
    assessment in accordance with the instructions
    given.
  • I will do so on my own, without seeking or
    accepting assistance or support from any other
    person or persons.
  • I undertake neither to make copies of any of the
    questions that will appear on screen during this
    assessment nor to pass on to any other person any
    information about the content of the assessment.
  • Click to accept