Transcript and Presenter's Notes

Title: Library Services Assessment


1
Library Services Assessment
  • Isla Jordan, Carleton University
  • Julie McKenna, University of Regina
  • February 2, 2007
  • OLA Super Conference 2007 Session 1408

2
Outline
  1. Definition and Purpose
  2. Survey of Assessment Practices
  3. Types of Assessment
  4. Benchmarks, Standards and EBL
  5. Drivers of Assessment
  6. Tools and Techniques
  7. Assessment strategy
  8. Questions

3
Assessment
  • "a critical tool for understanding library
    customers and offering services, spaces,
    collections, and tools that best meet their
    needs. Without good assessment, libraries could
    lose touch with users' desires and needs and even
    become irrelevant."
  • Nardini (2001)

4
Assessment
  • "any activities that seek to measure the
    library's impact on teaching, learning and
    research as well as initiatives that seek to
    identify user needs or gauge user satisfaction or
    perceptions with the overall goal being the
    data-based and user-centered continuous
    improvement of our collections and services."
  • Pam Ryan,
  • libraryassessment.info

5
The purpose of assessment in libraries
  • To understand user interaction with library
    resources and services; and
  • To capture data that inform the planning,
    management and implementation of library
    resources and services.
  • Bertot, 2004

6
Survey of Assessment Practices in Canadian
University Libraries
  • Winter 2007

7
Survey of Assessment Practices - Purpose
  1. Benchmark services assessment practice
  2. Capture some measures about the culture of
    assessment

8
Survey Sections
  • Demographic Information
  • Assessment Planning
  • Involvement in Assessment in Organization
  • Collection and Use of Data to Inform
    Decision-Making
  • Final Comments

9
Survey Participants
  • Invitation to complete a web-based survey sent
    to all University Librarians of
  • Council of Prairie and Pacific University
    Libraries (COPPUL)
  • Ontario Council of University Libraries (OCUL)
  • Council of Atlantic University Libraries
    (CAUL/AUBO)
  • Invitation (February 12, 2007) to complete a
    French edition of the web-based survey
  • members of Conférence des recteurs et des
    principaux des universités du Québec (CREPUQ)

10
Survey of Assessment Practices
  • English Survey
  • 60 invitations, 39 respondents
  • 65% response rate
  • French Survey
  • To launch February 12, 2007

11
Thank you to
  • University of Toronto
  • UWO
  • Queen's University
  • McMaster
  • University of Windsor
  • York University
  • University of Guelph
  • Nipissing University
  • University of Waterloo
  • Carleton University
  • Brock University
  • Memorial University
  • University of Saskatchewan
  • UBC
  • University of Alberta
  • And many more.

12
Types of Assessment
  1. Input & Output
  2. Service Quality
  3. Performance Measures
  4. Outcomes or Impact

13
1. Input & Output
  • Input measures: expenditures, resources
  • Funding allocations, # of registered students,
    print holdings, etc.
  • Output measures: activities, service traffic
  • Reference transactions, lending and borrowing
    transactions, # of instruction sessions, program
    attendance, etc.
  • Ratios (a worked sketch follows this list)
  • Students/librarian, print volume
    holdings/student, reference transactions/student,
    etc.
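
The ratio measures above are plain divisions of one count by
another. A minimal sketch in Python, with every figure
hypothetical:

```python
# Deriving ratio measures from raw input and output counts.
# All figures below are hypothetical.

inputs = {"librarians": 40, "print_volumes": 1_200_000}
outputs = {"reference_transactions": 55_000}
students = 18_000  # hypothetical FTE student count

print(f"students per librarian:         {students / inputs['librarians']:.1f}")
print(f"print volumes per student:      {inputs['print_volumes'] / students:.1f}")
print(f"reference transactions/student: {outputs['reference_transactions'] / students:.2f}")
```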

14
Survey Results: How output data is used
  • Decision-making
  • Hours
  • Staffing & scheduling
  • Service points
  • Collection decisions
  • Type of data
  • Gate count
  • Body counts
  • Reference transactions
  • Circulation statistics

15
2. Service Quality
  • Services: defined as all programs, activities,
    facilities, events, etc.
  • Measures capture results from interactions with
    services
  • Subjective evaluation of customer service
  • Measure of the affective relationship

16
"The only criteria that count in evaluating
service quality are defined by customers. Only
customers judge quality; all other judgments
are essentially irrelevant."
  • (Zeithaml, Parasuraman and Berry 1990)

17
LibQUAL
  • Association of Research Libraries
  • Standard for service quality assessment (2003)
  • Total market survey
  • Based on gap analysis theory (see the gap-score
    sketch below)
  • Measures user perceptions and expectations of
    services
  • Measures outcomes and impacts
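
Gap analysis in LibQUAL's sense asks each respondent to rate the
minimum acceptable, desired, and perceived level of a service on
a 1-9 scale. A sketch of the two resulting gap scores, with
made-up ratings:

```python
# Gap scores behind a LibQUAL-style analysis. Each tuple holds one
# respondent's (minimum, desired, perceived) ratings; the values
# are made up for illustration.

ratings = [
    (5, 8, 6),
    (4, 9, 7),
    (6, 8, 5),
]

def mean(values):
    return sum(values) / len(values)

minimum = mean([r[0] for r in ratings])
desired = mean([r[1] for r in ratings])
perceived = mean([r[2] for r in ratings])

# Positive adequacy gap: service exceeds the minimum acceptable level.
print(f"adequacy gap (perceived - minimum):    {perceived - minimum:+.2f}")
# Negative superiority gap: service falls short of the desired level.
print(f"superiority gap (perceived - desired): {perceived - desired:+.2f}")
```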

18
3. Performance Measures
  • Involves the use of efficiency and effectiveness
    measures
  • Availability of resources
  • Usability of programs, resources and services
  • Web page analysis
  • Content analysis
  • Functionality analysis
  • Cost analysis

19
4. Outcomes or Impacts
  • "the ways in which library users are changed as
    a result of their interaction with the Library's
    resources and programs"

Association of College and Research Libraries Task
Force on Academic Library Outcomes Assessment
Report, 1998
20
Examples
  • The electronic journals were used by 65 scholars
    in the successful pursuit of a total of $1.7
    million in research grants in 2004.
  • In a 2003 study, eighty-five percent of new
    faculty reported that library collections were a
    key factor in their recruitment.

21
LibQUAL Measures Outcomes
  • The library helps me stay abreast of developments
    in my field(s) of interest.
  • The library aids my advancement in my academic
    discipline.
  • The library enables me to be more efficient in my
    academic pursuits.
  • The library helps me distinguish between
    trustworthy and untrustworthy information.

22
Benchmarks, standards and EBL
  • Standards: measures that tie the value of
    libraries more closely to the benefits they
    create for their users
  • NISO 2001 (National Information Standards
    Organization)
  • Benchmarking: "improving ourselves by learning
    from others" (UK Public Sector Benchmarking
    Service)

23
Benchmarks, standards and EBL
  • EBL (Evidence Based Librarianship) "attempts to
    integrate user-reported, practitioner-observed
    and research-derived evidence as an explicit
    basis for decision-making."
  • (Booth, "Counting What Counts", 2006)

24
Example of a Standard
  • Example: Information Literacy Standards for
    Science and Engineering/Technology (ACRL 2006)
  • Standard 1: The information literate student
    determines the nature and extent of the
    information needed.
  • Performance Indicator 3: The information
    literate student has a working knowledge of the
    literature of the field and how it is produced.
  • Outcome a: "... student knows how scientific,
    technical, and related information is formally
    and informally produced, organized, and
    disseminated."

25
CACUL Standards Committee
  • Goals
  • Add Canadian context to existing standards in
    college and university libraries, e.g. ACRL
  • prepare report for CACUL AGM at CLA 2007
  • form new team in summer 2007
  • contact Jennifer Soutter, jsoutter@uwindsor.ca

26
Survey Results: Drivers of Assessment
  • University Library Administration 92%
  • Need for evidence to inform planning 87%
  • University Administration 62%
  • CARL, ARL or regional library consortium 54%

27
Multiple Methods of Listening to Customers
  • Transactional surveys
  • Mystery shopping
  • New, declining, and lost-customer surveys
  • Focus group interviews
  • Customer advisory panels
  • Service reviews
  • Customer complaint, comment, and inquiry capture
  • Total market surveys
  • Employee field reporting
  • Employee surveys
  • Service operating data capture

Note. A. Parasuraman. The SERVQUAL Model: Its
Evolution and Current Status. (2000). Paper
presented at ARL Symposium on Measuring Service
Quality, Washington, D.C.
28
Canadian Adoption of LibQUAL: Benefits
  • Quick, inexpensive
  • Standardized and tested instrument and practice
  • Data set of comparables for Canada
  • Insight into best practices at peer
    institutions
  • Build staff expertise and encourage evidence
    based practice and practitioners
  • Opportunity to introduce Canadian changes to
    instrument

29
User Surveys: LibSAT, LibPAS
  • continuous customer feedback
  • LibSAT measures satisfaction
  • LibPAS (beta) measures performance
  • http://www.countingopinions.com/

30
Usability testing
  • gives user perspective
  • often for website design
  • e.g. user-driven web portal design (U Toronto
    2006)
  • also for physical space
  • e.g. wayfinding in library:
    http://www.arl.org/arldocs/stats/statsevents/laconf/2006/Kress.ppt

31
Instruction Program Example: Assessment Methods
  • Learning outcomes
  • Student performance on examinations, assignments
  • Pre- and post-test results
  • Level of "information literacy"
  • Program service measures (outputs)
  • # of instruction sessions offered, # of requests
    for course-specific support, # of session
    attendees (by discipline, by faculty member, by
    course), # of logins to library-created online
    tutorials, # of course pages created within the
    university's learning portal, etc.
  • Student course evaluations & peer evaluations
  • Qualitative and quantitative
  • Service quality assessment
  • LibQUAL (gap between expectations and
    perceptions)

32
Examples
  • Use patterns
  • laptop loans, GIS over paper maps, eBooks
  • Space usage studies
  • e.g. Learning Commons study (University of
    Massachusetts Amherst)
  • Instruction and Information Literacy
  • e.g. use of online learning modules

33
Electronic resources assessment
  • statistics not being systematically captured for
    digital collections or services
  • the need for standard measures of digital
    collection use is increasingly important:
  • to justify the huge expense of electronic
    collections
  • decline in use of traditional services
    (reference, ILL)

34
Electronic resources assessment
  • COUNTER: real-time acquisition of usage
    statistics
  • imports usage statistics from content vendors in
    a uniform format (COUNTER: Counting Online Usage
    of Networked Electronic Resources)
  • reduces the need to retrieve statistical data on
    a resource-by-resource basis
  • can compare usage statistics with cost
    information to evaluate the service benefits of
    e-resources (see the cost-per-use sketch below)
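
The cost comparison in the last bullet is a simple division once
usage and cost sit side by side. A sketch, with hypothetical
titles, costs, and COUNTER-style request counts:

```python
# Cost-per-use from COUNTER-style full-text request counts.
# Titles, costs, and counts are hypothetical.

subscriptions = {
    # title: (annual cost in dollars, full-text article requests)
    "Journal A": (4500.00, 9120),
    "Journal B": (2200.00, 310),
}

for title, (cost, uses) in subscriptions.items():
    print(f"{title}: ${cost / uses:.2f} per article request")
```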

35
Electronic resources assessment
  • Output statistics for Scholars Portal databases
    and e-journals, e.g.
  • the number of requests for articles
  • holdings of different aggregators, to see overlap
  • web logs, to see patterns of use (see the log
    tally sketch below)
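
A minimal sketch of the web-log tally, assuming a
common-log-format file whose name is hypothetical:

```python
# Count requests per path in a web server access log to surface
# patterns of use. Assumes common-log-format lines such as:
#   127.0.0.1 - - [02/Feb/2007:10:00:00] "GET /ejournals/x HTTP/1.1" 200 512

import re
from collections import Counter

hits = Counter()
with open("access.log") as log:  # hypothetical log file
    for line in log:
        match = re.search(r'"(?:GET|POST) (\S+)', line)
        if match:
            hits[match.group(1)] += 1

for path, count in hits.most_common(5):
    print(f"{count:6d}  {path}")
```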

36
Survey Results: Statistics - electronic resources
37
Survey Results: Electronic resources assessment
  • "we are gathering e-resources stats as part of an
    overall journal review"
  • "The Library is currently reviewing Scholarly
    Statistics, a product designed to gather and
    present e-resource statistics for analysis. Also
    under consideration is an ERM which, along with
    its other capabilities, will provide statistical
    analysis."

38
Electronic resources assessment
  • "I have been busy this week with the compilation
    of electronic journal usage statistics for ARL.
    To complete Section 15 (Number of successful
    full-text article requests) in the Supplementary
    Statistics section, I am limiting myself to
    COUNTER-compliant JR1 statistics provided by the
    publisher. Still, I am encountering unexpected
    complexities... The JR1 format is based on the
    calendar year, but the ARL statistics are
    reported on the budget year. This means for every
    publisher I have to compile two years' worth of
    data and manipulate it."
  • http://www.libraryassessment.info/
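
The splice the quote describes amounts to stitching the tail of
one calendar year onto the head of the next. A sketch, assuming a
May-April budget year and made-up monthly JR1 counts:

```python
# Combine monthly JR1 counts from two calendar years into one
# budget year. The fiscal boundary and all counts are assumptions.

jr1_2005 = [410, 388, 402, 395, 377, 310, 250, 298, 420, 433, 415, 380]  # Jan-Dec 2005
jr1_2006 = [425, 400, 390, 405, 370, 305, 260, 300, 430, 440, 410, 390]  # Jan-Dec 2006

# Budget year 2005/06 = May 2005 through April 2006
budget_year = jr1_2005[4:] + jr1_2006[:4]
print(f"successful full-text article requests, FY 2005/06: {sum(budget_year)}")
```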

39
Surveys, Interviews, Focus Groups
  • Surveys
  • quick to implement, difficult to design
  • identify issues, pick up anomalies
  • wording is critical
  • test, test, test ...
  • users are over-surveyed
  • Interviews and focus groups
  • more scope for follow-up, explanation
  • subjective, time-consuming

40
Survey Results: Top 5 planned assessment studies
  1. User satisfaction survey / LibQUAL
  2. Gate traffic study
  3. Electronic database use
  4. Electronic journal use
  5. Usability of the website

41
Survey Results: Staff Abilities
  • Strengths
  • Formal presentations
  • Formal reports
  • Draw conclusions
  • Make recommendations
  • Project management
  • Facilitate focus groups
  • Weaknesses
  • Sampling
  • Research design
  • Focus group research
  • Survey design
  • Qualitative analysis

42
Challenges of assessment
  • Gathering meaningful data
  • Acquiring methodological skills
  • Managing assessment data
  • Organizing assessment as a core activity
  • Interpreting data within the context of user
    behaviours and constraints.
  • (Troll Covey, 2002)

43
Survey Results: Where is assessment placed?
  • Assessment Librarian (2 institutions)
  • Assessment Coordinator
  • Libraries Assessment and Statistics Coordinator
  • Library Assessment and Information Technology
    Projects Coordinator
  • Librarian, Evaluation & Analysis
  • Manager, Evaluation & Analysis

44
Survey Results: Who else is assigned assessment
responsibility?
  • distributed to all unit heads or team leaders (4)
  • AULs have responsibility (6)
  • UL or Director (3)
  • administrative or executive officer (4)
  • access services or circulation (3)
  • other positions (12)

45
Survey Results: Committees
  • Assessment Committee
  • Priorities and Resources Committee
  • Statistics Committee
  • LibQUAL Committee
  • LibQUAL Working Group
  • Library Services Assessment Committee
  • Community Needs Assessment Committee
  • PR/Communications Committee
  • Accreditation Self-Study Steering Committee
  • Senior Management Group
  • Cooperative Planning Team

46-49
(No transcript)
50
Services Assessment Strategy
  • "The evaluation environment is increasingly
    complex, and requires knowledge of multiple
    evaluation frameworks, methodologies, data
    analysis techniques, and communication skills."

Note. J.T. Snead et al. Developing Best-Fit
Evaluation Strategies. (2006). Paper presented at
Library Assessment Conference, Virginia.
51
Assessment: A Continuing Commitment
  • (cycle diagram linking Research Question,
    Methodology, Analysis, and Reporting)
52
Services Assessment Strategy
  • Decide what you need to know and why
  • Assign priorities
  • Confirm timelines
  • Commit to and carry out methodologies for
    discovery
  • Analysis and reporting
  • Continuous assessment and reporting commitment

53
Culture of Assessment
  • is an organizational environment in which
    decisions are based on facts, research and
    analysis
  • where services are planned and delivered in ways
    that maximize positive outcomes and impacts for
    customers and stakeholders
  • exists in organizations where staff care to know
    what results they produce and how those results
    relate to customers' expectations
  • organizational mission, values, structures, and
    systems support behavior that is performance and
    learning focused.
  • (Lakos, Phipps and Wilson, 1998-2002)

54
Resources
  • ARL
  • ARL New Measures website (background info)
  • Canadian LibQUAL consortium
  • summer 2007 workshop
  • Sam Kalb, kalbs@post.queensu.ca
  • Service Quality Evaluation Academy (boot camp)

55
Resources
  • ARL (cont'd)
  • ARL visit: Making Library Assessment Work
  • 1½-day visit from Steve Hiller and Jim Self
  • pre-visit survey, presentation to staff,
    interviews, meetings, written report
  • UWO participated - for more information, contact
    Margaret Martin Gardiner, mgardine@uwo.ca
  • 2006 Library Assessment Conference:
    http://new.arl.org/stats/statsevents/laconf/index.shtml

56
Resources
  • Assessment blog
  • libraryassessment.info
  • Journals, conferences
  • Performance Measurement and Metrics
  • Evidence Based Library and Information Practice
  • Northumbria International Conference on
    Performance Measures

57
Resources
  • Books & Papers
  • Blecic, D.D., Fiscella, J.B. and Wiberley, S.E.
    Jr. (2007) Measurement of Use of Electronic
    Resources: Advances in Use Statistics and
    Innovations in Resource Functionality, College &
    Research Libraries, 68 (1), 26-44.
  • Booth, A. (2006) Counting what counts:
    performance measurement and evidence-based
    practice. Performance Measurement and Metrics, 7
    (2), 63-74.
  • Brophy, P. (2006) Measuring Library Performance:
    principles and techniques, London, Facet
    Publishing.

58
Resources
  • Books & Papers
  • Bertot, J.C. et al. (2004) Functionality,
    usability, and accessibility: Iterative
    user-centered evaluation strategies for digital
    libraries. Performance Measurement and Metrics, 7
    (1), 17-28.
  • Brekke, E. (1994) User surveys in ARL libraries.
    SPEC Kit 205, Chicago, American Library
    Association.
  • Covey, D.T. (2002) Academic library assessment:
    new duties and dilemmas, New Library World, 103
    (1175/1176), 156-164.

59
Resources
  • Books & Papers
  • Lakos, A., Phipps, S. and Wilson, B. (1998-2000)
    Defining a Culture of Assessment.
    http://personal.anderson.ucla.edu/amos.lakos/Present/North2001/Culture-of-Assessment-2001-4.pdf
  • Nardini, H.G. (2001) Building a Culture of
    Assessment, ARL Bimonthly Report, 218 (Oct 2001).
    http://www.arl.org/resources/pubs/br/index.shtml

60
Resources
  • Books & Papers
  • Snead, J.T. et al. (2006) Developing Best-Fit
    Evaluation Strategies. Library Assessment
    Conference, Virginia.
    <http://www.arl.org/stats/statsevents/laconf/06schedule.shtml>
  • Zeithaml, V.A., Parasuraman, A. and Berry, L.L.
    (1990) Delivering Quality Service: balancing
    customer perceptions and expectations, London,
    Collier Macmillan.

61
Thank you!
  • Questions or comments are welcome

62
Contact us
  • Isla Jordan, Carleton University
  • Isla_Jordan@carleton.ca
  • Julie McKenna, University of Regina
  • Julie.McKenna@uregina.ca