Demonstrating Library Value: Practical Approaches to Effective and Sustainable Library Assessment

Transcript and Presenter's Notes

1
Demonstrating Library Value: Practical Approaches
to Effective and Sustainable Library Assessment
  • Steve Hiller
  • Director, Assessment and Planning
  • University of Washington Libraries, Seattle USA
  • Martha Kyrillidou
  • Director, Statistics and Service Quality Programs
  • Association of Research Libraries
  • Washington D.C.
  • Haifa University 2 April 2008

2
Free speech wall, Charlottesville, Sept 2006
3
Clear and present danger
  • Bangor University considers removing librarians
    Posted by Blake on Thursday, January 27, 2005 at
    07:30 AM (753 hits). Ms. Information writes:
    "News from the University of Wales Bangor in the
    UK. Senior management no longer feel that subject
    librarians / academic liaison librarians are
    needed in the modern academic library. They have
    made restructuring proposals which include
    removing all but one of the subject librarians
    and a tier of the library management, including
    the Head of Bibliographic Services. The
    university management thinks that technology has
    'deskilled' literature searching. As far as I
    know, this proposal is unprecedented in the
    United Kingdom. In essence, there will remain 4
    professional librarians serving a 'research-led'
    university of 8,000 plus FTEs and with 8 library
    sites. These will be the university librarian,
    cataloguing librarian, acquisitions librarian and
    Law librarian."
  • Has anything like this happened anywhere that you
    know of? If so, what have been the effects?

4
Assessment Librarian
5
What's Driving the Assessment Agenda?
  • Environmental Changes
  • Exploding growth in use and applications of
    technology
  • Increased customer expectations for services,
    including quality and responsiveness
  • Competition from other sources
  • Budgetary Constraints
  • Justification for spending on libraries
  • Increasing competition for resources
  • Cost savings and/or reallocation
  • Justifying our Existence
  • Fighting to remain relevant

6
The Challenge for Libraries
  • Traditional statistics/measures don't demonstrate
    the library's value
  • Emphasize inputs, expenditures, acquisitions,
    holdings: how big and how much
  • Can no longer tell the library's story
  • May not be aligned with library/organizational
    goals and plans
  • Don't define and measure service quality
  • Need metrics describing outcomes: success and
    value from the user's perspective
  • Provide the organizational culture, structure,
    tools, and skill sets needed for libraries to
    acquire and use this information
  • What difference do we make to our communities?

7
The Role of Library Assessment
  • Library assessment provides a structured process
    to learn about our communities, their work, and
    the library's connection to what they do
  • The information acquired through library
    assessment is used in an iterative manner to
    improve library programs and services and make
    our libraries responsive to the needs of our
    communities.
  • Libraries do not exist in a vacuum but are part
    of a larger organization and community.
    Assessment may take place at the library level as
    well as at the broader community level.

8
Library Assessment Provides Information That Can
Be Used for
  • Accountability and justification
  • Measure progress/achievement
  • Improvement of services
  • Comparison with others
  • Identify changing use and needs patterns
  • Remote/In-person
  • Relevancy of current services and resources
  • New services
  • Marketing and promotion
  • Telling our story
  • Decisions based on data, not assumptions
  • Assumicide!

9
Customer-Centered Library and the Culture of
Assessment
  • Customer-Centered Library
  • All services and activities are viewed through
    the eyes of the customers
  • Customers determine quality
  • Library services and resources add value to the
    customer
  • Culture of Assessment
  • Organizational environment in which decisions are
    based on facts, research, and analysis
  • Services are planned and delivered to maximize
    positive customer outcomes

It's about making our academic community
successful

10
What Do We Need to Know About Our Customers?
  • Who are our customers (and potential customers)?
  • What do they do? How is information used in
    their work?
  • How do they work? What's important to them?
  • How do libraries support work, learning, leisure
    and life?
  • How do they use library services? What would they
    change?
  • How do they differ from each other in library
    use/needs?
  • How does the library add value to their work or
    needs?
  • How does the library contribute to their success?

11
Understanding Our Communities is Critical
  • Your communities need library services which are
    embedded in their
  • Workflows
  • Learnflows
  • Leisureflows
  • Lifeflows
  • Engage with their languages and their processes
    of learning, research, leisure and life
  • Learn what's important to them
  • Because they may not come to you
  • Support learning, research and life where they
    occur and surface that support within their
    lifeflows

12
How Do We Get Customer Information?
  • Surveys
  • Usage statistics
  • Focus groups
  • Observation
  • Usability
  • Interviews
  • Embedding
  • Data mining (local, institutional)
  • Logged activities
  • Comments, suggestions, over the counter

13
Which Services Do We Assess?
  • Services that are strategically and operationally
    important
  • Services that are critical to the success of the
    library AND the community
  • Services that are mandated for review
  • High performing, heavily used services
  • Recognize their success and ensure sufficient
    resources
  • High performing, but underused services
  • Is marketing needed?
  • Is the service outmoded?
  • Poorly performing services
  • Can they be fixed? Should they be discontinued?
  • New services

14
Choosing the Assessment Method
  • Define the question and need before you start
  • Is there existing information you can use?
  • Is the method appropriate?
  • Is it timely?
  • Is it cost effective?
  • What expertise is needed to conduct and analyze?
  • How will the results be used?
  • Are you assessing what's important?
  • Are you assessing what's important to your
    community?

15
Using Existing Information
  • Community/institutional data sources
  • Previous assessment efforts (library/organization)
  • Library use data (including e-metrics)
  • Acquisition requests and interlibrary loan data
  • Computer/Web log data
  • Comparative or trend data from other sources

16
User Surveys
  • Is what I think really what is widely believed?
    (Generalizability)
  • Reaching large numbers of people
  • Comparing groups and subgroups
  • Measuring underlying concepts in a reliable and
    valid way
  • Recognized as a legitimate way of collecting
    information
  • Survey population factors
  • Sample size, representativeness, response rate,
    survey fatigue (see the sketch at the end of this
    slide)
  • Types: transaction-based, print-mailed, web,
    phone
  • Expertise needed for design, analysis and
    interpretation
  • Questions often asked from our perspective and in
    our language
  • Recognize the value of your respondents' time
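
A quick calculation helps keep generalizability expectations honest. The following is a minimal sketch (Python; the function names are illustrative, and the 4,000-faculty figure is an assumption drawn from the UW numbers later in this deck) of estimating a response rate and an approximate margin of error for a survey proportion.

```python
import math

def response_rate(respondents: int, invited: int) -> float:
    """Share of invited users who completed the survey."""
    return respondents / invited

def margin_of_error(n: int, population: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion,
    with a finite-population correction for small communities."""
    se = math.sqrt(p * (1 - p) / n)                       # worst-case standard error (p = 0.5)
    fpc = math.sqrt((population - n) / (population - 1))  # finite-population correction
    return z * se * fpc

# Roughly the 2007 UW faculty survey: 1,455 respondents out of about 4,000 faculty
print(f"Response rate:   {response_rate(1455, 4000):.0%}")        # ~36%
print(f"Margin of error: +/- {margin_of_error(1455, 4000):.1%}")  # ~2.0%
```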

17
Why would I (or you) respond to a survey?
  • Reasons for Survey Response
  • Civic duty
  • Personal connection
  • Authority
  • Public/social good
  • Self-interest
  • Reciprocation
  • Incentives
  • Feel strongly about . . .

18
Qualitative Approaches
  • Increasing use of such qualitative methods as
    comments, interviews, focus groups, usability,
    and observation
  • Statistics often can't tell us:
  • Who, how, why
  • Value, impact, outcomes
  • Qualitative provides information directly from
    users
  • Their language
  • Their issues
  • Their work
  • Qualitative provides context and understanding

19
Observational Studies
  • Describe user activities in terms of
  • what they do
  • how they do it
  • how much time they take
  • problems they encounter
  • Can be obtrusive or unobtrusive
  • Can be tied in with interviews or usability
  • Well-developed data collection method/protocol
    essential
  • Room counts/facilities use most common
  • Quick and inexpensive; can use sampling

20
Interviews and Focus Groups
  • Increasingly used for understanding user needs,
    work, behavior and outcomes
  • Purpose defined; questions should be well thought
    out
  • High degree of customer involvement
  • Clarify and add context to previously identified
    issues
  • Customer-defined language and issues
  • Objective and effective interviewer/facilitator
    needed
  • Can identify broader patterns, themes, and
    consistency, but results are not generalizable to
    the broader population
  • Interview/focus group themes can be followed up
    with other methods or they can add context to
    other data

21
Focus Groups
  • Structured discussion to obtain user perceptions
    and observations on a topic
  • Usually composed of 6-10 participants and may be
    repeated several times with different groups
  • Participants within group should be similar
  • Facilitator or moderator guides discussion
  • Participants encouraged to share perspectives
  • Participants learn from each other

22
Usability
  • User-centered design
  • Apply both to virtual (Web) and physical space
    (wayfinding)
  • Asks users to demonstrate
  • How easy is it to navigate
  • How easy is it to understand
  • How easy is it to find
  • How easy is it to perform
  • Overall how easy is it to use
  • Iterative process

23
Comments
  • Unsolicited
  • Unstructured (open suggestion boxes)
  • Prompted on specific issues (tell us what you
    think about . . . )
  • Part of another assessment tool (e.g. open-ended
    questions/comments on survey)
  • Comments are valuable for identifying issues of
    concern, and can add richness and texture. They
    generally need to be corroborated by other
    information

24
Once You Have Data . . . .
  • "...but to suppose that the facts, once
    established in all their fullness, will speak for
    themselves is an illusion."
  • Carl Becker
  • Annual Address of the President of the American
    Historical Association, 1931
  • ...come the difficult parts:
  • Understanding your data
  • Communicating results
  • Using the results for improvement

25
Use Results Wisely
  • Understand your results
  • Know the limitations of your data
  • Use appropriate analysis methods and tools
  • Comparative data provide context and
    understanding
  • Seek internal or external validation
  • Identify what is important and why

26
  • Mission
  • Shaping the future of research libraries in the
    changing environment of public policy and
    scholarly communication.
  • Members
  • 123 major research libraries in North America.
  • Ratios
  • 4 percent of the higher education institutions
    providing 40 percent of the information
    resources.
  • Users
  • Three million students and faculty served.
  • Expenditures
  • 40 percent is invested in access to electronic
    resources.

27
Assessment at ARL
  • A gateway to assessment tools: StatsQUAL
  • ARL Statistics -- E-Metrics
  • LibQUAL
  • DigiQUAL
  • MINES for Libraries
  • ClimateQUAL
  • Building the library assessment community
  • Library Assessment Conferences
  • Service Quality Evaluation Academy
  • Library Assessment Blog
  • Library Assessment Consulting Service
  • Effective, Sustainable, Practical Library
    Assessment

28
LibQUAL
  • Multidimensional user satisfaction survey
  • Focuses on the user's point of view (outcomes)
  • Highlights the difference between users' perceived
    level of service and their minimum/desired
    levels
  • Requires limited local survey expertise and
    resources
  • Tested and validated at more than 1000
    institutions on 5 continents and in 19 languages
  • Analysis available at local, national and
    inter-institutional levels
  • Offers opportunities for highlighting and
    improving your status within the institution

29
Survey Composition
  • 22 Core Questions
  • Affect of Service
  • Information Control
  • Library as Place
  • 5 Local Questions (optional)
  • 5 Information Literacy Questions
  • 3 General Satisfaction Questions
  • Library Usage Patterns
  • Demographics
  • Free Text Comments Box

30
Survey Structure
31
Zone of Tolerance
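
The zone of tolerance is the range between a user's minimum acceptable and desired service levels; LibQUAL reports where perceived service falls within it. Here is a minimal sketch (Python; hypothetical scores, not ARL code) of the gap scores behind that chart: the adequacy gap (perceived minus minimum) and the superiority gap (perceived minus desired).

```python
from dataclasses import dataclass

@dataclass
class LibQualItem:
    """One survey item rated on LibQUAL's 1-9 scale (hypothetical values)."""
    minimum: float    # minimum acceptable service level
    desired: float    # desired service level
    perceived: float  # perceived service level

    @property
    def adequacy_gap(self) -> float:
        """Positive when perceived service exceeds the minimum expectation."""
        return self.perceived - self.minimum

    @property
    def superiority_gap(self) -> float:
        """Usually negative: how far perception falls short of the desired level."""
        return self.perceived - self.desired

    @property
    def in_zone_of_tolerance(self) -> bool:
        """Perceived service sits between the minimum and desired levels."""
        return self.minimum <= self.perceived <= self.desired

# Hypothetical "Information Control" item
item = LibQualItem(minimum=6.0, desired=8.0, perceived=7.0)
print(item.adequacy_gap)          # 1.0  (above the floor)
print(item.superiority_gap)       # -1.0 (short of the ideal)
print(item.in_zone_of_tolerance)  # True
```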
32
Visual Representation of Data
33
Building Assessment Capability in Libraries
through Consultation Services
  • ARL project "Making Library Assessment Work"
    approved in 2004
  • Funded by participating libraries
  • Conducted by Steve Hiller and Jim Self; 1.5-day
    site visit
  • Presentation and best practices
  • Interviews and meetings
  • Report to the library with recommendations
  • 24 libraries in U.S. and Canada visited in
    2005-06
  • Succeeded by "Effective, Sustainable and Practical
    Library Assessment" in 2007 (open to all
    libraries)
  • 15 libraries participating in 2007-08
  • Including 4 outside North America

34
What We Found: Organizational Development and
Structure Issues Are Critical
  • Strong interest in using assessment to improve
    customer service and demonstrate value of library
  • Many libraries are uncertain about how to
    establish, maintain, and sustain effective
    assessment and need assessment skills
  • Effectiveness of assessment program not dependent
    on library size or budget
  • Each library has a unique culture and mission; no
    one-size-fits-all approach works
  • Strong customer-focus and leadership support are
    keys to developing effective and sustainable
    assessment

35
Effective Assessment
  • Is customer-focused
  • Is aligned with library and organizational goals
  • Assesses what is important
  • Develops criteria for success
  • Uses appropriate and multiple assessment methods
  • Uses corroboration from other sources
  • Provides results that can be used

36
Sustainable Assessment Needs . . .
  • Organizational leadership
  • Sufficient resources
  • Supportive organizational culture
  • Identifiable organizational responsibility
  • Connection to strategic planning and library
    priorities
  • Iterative process of data collection, analysis,
    and use
  • Involvement of customers, staff and stakeholders

37
Practical Assessment
  • Keep it simple and focused: less is more
  • Know when enough is enough
  • Use assessment that adds value for customers
  • Present results that are understandable
  • Organization is prepared to act on results

38
A Strategic Framework for Assessment: The Balanced
Scorecard at the University of Virginia Library
  • Provides performance measurement indicators
    (measures), based on library objectives
  • Reflects the organization's mission and
    strategies
  • Evaluates current performance and potential for
    the future

39
Core Questions
  • User Perspective
  • How well is the library meeting user needs?
  • Internal Processes
  • Do the library's processes function efficiently?
  • Finance
  • How well are the library's finances managed?
  • Learning and Growth
  • Is the library well positioned for the future?

40
Choosing the Metrics
  • Reflecting Values
  • What is important?
  • What are we trying to accomplish?
  • Diversity and Balance
  • Innovations and operations
  • Variety of measurements
  • Ensuring validity
  • Does the measurement accurately reflect
    reality?
  • Being Practical
  • Use existing measures when possible
  • Use sampling
  • Collect data centrally
  • Minimize work by front line

41
Balanced Scorecard Examples
  • User Perspective
  • Overall rating in student and faculty surveys
  • Target 1: An average score of at least 4.25 (out
    of 5.00) from each of the major constituencies
  • Target 2: A score of at least 4.00
  • Internal Processes
  • Processing time for routine acquisitions
  • Target 1: Process 90% of in-print books from North
    America within one month
  • Target 2: Process 80% of in-print books from North
    America within one month

42
Balanced Scorecard Examples
  • Finance
  • Unit Cost (cost per download) of Electronic
    Serial Use
  • Target 1: There should be no increase in unit cost
    each year
  • Target 2: Less than a 5% annual increase in unit
    cost (a calculation sketch follows at the end of
    this slide)
  • Learning
  • Comparing librarian salaries to peer groups
  • Target 1: Average librarian salaries should rank
    in the top 40% of average salaries at ARL
    libraries
  • Target 2: Rank in the top 50%
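
As a back-of-the-envelope illustration of the unit cost metric above, here is a minimal sketch (Python; the figures and function names are hypothetical, not University of Virginia data) that computes cost per download and rates the year-over-year change against the two finance targets.

```python
def unit_cost(total_cost: float, downloads: int) -> float:
    """Cost per download for electronic serials."""
    return total_cost / downloads

def rate_against_targets(previous: float, current: float) -> str:
    """Target 1: no increase in unit cost; Target 2: less than a 5% annual increase."""
    change = (current - previous) / previous
    if change <= 0:
        return f"Target 1 met ({change:+.1%})"
    if change < 0.05:
        return f"Target 2 met ({change:+.1%})"
    return f"Neither target met ({change:+.1%})"

# Hypothetical figures: $2.4M over 800,000 downloads last year,
# $2.5M over 810,000 downloads this year
last_year = unit_cost(2_400_000, 800_000)   # $3.00 per download
this_year = unit_cost(2_500_000, 810_000)   # about $3.09 per download
print(rate_against_targets(last_year, this_year))  # Target 2 met (+2.9%)
```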

43
University of Washington (Site of the 2008
Library Assessment Conference!)
  • Located in beautiful Seattle (metro population
    3.2 million)
  • Comprehensive public research university
  • 27,000 undergraduate students
  • 12,000 graduate and professional students (80
    doctoral programs)
  • 4,000 research and teaching faculty
  • $800 million annually in federal research funds
    (2nd in the U.S.)
  • Large research library system
  • $40 million annual budget
  • 150 librarians on 3 campuses

44
UW Libraries Assessment Priorities: Customer
Needs, Use Patterns, and Success
  • Information seeking behavior and use
  • Patterns of library use
  • Value of library
  • User needs
  • Library contribution to customer success
  • User satisfaction with services, collections,
    overall
  • Data to make informed and wise decisions that
    lead to resources and services that contribute to
    user success

45
UW Libraries Assessment Methods Used
  • Large-scale user surveys every 3 years
    (triennial survey): 1992, 1995, 1998, 2001,
    2004, 2007
  • All faculty
  • Samples of undergraduate and graduate students
  • Research scientists and Health Sciences
    fellows/residents (2004- )
  • In-library use surveys every 3 years beginning
    1993
  • Focus groups/Interviews
  • Observation (guided and non-obtrusive)
  • Usability
  • Use statistics/data mining
  • Information about the assessment program is
    available at
  • http://www.lib.washington.edu/assessment/

46
Case Study: UW Libraries Biosciences Review
  • Reasons for review
  • Better understand how bioscientists work
  • Growing inter-, multi-, and transdisciplinary work
  • Significant change in use patterns
  • Libraries' responsiveness to these changes
  • Value of research enterprise to the University
  • Strengthening library connection to research
  • Making good decisions on services and resources
  • Ensuring our services and resources support the
    work of the UW biosciences community

47
Biosciences Review Process (2006)
  • Define scope (e.g. what is bioscience?)
  • Identify and mine existing data sources
  • Extensive library assessment data
  • Institutional and external data
  • Acquire new information through a
    customer-centered qualitative approach
  • Environmental scan
  • Interviews (faculty)
  • Focus groups (faculty, grad students, undergrads)
  • Peer library surveys

48
Biosciences Interview and Focus Group Questions
  • Interviews
  • Describe your work
  • What type of information do you need and how do
    you find it?
  • What barriers do you encounter?
  • What role does the library play in connecting you
    to information?
  • What else could the library do to support your
    work?
  • Focus Groups
  • How do you find the information you need?
  • What barriers do you encounter?
  • What other services would you like to have
    available?
  • What services in the library do you like or
    appreciate?

49
Biosciences Interview/Focus Group Themes
  • Content is primary link to the library
  • Identify library with e-journals; want more titles
    and backfiles
  • Provide library-related services and resources in
    our space, not yours
  • Discovery begins primarily outside of library
    space with Google and PubMed; Web of Science
    also important
  • Library services/tools seen as overly complex and
    fragmented
  • Print is dead, really dead
  • If not online, want digital delivery; too many
    libraries
  • Go to physical library only as last resort
  • Difficulty coming up with new library services
  • Graduate students interested in data management
    services

50
Biosciences Task Force Recommendations
  • Integrate search/discovery tools into users'
    workflows
  • Expand/improve information/service delivery
    options
  • Make physical libraries more inviting/easier to
    use
  • Consolidate libraries, collections and service
    points
  • Reduce print holdings; focus on services
  • Use an integrated approach to collection
    allocations
  • Get librarians to work outside library space
  • Lead/partner in scholarly communications and
    e-science
  • Provide more targeted communication and marketing

51
Biosciences Review Follow-up: 2007 Actions
  • Appointed a Director of Cyberinfrastructure
    Initiatives / Special Assistant to the University
    Librarian for Biosciences and E-Science
  • Meet with interested parties on campus
  • Helped shape the Libraries' Strategic Plan
    Priority Initiatives
  • Improve discovery to delivery (WorldCat Local
    etc.)
  • Reshape our physical facilities as discovery and
    learning centers
  • Strengthen existing delivery services, both
    physical and digital, while developing new, more
    rapid delivery services
  • Enhance and strengthen the Libraries' support for
    UW's scientific research infrastructure
  • Do market research before developing and promoting
    services
  • Informed development of the Libraries' 2007
    Triennial Survey

52
Confirming Findings
  • Did themes raised in the interviews/focus groups
    reflect the bioscience population? The campus
    community? The 2007 Triennial Survey served as a
    corroborating source
  • Related Questions
  • Mode of access (in-person, remote)
  • Resource type importance
  • Sources consulted for research
  • Primary reasons for using Libraries' Web sites
  • Libraries' contribution to work and academic
    success
  • Useful library services (new and/or expanded)

53
UW Triennial Library Survey: Number of Respondents
and Response Rate, 1992-2007. The large number of
respondents allows for analysis within groups
(illustrated in the sketch below).

Group               2007        2004        2001        1998        1995        1992
Faculty             1455 (36%)  1560 (40%)  1345 (36%)  1503 (40%)  1359 (31%)  1108 (28%)
Grad/Prof Students   580 (33%)   627 (40%)   597 (40%)   457 (46%)   409 (41%)   560 (56%)
Undergrads           467 (20%)   502 (25%)   497 (25%)   787 (39%)   463 (23%)   407 (41%)
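
To show the kind of within-group analysis the table supports, here is a minimal sketch (Python; the rates are simply re-keyed from the table above) that summarizes how response rates shifted for each group between 1992 and 2007. The student declines are consistent with the survey-fatigue concern raised earlier, while faculty participation held up.

```python
# Response rates re-keyed from the table above (year -> rate)
response_rates = {
    "Faculty":            {1992: 0.28, 1995: 0.31, 1998: 0.40, 2001: 0.36, 2004: 0.40, 2007: 0.36},
    "Grad/Prof Students": {1992: 0.56, 1995: 0.41, 1998: 0.46, 2001: 0.40, 2004: 0.40, 2007: 0.33},
    "Undergrads":         {1992: 0.41, 1995: 0.23, 1998: 0.39, 2001: 0.25, 2004: 0.25, 2007: 0.20},
}

for group, by_year in response_rates.items():
    first, last = by_year[1992], by_year[2007]
    delta_points = (last - first) * 100
    print(f"{group}: {first:.0%} (1992) -> {last:.0%} (2007), "
          f"{delta_points:+.0f} percentage points")
```
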
54
(No transcript)
55
(No transcript)
56
Frequency of Off-Campus/Remote Use (% connecting
at least 2x per week)
57
Primary Reasons for Faculty Use of Libraries' Web
Sites (at least 2x per week)
58
Sources Consulted for Information on Research
Topics (Scale of 1 = Not at All to 5 = Usually)
59
Importance of Books, Journals, and Databases by
Faculty Academic Area
60
Libraries' Contribution to . . . (Scale of 1 =
Minor to 5 = Major)

61
Usefulness of New/Expanded Services: Faculty and
Grad Students
62
Survey Follow-Up Actions
  • Probe deeper on specific library contributions to
    research and student academic success using
    qualitative methods
  • Nursing student focus groups beginning Spring
    2008
  • Review scope and effectiveness of information
    literacy programs
  • Develop plan to deliver print content to faculty
    and grad students in their format of choice and
    in their space
  • Pilot test of scan-on-demand began January 2008
  • Strengthen our subject librarian liaison efforts
    to better understand and support research in
    their areas
  • Develop standardized toolkit for assessing the
    library's connection to the research enterprise
  • Revisit scholarly communications policy
  • Integrate library services and resources into user
    workflows

63
How UW Libraries Has Used Assessment
  • Extend hours in Undergraduate Library (24/5.5)
  • Create more diversified student learning spaces
  • Eliminate print copies of journals
  • Enhance usability of discovery tools and website
  • Provide standardized service training for all
    staff
  • Stop activities that do not add value
  • Consolidate and merge branch libraries
  • Change/reallocate collections budget
  • Change/reallocate staffing
  • Support budget requests to University

64
Overall Satisfaction by Group 1995-2007
65
Assessment can
  • Support decision making
  • Establish accountability
  • Strengthen organizational mission, goals, and
    objectives
  • Realign library goals and mission statements with
    the needs of users
  • Identify trends in user behaviors and technology
  • Measure learning outcomes and impact
  • Improve programs and services
  • Help us know our customers better

66
Thank you
  • Steve Hiller
  • hiller@u.washington.edu
  • Martha Kyrillidou
  • martha@arl.org