1. Making Incremental Improvements to Public Library Comparative Statistical Practices
Ray Lyons and Jason Holmes
Library Assessment Conference, Seattle, Washington, August 5, 2008
2. The Basic Problem
- The ultimate goal of the library is public enlightenment.
- It is difficult to assess our impact on the enlightenment of the community because we have no way to measure it.
- We measure what we CAN measure.
- We compare what we CAN compare.
3. Context of Comparative Statistics
- Assessment: measures used as part of a more general process of assessing library value and effectiveness
- Management practice: measures intended for an iterative and ongoing process of performance measurement
4. Public Library Assessment
- The library profession traditionally applies a systems or industrial model
- Inputs: resources supplied to the library
- Outputs: products and services
[Diagram: Input → Throughput → Output]
5. Performance Measurement Model
[Diagram: EFFORTS (Inputs → Outputs) lead to RESULTS (Intermediate Outcomes → End Outcomes); outcome measures are attached to the intermediate and end outcomes.]
6. Performance Measurement Steps
- Define long-term goals (desired outcomes)
- Define medium- and short-term objectives
- Develop programs aimed at objectives
- Specify measurement indicators
- Monitor indicators to track accomplishments
7. Rationale for Standardized Statistics
- PLA Planning-for-Results approach to library management
- Abandonment of established operational and performance standards
- ALA / PLA 1987 publication, Output Measures for Public Libraries: A Manual of Standardized Procedures, defines standard statistics and collection procedures
8. ALA / PLA Approach to Standardized Statistics
- Useful for self-evaluation based on the service response choices the library makes
- Should be interpreted with respect to library mission, goals, and objectives
- Interpretation left up to the library
9. Current Practices in Comparative Library Statistics
- How are library statistics currently used for comparing public libraries?
- What are the bases for these uses?
- What purposes do they serve?
10. Survey of Ohio Public Libraries on Use of Comparative Statistical Measures
- Exploratory study
- Available at http://www.plstatreports.com/compare/UsePerceptComparStats.pdf
- Responses via interview or online questionnaire
- Stratified random sample of 90 Ohio public libraries; two strata: urban and rural counties (a sampling sketch follows below)
- Response rate: 47% (42 libraries)
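The sampling design and response-rate figure above can be reproduced mechanically. The sketch below is a minimal illustration of stratified random sampling and the response-rate calculation, assuming an invented sampling frame and an even urban/rural allocation; the study's actual frame, allocation, and library names are not given in the slides.

```python
import random

# Hypothetical sampling frame of Ohio public libraries tagged by stratum.
# The real frame and the urban/rural allocation used in the study are unknown.
frame = [("Library %03d" % i, "urban" if i % 3 == 0 else "rural")
         for i in range(1, 252)]

def stratified_sample(frame, sizes, seed=2008):
    """Draw a simple random sample of the requested size within each stratum."""
    rng = random.Random(seed)
    chosen = []
    for stratum, n in sizes.items():
        members = [name for name, s in frame if s == stratum]
        chosen.extend((name, stratum) for name in rng.sample(members, n))
    return chosen

# Illustrative allocation that totals the 90 libraries the survey sampled.
sample = stratified_sample(frame, {"urban": 45, "rural": 45})

# Response rate as reported on the slide: 42 completed responses out of 90.
completed = 42
print("Response rate: %.0f%% (%d of %d)"
      % (100 * completed / len(sample), completed, len(sample)))
```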
11. Survey Findings
12. Survey Findings
13. Survey Findings: Frequency of Managerial Review of Input Measures

Percentage of responding libraries reviewing each input measure at each frequency:

Input Measure                        Annually  Quarterly  Monthly  Weekly  Rarely  Not Sure
Operating expenditures                   12        9        65       15      0        0
Print materials expenditures             12       18        62        9      0        0
Electronic materials expenditures        39       21        36        3      0        0
Print materials                          56        9        29        3      3        0
Print subscriptions                      74       12        15        0      0        0
Audio/video materials                    58        9        30        0      3        0
Databases                                63       14         6        6      6        6
Internet terminals                       51       11        14        9      6        9
FTE staff                                79        6        12        0      3        0

NOTE: Sky blue highlighting in the original slide indicates measures that 50% or more of library managerial teams review periodically. Light blue highlighting indicates higher frequencies that, combined, total 50% or more.
14. Survey Findings: Frequency of Managerial Review of Output Measures

Percentage of responding libraries reviewing each output measure at each frequency:

Output Measure                       Annually  Quarterly  Monthly  Weekly  Rarely  Not Sure
Circulation                               9        0        83        9      0        0
In-house materials use                   21       12        26        3     34        0
Interlibrary loan                        23        6        66        6      0        0
Visits                                   21       15        54        6      3        0
Reference transactions                   32       24        34        0      9        0
Program attendance                       32        9        51        6      0        0
Electronic materials use                 19        6        60        0      9        0
Internet terminal use                    15        6        66        3      3        3
Website use                              12        9        63        0      9        3

NOTE: Sky blue highlighting in the original slide indicates measures that 50% or more of library managerial teams review periodically. Light blue highlighting indicates higher frequencies that, combined, total 50% or more. (A tabulation sketch follows below.)
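As a concreteness check on the two tables above, the sketch below shows one way such review-frequency percentages could be tabulated from raw questionnaire answers. It is illustrative only: the answer list is invented, not the study's raw data, and the rounding convention is an assumption.

```python
from collections import Counter

CATEGORIES = ["Annually", "Quarterly", "Monthly", "Weekly", "Rarely", "Not Sure"]

def review_frequency_percentages(answers):
    """Turn one measure's per-library answers into a row of rounded percentages,
    matching the layout of the tables above."""
    counts = Counter(answers)
    total = len(answers)
    return {cat: round(100 * counts[cat] / total) for cat in CATEGORIES}

# Invented answers for one measure (35 hypothetical respondents).
circulation_answers = ["Monthly"] * 29 + ["Annually"] * 3 + ["Weekly"] * 3
print(review_frequency_percentages(circulation_answers))
# -> {'Annually': 9, 'Quarterly': 0, 'Monthly': 83, 'Weekly': 9, 'Rarely': 0, 'Not Sure': 0}
```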
15. Survey Findings
16. Survey Findings
17. Survey Findings
18. Survey Findings: Statistical Measures Libraries Use in Comparisons with Other Libraries (Table Format)

Measure                               % of Libraries Using Measure
Material expenditures                  100.0
Circulation                             96.8
Operating expenditures                  90.3
FTE staff                               77.4
Print material counts                   48.4
Audio/video material counts             41.9
Databases available                     41.9
Visits                                  38.7
Subscriptions                           35.5
Interlibrary loans                      35.5
Electronic materials expenditures       32.3
Librarians                              32.3
Program attendance                      29.0
Internet terminals                      25.8
Reference transactions                  25.8
Electronic materials usage              12.9
Other (borrowers, salaries, etc.)       12.9
In-house material usage                  9.7
Internet terminal usage                  9.7
Website usage                            9.7
19. Interpreting Library Measures
- "There are no right or wrong scores on an output measure; high and low values are relative. The scores must be interpreted in terms of library goals, scores on other measures, and a broad range of other factors."
- Van House, Weill, and McClure (1990)
20. Interpreting Library Measures
- ALA / PLA policy since the 1980s: leave data interpretation to the local library
- "Each library staff should decide for themselves whether the statistical findings for that library were acceptable in terms of performance expectations."
- Ellen Altman (1990), describing the Public Library Performance Measurement Study by DeProspo et al. (1973)
21. Key Problems with Library Statistics
- Lack of criteria for evaluating measures
- Collection of standard statistics assumes that all counted library resources/activities are equivalent
22. Key Problems with Library Statistics
- Standardization ignores differences in complexity, sophistication, relevance, quality (merit), value (worth), effectiveness, efficiency, and significance
23. Key Problems with Library Statistics
- Data imprecision due to:
- Inconsistent collection methods
- Mistakes
- Sampling error
- Gaming
- Statistical imputation
- Imprecision makes comparisons of individual libraries less valid (see the margin-of-error sketch below)
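Sampling error arises when a statistic such as reference transactions is estimated from a few sample weeks and scaled to a year. The sketch below is a rough, assumed illustration of a normal-approximation margin of error for such an estimate; the four weekly tallies are invented, and with so few sample weeks a t-based interval would in practice be more appropriate.

```python
import math

def annual_estimate_with_moe(weekly_counts, weeks_per_year=52, z=1.96):
    """Scale a handful of weekly tallies to an annual figure and attach an
    approximate 95% margin of error (normal approximation; illustrative only)."""
    n = len(weekly_counts)
    mean = sum(weekly_counts) / n
    variance = sum((x - mean) ** 2 for x in weekly_counts) / (n - 1)
    standard_error_annual = weeks_per_year * math.sqrt(variance / n)
    return weeks_per_year * mean, z * standard_error_annual

# Invented sample: reference transactions tallied during four sample weeks.
estimate, moe = annual_estimate_with_moe([210, 185, 240, 205])
print("Annual estimate: %.0f +/- %.0f" % (estimate, moe))
```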
24. Key Problems with Library Statistics
- Lack of reliable methods for identifying peer libraries
- Comparisons are either approximate or inaccurate
- Can result in incorrect or misleading conclusions
- Data are self-reported and unaudited
25. Key Problems with Library Statistics
- The More-is-Better Myth
- Views higher numbers as favorable performance, lower as unfavorable
- "More activity does not necessarily mean better activity." - Van House, Weill, and McClure (1990)
- Striving to earn higher numbers may compromise service quality
26. Key Problems with Library Statistics
- Statistics say nothing about the performance adequacy, quality, effectiveness, or efficiency of library resources/activities
- No consensus on the constructs that statistics can realistically reflect
- Difficult to determine remedies for problems that statistics might reveal
27. Key Problems with Library Statistics
- Variety of reasons for insufficient scores:
- Inadequate knowledge of community needs
- Staff skill deficiencies
- Inadequate staffing
- Inefficient workflows
- Inadequate planning
- Limited user competencies
- . . . and others
Adapted from Poll and te Boekhorst (2007)
28.
- "Output measures reflect the interaction of users and library resources, constrained by the environment in which they operate. The meaning of a specific score on any measure depends on a broad range of factors including the library's goals, the current circumstances of the library and its environment, the users, the manner in which the measure was constructed, and how the data were collected." [emphasis added]
- Van House, Weill, and McClure (1990)
29. Policy-Level Problems with Library Statistics
- PLA managing-for-results approach has produced undesirable results
- Confusion about the meanings of statistical indicators
- Expectations that local libraries are able to interpret data productively have been too optimistic
30. Policy-Level Problems with Library Statistics
- Exaggerated or inaccurate advocacy campaigns undermine the credibility of the assessment process, methods, and data
- Biased advocacy studies are at cross-purposes with the need for accountability
31. Negative Side-Effects of Library Advocacy Practices
- Advocacy narratives can dumb down data analysis and assessment efforts
- Encourage absurd interpretations of library statistics
- Misuse key assessment terms and concepts
- Promote unjustifiable conclusions drawn from studies that have employed limited research methods
32. Improvements Needed
- Commit to ensuring the credibility of assessment data and performance measurement findings
- Discourage naïve, disingenuous, and unsupportable use of statistics or assessment findings
- Specify the profession's ideology regarding rules of evidence
33. Improvements Needed
- Fuller understanding of the limitations of statistical indicators and comparison methods
- Discourage describing performance based solely on standard statistics
- "The measures are best used with other information about the library." - Van House, Weill, and McClure (1990)
34. Improvements Needed
- Relate levels of resources/services to levels of community needs
- Explore relationships among the entire set of standard statistical indicators, i.e., complementary and conflicting dimensions (a correlation sketch follows below)
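One straightforward way to explore those relationships is a correlation matrix over the standard indicators. The sketch below is a minimal example using pandas with invented per-library figures; a real analysis would draw on a state or national public library dataset with many more libraries.

```python
import pandas as pd

# Invented annual figures for five hypothetical libraries.
data = pd.DataFrame({
    "circulation":      [820_000, 150_000, 2_400_000, 95_000, 510_000],
    "visits":           [310_000, 70_000, 900_000, 40_000, 260_000],
    "operating_expend": [2_900_000, 480_000, 9_700_000, 310_000, 1_800_000],
    "reference_trans":  [61_000, 9_000, 210_000, 7_500, 38_000],
})

# Pairwise correlations: strongly related indicators may be largely redundant
# (complementary dimensions), while weak or negative relationships can flag
# conflicting dimensions that deserve separate attention.
print(data.corr().round(2))
```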
35. Improvements Needed
- Identify peer libraries using multiple indicators: community population, library budget, key demographic characteristics (see the matching sketch below)
- Explore the feasibility of alternative sets of indicators depending on library type, size, mission, etc.
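A minimal sketch of distance-based peer matching on multiple indicators follows. The library names, indicator values, and the choice of Euclidean distance over z-scored indicators are all assumptions made for illustration, not a method proposed in the presentation.

```python
import math

# Invented profiles: (population served, operating budget, % adults with a degree).
libraries = {
    "Library A": (48_000, 2_100_000, 31.0),
    "Library B": (52_000, 2_400_000, 29.5),
    "Library C": (260_000, 14_000_000, 38.2),
    "Library D": (9_500, 350_000, 22.4),
}

def z_scores(values):
    """Standardize one indicator so that all indicators weigh comparably."""
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / sd for v in values]

def rank_peers(target, libraries):
    """Rank candidate peers by Euclidean distance over the standardized indicators."""
    names = list(libraries)
    columns = zip(*(libraries[n] for n in names))             # one column per indicator
    rows = list(zip(*(z_scores(list(c)) for c in columns)))   # one z-score row per library
    profile = dict(zip(names, rows))
    return sorted((math.dist(profile[target], profile[n]), n)
                  for n in names if n != target)

print(rank_peers("Library A", libraries))  # Library B should rank as the closest peer
```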
36. Improvements Needed
- Increased understanding of measurement and interpretation
- Draw reasonable conclusions and interpretations
- Basic behavioral science measurement practices
37. Behavioral Science Measurement Model
Conceptualization → Nominal Definition → Operational Definition → Measurement in the Real World
Babbie (2007)
38. References
Ellen Altman, "Reflections on Performance Measures: Fifteen Years Later," in Library Performance, Accountability, and Responsiveness: Essays in Honor of Ernest R. DeProspo, C.C. Curran and F.W. Summers, eds. (Norwood, NJ: Ablex, 1990).
Earl Babbie, The Practice of Social Research, 11th ed. (Belmont, California: Thomson, 2007).
Roswitha Poll and Peter te Boekhorst, Measuring Quality: Performance Measurement in Libraries, 2nd ed. (Munich: K.G. Saur, 2007).
Nancy A. Van House et al., Output Measures for Public Libraries: A Manual of Standardized Procedures, 2nd ed. (Chicago: American Library Association, 1987).
Nancy A. Van House, Beth T. Weill, and Charles R. McClure, Library Performance: A Practical Approach (Chicago: American Library Association, 1990).