Library Liaison Toolkit

Assessing User Needs

How do we find out what our users need?

Keep in touch with local interests:

  • Review curriculum, syllabi, and assignments
  • Review research interests and publications of faculty
  • Review proposed thesis and dissertation topics

Keep in touch with developments in the discipline:

  • Monitor the major professional journals and discussion lists
  • Review professional conference agendas and proceedings

Evaluating the Collection

There are numerous ways to evaluate a collection. To be effective, an evaluation needs clarity about

  • the purpose of the evaluation: what will we do with the results?
  • the goals of the evaluation: what do we hope to find out?
  • the scope of the evaluation: will it be comprehensive or a sample?
  • the limitations of available data and analysis tools

Many of the tools we use will output raw data that requires further manipulation with spreadsheet, database, or visualization software.

Types of analysis:

  • Usage
    We capture a variety of usage data, but it can be tricky to compile it all to get a comprehensive picture of how our collections are being used. Some use is not captured (e.g. browsing the stacks, or articles received from colleagues), but downloads, views, and circulation statistics give some indication of interest.
    • Journal usage data
    • Database usage data
    • Linker click-through data
    • e-book usage data
    • See the guide to e-resource usage data for more details.
    • MERLIN circulation data.
      Circulation data is recorded in the MERLIN item records. Use Sierra to check circulation for individual items, or to create a list (review file) of records meeting certain criteria. Note: patron data is attached to an item only while it is checked out, so use by patron type (faculty, undergraduate, etc.) can be seen only for material currently in circulation.
    • Cited sources
      Capturing the cited references from the works of MU-affiliated authors gives a snapshot of resources used. Web of Science and Scopus allow searching by affiliation and make it easy to download all the citations from the results, but neither is comprehensive.
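Compiling a comprehensive picture from these separate exports is mostly a matter of summing per-title counts from each source and then merging the sources. A minimal sketch in Python, assuming CSV exports with Title and Downloads columns (real headers vary by vendor):

```python
import csv
from collections import defaultdict

def load_counts(lines, title_col, count_col):
    """Sum a usage count per title from one CSV export.

    `lines` is an open file or any iterable of CSV rows; pass
    whatever column headers the actual vendor export uses.
    """
    totals = defaultdict(int)
    for row in csv.DictReader(lines):
        totals[row[title_col].strip()] += int(row[count_col] or 0)
    return dict(totals)

def combine(*sources):
    """Merge per-title totals from several usage sources into one."""
    combined = defaultdict(int)
    for totals in sources:
        for title, n in totals.items():
            combined[title] += n
    return dict(combined)
```

The combined totals still omit uncaptured use (stack browsing, shared articles), so they indicate relative interest rather than total use.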
       
  • Unfilled Needs
  • Expenditures/investment
    Expenditure analysis is often done to correlate investment with a subject area and may be requested as part of a program or accreditation review. Correlating expenditure with a subject area can be difficult, especially for interdisciplinary package deals. The fund codes used on MERLIN order records indicate academic departments and some interdisciplinary areas and can be used to get a general impression of how our funds are distributed.
    • Payment History lists individual payments by title, date, and fund code.
    • Fund Reports show the overall balances, expenditures, encumbrances, and allocations.
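The payment-history figures above can be rolled up by fund code to approximate the distribution of funds. A minimal sketch in Python, assuming a CSV export with Fund and Amount columns (hypothetical headers; match them to the real export):

```python
import csv
from collections import defaultdict

def spend_by_fund(lines, fund_col="Fund", amount_col="Amount"):
    """Total payments per fund code from a payment-history export.

    `lines` is an open file or any iterable of CSV rows; the
    Fund/Amount headers are assumptions, not the guaranteed names.
    """
    totals = defaultdict(float)
    for row in csv.DictReader(lines):
        totals[row[fund_col].strip()] += float(row[amount_col])
    return dict(totals)

def fund_distribution(totals):
    """Express each fund's spend as a percentage of the whole."""
    grand = sum(totals.values())
    return {fund: round(100 * amt / grand, 1) for fund, amt in totals.items()}
```

Because fund codes map only roughly to departments, the percentages are a general impression rather than a precise subject breakdown.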
       
  • Overlap analysis is usually done in preparation for a weeding project or when deciding whether to purchase or discontinue a database or journal package.
    • Overlap among online resources.
      Use this form to request an overlap analysis of duplication among online journal packages or indexes.
    • Overlap with other institutions.
      • Overlap in print holdings among the MERLIN libraries can be seen by creating a list/review file in Sierra. Local item records are attached to common bibliographic records.
      • Overlap with other collections can be discovered using WorldCat (FirstSearch version) and searching by Library Code (near the bottom of the screen). Note: one institution may have several library codes, and many libraries have collections not reflected in WorldCat.
      • YBP consortial reports show the overlap in purchasing within GWLA and MOBIUS, but capture only materials purchased from YBP.
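At its core, each of these overlap checks reduces to set comparison on a shared identifier (ISSN, OCLC number, etc.). A minimal sketch of the basic figures a purchase or cancellation decision needs:

```python
def overlap_report(package_a, package_b):
    """Compare two collections of identifiers (e.g. ISSNs).

    Returns the titles unique to each package, the shared titles,
    and the share of package A duplicated in package B.
    """
    a, b = set(package_a), set(package_b)
    shared = a & b
    return {
        "only_a": sorted(a - b),
        "only_b": sorted(b - a),
        "shared": sorted(shared),
        "pct_of_a_duplicated": round(100 * len(shared) / len(a), 1) if a else 0.0,
    }
```

The hard part in practice is not the comparison but normalizing the identifiers (print vs. online ISSNs, records lacking identifiers) before the sets are built.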
         
  • Peer comparison 
    Peer comparison is often done in the context of a program or accreditation review. It is usually less granular than an overlap analysis, looking at the overall access to resources rather than at specific titles. Note that each academic department may identify its own peer group and may distinguish between actual and aspirational peers.
    • YBP provides a peer ranking report in GOBI, but it is based only on purchases made from YBP, so its coverage is limited.
    • WorldCat comparison. Comparison with other institutions can be done by searching WorldCat using the library codes of peer libraries. Note: one institution may have several library codes, and many libraries have collections not reflected in WorldCat. See WorldCat Tips.
       
  • Benchmarking may measure a library collection against some standard or goal. It can be useful for measuring incremental progress towards an ideal.
    • Benchmarking against core title lists. This simply involves identifying a list or bibliography as a standard and checking holdings against the list.
    • Benchmarking for collection goals. This involves identifying a goal, indicators of achievement, and periodically measuring the indicators. For example: a goal of building a useful collection might be measured by the percentage of recent acquisitions used. An increase in usage rates over time might indicate improvement in achieving the goal.
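Both benchmarking approaches above come down to simple set arithmetic: checking holdings against a core list, and periodically measuring an indicator such as the share of recent acquisitions that have been used. A sketch, with identifiers standing in for whatever matching key the title list provides:

```python
def core_list_coverage(core_titles, holdings):
    """Check holdings against a core title list.

    Returns the percentage of the core list held and the missing titles.
    """
    core, held = set(core_titles), set(holdings)
    missing = sorted(core - held)
    pct = round(100 * (len(core) - len(missing)) / len(core), 1) if core else 0.0
    return pct, missing

def recent_use_rate(acquisitions, used):
    """Indicator for a 'useful collection' goal: the share of recent
    acquisitions that have circulated or been downloaded at least once."""
    acq = set(acquisitions)
    return round(100 * len(acq & set(used)) / len(acq), 1) if acq else 0.0
```

Running the same measurement at intervals gives the incremental progress figures benchmarking is meant to capture.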
       
  • Age & Scope of the Collection
    This is a thorough review of a collection, sometimes done for accreditation or program review, or to articulate overall strengths or balance in the collection. Age may be measured by publication year or by date of acquisition. Scope may involve subject areas, languages, call number ranges, authors, or publishers.
    • Sierra Create Lists allows the generation of lists based on multiple and varied criteria. Elements of the records can be output as a delimited file for further manipulation in spreadsheets or other database software.
    • The Sierra Statistical Categories Table (SCAT) aggregates data from a list or review file according to predetermined call number ranges (categories).
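The delimited output from Create Lists can also be profiled directly without a spreadsheet. A minimal sketch that counts titles per publication decade, assuming a tab-delimited export with a PubYear column (a hypothetical header; adjust the delimiter and column name to the real output):

```python
import csv
from collections import Counter

def age_profile(lines, year_col="PubYear"):
    """Count titles per publication decade from a Create Lists export.

    `lines` is an open file or any iterable of tab-delimited rows;
    rows without a usable four-digit year are skipped.
    """
    decades = Counter()
    for row in csv.DictReader(lines, delimiter="\t"):
        year = (row.get(year_col) or "").strip()
        if year[:4].isdigit():
            decades[f"{year[:3]}0s"] += 1
    return dict(decades)
```

The same loop can group by call number class or language instead of year to profile the scope dimensions mentioned above.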