Evaluations of Programs and Services

Core Competency N: Evaluate programs and services based on measurable criteria.

Section 1. Interpretation of competency

Evaluating library services and programs helps drive decisions about what the library offers and how it is offered. It is important to know what expectations your users and stakeholders have; meeting them is the best way to help ensure the continued success of the library. However, if evaluations are not based on established criteria, you may as well be leaving things up to chance.

This does not mean that one rigid method should be used, but rather that a holistic plan should be developed, one that is as customized for the library as its other policies, such as the collection development plan. An assessment plan should include several components, with data collection happening at multiple levels on a continuous basis. By taking incremental snapshots across the institution as a whole, it is possible to build a rich data set that supports quantitative techniques alongside other types of qualitative analysis.

Libraries are expected to report data to many different groups. For public libraries, these include taxpayers and the government, as well as any highly involved stakeholders such as a friends group. For a small special library or archives, the entities that provide financial support are the largest stakeholders. For academic libraries, there are accreditation bodies, university administrators, and faculty and students to consider.

Collecting and then disseminating evaluation data can take many forms. Collection for public services might include monitoring foot traffic in the library; gathering statistics on circulation, interlibrary loan, and requests to borrow special items such as museum passes; and tracking attendance at events. These numbers might be reviewed in conjunction with survey data, anecdotal evidence, and reports from professional societies to make informed choices about what to offer and what to change.

Once assessment data has been collected and reviewed internally, it is generally reported out. This reporting can take many shapes, from highly detailed forms and instructions, to financial reports, to marketing communications. The overarching goal of these activities is twofold: first, to prove that the library is functioning at its expected level, that it provides the types of programming, resources, and services found at comparable institutions, and that its financial books are in order; second, to find ways to improve what it offers in a way that is sustainable and logical.

Section 2. Reference to supporting evidence

Evidence One. Academic Assignment.

Legal Research Instruction Plan Proposal. Appendix B. Survey instrument designed to gauge the relevance and use of course materials six months after participants attend a class on legal research.

This survey was written as part of an assignment for SJSU's Instructional Design course (INFO 250). The legal research class it evaluates was designed to teach librarians basic legal research skills and acquaint them with legal reference materials. Legal research differs from other types of research, and most academic law librarians are also law school graduates. The class was taught by law library faculty and was designed to address significant gaps in what other types of librarians know about legal research. It is run by a professional law librarians' association, and this type of outreach is important for membership recruitment.

The survey was designed to assess several things. It monitors and tracks who is taking the course on several levels: by library type, by experience level, and by role within the library. It also looks at what types of questions the librarians are asked most often and what types of materials they use. All of these questions allow the curriculum to be better aligned with users' needs, and the answers can inform other outreach activities, including membership recruitment and additional programming.

Evidence Two. Professional Activity.

NHLA Roundtable Survey. Google spreadsheet with survey results from a series of professional librarian society roundtables.

The New Hampshire Library Association (NHLA) is a professional society dedicated to advancing the cause of libraries in the state of New Hampshire. It runs a variety of professional activities, from conferences to continuing education, and is always interested in developing new programming ideas for its members. This roundtable series was a new initiative designed to bring librarians from the same geographic region within the state together to collaborate on and discuss common issues and ideas that concerned their communities.

Four roundtables were conducted. Following the events, a survey was sent to all attendees, and 63 people responded. The questions were open-ended, and responses ranged from content-oriented feedback to logistical issues. The result was a useful set of anecdotal data offering clues as to what NHLA members are thinking and would like to discuss. Surveys like these are valuable for taking the pulse of any professional society.

Evidence Three. Work Experience.

Scholars’ Repository Quarterly Impact Report

Blog post with simple statistical reporting designed to engage faculty authors, members of the administration, and students with the institutional repository.

The point of gathering statistics is to be able to report accurately, both internally to colleagues, staff, and management and externally to clients and stakeholders. This blog post is an example of hybrid reporting: although designed to engage an internal audience, the information also serves as good advertising to showcase scholarly communication efforts.

I feel this piece of evidence shows competency in this area because it illustrates my understanding of the role that established criteria play in the evaluation of a program or service. The end result can serve many purposes; in addition to formal reporting, it has other uses, from advertising to goal setting.

Section 3. Application of competency

The ability to measure and quantify is all around us and can be adopted in ways that are simple and economical. Google Analytics is a free tool that can be employed on websites, and other free tools are available. The form plugin for WordPress that we use in my library reports on conversion rate, that is, the percentage of people who sign up for a given event after visiting its web page. The paid tools I can use include survey instruments such as Qualtrics and the data visualization tool Tableau. I can also access rich bibliographic metrics from our research information management system and from commercial databases.
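As a back-of-the-envelope illustration of the conversion-rate arithmetic described above (the event names, signup counts, and page-view counts are invented for the example, not real library data):

```python
# Minimal sketch: conversion rate for library event pages.
# All figures below are hypothetical, for illustration only.

def conversion_rate(signups: int, page_views: int) -> float:
    """Percentage of page visitors who went on to sign up."""
    if page_views == 0:
        return 0.0
    return 100.0 * signups / page_views

# Hypothetical exports from a form plugin or analytics tool.
events = {
    "Genealogy Workshop": (18, 240),  # (signups, page views)
    "Author Talk": (42, 310),
}

for name, (signups, views) in events.items():
    print(f"{name}: {conversion_rate(signups, views):.1f}% conversion")
```

Even a simple ratio like this, tracked consistently across events, supports the kind of comparison over time that an assessment plan calls for.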

We are also in the process of reporting our three-year results for our institutional open access policy. According to Kipphut-Smith, Boock, Chapman, and Willi Hooper (2018), this is largely uncharted territory. They surveyed libraries with OA policies and asked them to describe their results on compliance. The data showed not only a clear lack of best practices around this type of reporting but also that the field lacked common definitions (Abstract).

As we write our report, we are finding that the data looks different depending on how it is viewed. We are also finding that, due to differences among the database sources we rely on to supply information about our faculty publications, it is impossible to say with complete certainty how many scholarly articles we produced in a given time period. This is particularly true for the partial year in which the policy was first implemented. The lessons I take from this experience are not only to accept certain imperfections in reporting but also to think about the implications of timing and policy for the evaluation of programs and services.
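To illustrate why the counts resist certainty, here is a minimal sketch, using invented records, of how publication lists from two sources might be reconciled by normalized DOI; records without a DOI are precisely the ones that keep the total from being exact:

```python
# Minimal sketch with invented records: merging faculty publication
# lists from two hypothetical database sources by normalized DOI.
# Records lacking a DOI cannot be matched exactly and need review.

def normalize_doi(doi: str | None) -> str | None:
    """Lowercase a DOI and strip a resolver prefix, if present."""
    if not doi:
        return None
    return doi.lower().removeprefix("https://doi.org/").strip()

source_a = [
    {"title": "Article One", "doi": "10.1234/abc"},
    {"title": "Article Two", "doi": None},  # no DOI recorded
]
source_b = [
    {"title": "Article One", "doi": "https://doi.org/10.1234/ABC"},
    {"title": "Article Three", "doi": "10.1234/xyz"},
]

seen, unmatched = set(), []
for record in source_a + source_b:
    doi = normalize_doi(record["doi"])
    if doi is None:
        unmatched.append(record["title"])  # manual review needed
    else:
        seen.add(doi)

print(f"Confirmed unique articles: {len(seen)}")
print(f"Records needing manual review: {unmatched}")
```

The confirmed count is a floor, not a total: every un-matched record adds uncertainty, which is exactly the imperfection in reporting described above.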

Section 4. Bibliography in APA format

Kipphut-Smith, S., Boock, M., Chapman, K., & Willi Hooper, M. (2018). Measuring open access policy compliance: Results of a survey. Journal of Librarianship and Scholarly Communication, 6(1). https://doi.org/10.7710/2162-3309.2247