Introduction & Background – UK Libraries SCRC General Overview
UK Libraries’ Special Collections Research Center (SCRC) is the largest repository of archival and manuscript collections, personal and family papers, oral histories, primary sources, and rare printed materials in Kentucky, located at the heart of the Commonwealth’s flagship public university.
The SCRC preserves and sustains collective memory, serving as an indispensable connection between past, present, and future, by collecting and documenting the social, cultural, economic, and political history of the Commonwealth of Kentucky. Our materials are used across the globe by scholars who contribute to original research and creative undertakings in all aspects of cultural life and work.
We are home to UK Libraries’ collection of rare books, Kentuckiana, the King Library Press, and the recently established John G. Heyburn Center for Judicial Excellence. The Louie B. Nunn Center for Oral History leads the nation in innovative approaches to capturing and sharing firsthand experiences, and the internationally recognized Wendell H. Ford Public Policy Research Center and the Bert T. Combs Appalachian Collection hold prominence with scholarly researchers across the United States. The SCRC also maintains the University Archives, serving as the university’s archival repository for permanent records. [1]
Additionally, we advance and support the research, teaching, and scholarship of the University through a strong instruction program across an interdisciplinary curriculum, the Learning Lab undergraduate internship program, as well as advanced practicum and graduate assistantships. The SCRC is also home to two online digital libraries of digitized and born-digital content.
The Breckinridge Research Room (BRR) provides researchers in-person access to materials held at the SCRC, and our team of staff members coordinates as content experts to serve visiting researchers, answer remote research inquiries, provide reproductions, and advise on publication permissions and copyright where applicable. We also provide access to collections and research services through a wide variety of online resources.
Prior Services Structure – Research Room, Remote Research Inquiries, and Reference
Over the past five to seven years, the SCRC has undergone a substantial transition from a partially siloed organizational structure to a more integrated and team-based environment. This transition brought changes in staffing, service philosophy, and technology implementation to our reference services, which significantly altered the ways we track and monitor material use and coordinate access with the university and public. This article explores the evolution of that transition in SCRC’s research services and details how we leveraged technology to establish order, improve efficiency, align with university strategic goals, and simplify workflows and communication within an increasingly complex service model.
For about a decade, beginning in 2006, the research room was overseen with a philosophy of openness, deferential service, and a highly professional but relaxed approach. The Special Collections Librarian served as a director on the division’s leadership team and supervised two support staff who rotated time at the research room desk, among other duties. Student assistants, called “pages,” were employed to retrieve and reshelve materials and were often left alone in the research room as monitors. Because neither the support staff nor the students had prior special collections experience, the research room came to function more like a public library than a secure, policy-governed space. The paper-based use documentation system that existed prior to 2006 remained intact. Visiting researchers generally signed in and out of a register book, and five different types of paper slips were filled in by staff and placed on materials, or on empty shelves, to indicate who was using what and where items belonged. In some years visitors completed registration forms; in other years the practice seemed to fall by the wayside. Between 2006 and 2013, this system was sustained in some degree of isolation.
Meanwhile, reference and the answering of inquiries received from researchers remotely (by phone, mail, or email) evolved to be handled by a wider group of content and format experts across special collections. Initial contacts and answers to basic questions were provided by the two support staff, but more complex questions, or those that related to holdings in specific subject and content areas, were increasingly answered by faculty archivists and professional staff who worked most closely with those collections. For a number of years, an email listserv that included as many as 8-10 members served as a means of sharing information and coordinating answers to inquiries. For a time, the accumulated emails served as a de facto knowledge base that the support staff could consult when repeat questions arrived.
Transition in Management, Staffing, Service Model, Systems, and Philosophy (2013-present)
2013 to 2016 brought many changes to the research room and reference service model. In the spring of 2013, the Special Collections Librarian took a six-month sabbatical. During this time, other faculty and professional staff, previously only on the reference listserv, began taking turns at the research room desk, where it became clear that the existing paper system captured little beyond a daily count of visitors and tallies of directional assistance and reference transactions; it provided no detailed data on the use of materials and no user demographics. Tick sheets, paper transaction forms, and the overall counting of reference transactions were kept by each staff member using their own preferred method of recording. Although a concerted effort was made to implement changes using Excel spreadsheets, databases, and other technological tools, these systems did not persist after the end of the sabbatical term. Still, they provided a glimpse of future possibilities and spurred interest across Special Collections in reorganization and new technological solutions.
Concomitantly, in the fall of 2013, a robust and deliberate undergraduate-focused outreach and education program launched with the hiring of a full-time Education and Outreach Archivist and a full-time Learning Lab Manager. These new programs, building on previous success and growth in these areas and on overall campus initiatives in student success, targeted outreach to new faculty, instructors, teaching assistants, and undergraduate researchers, and sought to build relationships with majors and disciplines that had not traditionally visited the SCRC. Within less than two years of this shift in program focus, the number of class sessions focused on archives instruction had more than doubled.
By fall 2014 and spring 2015, eight to nine different people were staffing the research room desk in rotation, and as many as 25-30 classes had materials on hold for students returning to complete assignments at some point during the semester, many of whom needed reference assistance. It was clear that the multiple paper forms could not keep up with the changes in staffing or the increase in users, becoming ever more unwieldy, impractical, and too easily scattered or altogether forgotten.

In March 2015, the Special Collections Librarian, who had been at the helm of the Breckinridge Research Room since 2006, retired. In April 2015, the position was reconfigured into a new Director of Research Services and Education, thus bringing the education program into close alliance and alignment with research services. Division-wide hierarchical changes and flattening led to collections management and processing also joining with research services and education. As the overall dedication of SCRC team members to service and access further coalesced, especially around furthering the university’s strategic goals, the moniker of “BRR Team” helped to define the change and establish that an entire group of professionals across all SCRC work areas would provide reference and uphold the policies, procedures, professional standards, and best practices in the research room and through online communications. Team discussions around new technological means for organizing, triaging, and tracking reference services commenced.
Since the SCRC had seen an increase in the number of visiting researchers, many of whom were undergraduates using special collections for the first time, the BRR team members needed to become even more proactive in their interactions with researchers, to a degree well beyond previous years. These new users also expected to be able to find and request materials easily from their laptops or phones. To address these evolving expectations, the first major technological decision made was to implement Aeon for materials management. The implementation launched in July 2015 and continued through early 2016, with the system going live on March 31, 2016. At the same time, the overall UK Libraries technical services division decided to move away from the existing Voyager ILS to ALMA and PRIMO, while a brand new website based on Bootstrap was in development; all of these services were set to launch between January and March of 2016. The SCRC had already migrated from Archivist’s Toolkit to ArchivesSpace in September 2015, while the Interlibrary Loan Department in the main library was already using ILLiad. UK Libraries maintains its own internal IT department, separate from main campus IT, which provides a level of independence and assisted with all of these transitions.
While Aeon provided a means for tracking materials, items for classes, reproductions, and researcher visits and use data, it did not address any of the problems specifically related to reference and the email listserv. By 2016, over 17 team members were part of the listserv, and it had become virtually impossible to keep up with all the legitimate questions on top of an increasing amount of junk mail and spam. Team members were inundated with email correspondence from patrons, replies from other team members, and internal notes saying “I’ve got this question!” or “Thanks!” A six-month study of communications on the listserv revealed that approximately 60% of messages were spam, and an additional 20 to 30% were internal communications between staff, most of which were irrelevant to others on the listserv.
This was all further complicated by the fact that there was not just the one primary reference listserv to monitor, but two additional listservs for questions coming directly from the digital libraries. Furthermore, the overlap between those serving at the desk and those answering remote reference had grown, necessitating greater coordination of efforts and communication beyond what the email listservs could ever provide. The setup was confusing, intimidating, and left little room for effective and meaningful communication, statistics-gathering, monitoring, or accountability. The SCRC now needed to replace this system with one that allowed us to communicate with one another, as well as with patrons, more effectively and efficiently, track progress, report results, and serve as a companion to Aeon. In the spring of 2016, the SCRC reconfigured staffing to allow for a professional manager position to oversee all operations related to public services, reference, and access. The first assignment for the new manager was to work collaboratively to move away from the listservs to a web form, with the goal of having a new system in operation approximately six months after the Aeon implementation.
Developing a Solution
Because the sheer volume of information coming through the listserv channels was such a glaring issue, planning for an alternative began around one central question: how do we create a structure that allows team members to see only as much information as they want or need to see? The information moving across the listserv was useful but unmanageable. Team members had to monitor the list at all times, even when only a few questions actually required their input. While the initial thought of implementing an online form system to ease the communication burden appeared promising, that alone would only address patron-to-staff contact, not the interdepartmental correspondence that accounted for a great deal of the total information traffic. To be of real use to both patrons and staff, any online form would have to establish some control on the staff side as well as the patron side of the communication equation.
One way in which an online form could work toward this end would be for it to function as a triage point, providing initial question categorization and sorting. Furthermore, as long as the form included some level of conditional logic functionality, we could cut down on some of the incoming traffic by immediately redirecting patrons whose questions did not require staff assistance, such as locating the Aeon registration page or the staff directory, based on their form answers. Although this would ease some of the volume, a secondary triage point would still be needed for all other questions requiring staff assistance. The issue there, though, was that if this point were managed by a staff member who routed inquiries to the appropriate team member, it would severely undermine the one inarguable positive effect of the listserv structure: open team collaboration. While the level of communication that took place on the listserv was excessive, it had, by its nature, the effect of encouraging staff collaboration. Because team members were required to keep a vigilant eye on every communication thread, a staff member would often offer up information about a question outside that staff member’s usual content or format scope. If, as in this two-point triage scenario, questions were routed directly from a point person to a single member of a team, those collaborative moments would be bypassed entirely. Any triage solution would have to balance information management and communication control while retaining the serendipitous knowledge-sharing that strengthened the quality of SCRC’s public services.
Even putting aside those concerns, this triage set-up would leave our need for efficient statistics-gathering unaddressed. In our prior system, all team members self-reported their reference numbers, which meant attempting to pull from the communication threads who completed which questions and when. If three people chimed in on a single question, did all three report, or did just the person who closed out the transaction? Furthermore, simply reporting transaction numbers meant we missed out on a lot of potential for data collection. While we could produce some idea of the number of reference questions we completed, where did those questions originate? Did we primarily serve students, faculty and staff, or community members? What was our typical turnaround time in completing a transaction? Without some hefty text mining and a lot of educated guesswork, there was no way to glean this information. In order to better tailor our services to our entire audience and user population, as well as accurately report the full extent of our work across subject and content areas, we needed a system that allowed us to parse usable reference data from our transactions.
Having utilized online task management applications for software implementation, grant project management, and patron-driven digitization services in the past, it occurred to us that such a system could be used to solve our particular reference woes. Task management applications allow one to break projects down into individual tasks that can be passed among staff members and monitored for status as work progresses. What were reference transactions if not discrete tasks often requiring input from various team members over time? Moving away from a listserv and into a task management platform could potentially address our core needs in communication, accountability, and reporting in a single space. Asana, an online task management application, seemed to cover all of these needs and also allowed team members to add tasks to the system via email. Using this functionality, we could conceivably link an online form to our task management system by way of emailed form responses, thus providing an automated bridge between our external communication and internal processes.
Linking External Communication to Internal Workflows
With that in mind, we built an online contact form in Jotform, a customizable online form application, which sends form submissions to a generic SCRC email account in Microsoft Outlook. Using a custom Outlook mail rule, those responses are forwarded directly to Asana, where they are converted into individual tasks. A custom subject line set up in Jotform becomes the task’s title, which includes an automatically generated reference number for tracking responses as well as general information about the research question pulled from the form’s submission fields (e.g. “REF_162 – Manuscripts (letters, diaries, etc.) research help”). This not only provides a unique identifier for each reference transaction but also allows staff to quickly scan the task titles in Asana to determine whether any questions fall into their respective content or format areas.
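Because the title convention is consistent, it is also machine-readable. As a rough illustration (this helper is hypothetical and not part of our production setup, which relies only on the Jotform subject template and the Outlook mail rule), a few lines of Python could pull the reference number and question category back out of a task title:

```python
import re

# Titles follow the convention "REF_<number> - <category>", e.g.
# "REF_162 - Manuscripts (letters, diaries, etc.) research help".
# The character class accepts either a hyphen or an en dash.
TITLE_PATTERN = re.compile(r"^REF_(?P<number>\d+)\s*[-–]\s*(?P<category>.+)$")

def parse_task_title(title):
    """Return (reference_number, category), or None if the title
    does not follow the REF_### convention."""
    match = TITLE_PATTERN.match(title.strip())
    if match is None:
        return None
    return int(match.group("number")), match.group("category").strip()

print(parse_task_title("REF_162 – Manuscripts (letters, diaries, etc.) research help"))
# -> (162, 'Manuscripts (letters, diaries, etc.) research help')
```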

The form itself is structured to gather useful statistical data, redirect researchers to immediately available information when possible, and provide general question metadata for use in quickly assigning tasks to appropriate staff members once those questions make their way to Asana. Jotform provides robust conditional logic functionality, which lets us redirect users to specific pages or static information depending on the answers they give while filling out the form. This functionality serves as our first, patron-initiated triage. For instance, if a researcher needs to request materials for retrieval, they choose that option and are immediately redirected to our Aeon registration page without the need for staff intervention.
The conditional logic also allows us to accommodate external patron requests as well as internal staff referrals from this single form. The first form question asks users to identify their affiliation with the University of Kentucky and, depending on the answer, directs them to either the outside researcher or the internal library staff section. If SCRC staff or UK library staff from other branches receive reference requests, those requests are submitted via the form and include patron contact information, question content, and point of contact. Staff then select whether they have already answered the question, plan to answer it, or need to refer it to someone else. Throughout the process, both staff and external researchers provide information on format types, question topics, and any applicable deadlines. The decision tree this creates is sketched below.
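Jotform’s conditions are configured in its web interface rather than written as code, but the first-pass triage amounts to something like the following sketch. The option strings here are paraphrased, not our exact field values:

```python
def route_submission(affiliation, request_type):
    """Sketch of the form's patron-initiated triage (option names are
    paraphrased, not Jotform's exact field values)."""
    # Staff referrals branch into the internal section of the form, where
    # the submitter records patron contact information, question content,
    # and whether they answered, plan to answer, or will refer the question.
    if affiliation in ("SCRC staff", "UK Libraries staff"):
        return "show internal staff referral section"
    # Retrieval requests are redirected straight to the Aeon registration
    # page, with no staff intervention required.
    if request_type == "request materials for retrieval":
        return "redirect to Aeon registration page"
    # Everything else is submitted, delivered to the SCRC Outlook account,
    # and forwarded on to Asana as a new task.
    return "submit form and forward to Asana"

print(route_submission("outside researcher", "manuscripts research help"))
# -> submit form and forward to Asana
```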
A Task Management Approach to Research and Reference
Once in Asana, staff assign tasks to themselves or to a colleague, or a supervisor assigns them. Once assigned, the assignee receives an email notification with a link to the corresponding question. Asana uses a heart icon, similar to social media sites like Twitter, to indicate a “like” for a task. For our purposes, we use the icon to signal that a question is actively being worked on. In our previous reference structure, staff and supervisors spent a lot of time double-checking the status of a question to confirm not only that it had been seen by a particular staff member, but that work was actually in progress. The heart icon serves as an easy visual cue to others that a reference transaction is moving forward. Especially helpful for supervisors, it allows one to assign tasks and, when needed, quickly scan for the heart icon later to ensure that the assignments were received and are progressing, without having to constantly check in on the transaction status.

Each task contains a comments thread, which consolidates what was once a string of disparate listserv replies. Team members are brought into a conversation by typing “@” followed by the team member’s name, after which they receive a notification that their input is needed. In keeping with the original intention to give staff only as much information as they need or want to see, this tagging brings staff into a conversation only when needed and gives them the freedom to leave the conversation once they have contributed their part. However, a team member does not have to be assigned to a task in order to see it. For team members who want to review all incoming reference transactions and determine where they can contribute, each question is available in an open list. Team members can also add themselves as a “follower” of a task thread in order to get regular updates on the progress of a transaction. As with the assignment notifications, followers are notified when new comments are added to the thread as well as when a task is marked complete. In addition to tagging colleagues or adding followers to a thread, Asana also allows subject tagging, although we do not currently use this functionality.
When a reference question is finished, staff click the task’s green check mark to complete the task. Because conversations with patrons take place outside of the Asana thread, staff are asked to add any pertinent correspondence or notes to the comment thread before closing out a transaction. The purpose of this is twofold: first, if the transaction is ever reopened, staff will have access to the transaction history; and second, it provides a searchable knowledge base of past research transactions within Asana. Asana projects are keyword-searchable, so if a patron refers to a past interaction, or if the same reference question is asked later by a different patron, staff can easily search through the backlog of reference transactions as needed.
Implementation and Results
We developed the initial idea for the task management structure through trial, test, review, and consultation. We first built the form using Google Forms, which allowed for some conditional logic and pushed responses directly to Asana without having to route through a generic email account. However, it did not allow the level of flexibility we wanted in setting conditions and, perhaps more importantly, Google Forms did not come with CAPTCHA spam prevention. Given that one of our goals was to cut down on the amount of spam vetting, it was imperative to find a solution with a security measure baked in. Although we eventually rebuilt the form in Jotform, the first Google Forms-to-Asana set-up provided us a proof of concept. We presented this test version to our team as well as to the UK Libraries’ Director of Web Development, whose support was especially important considering the form would eventually live on the SCRC website.
A soft rollout began in November 2016, with January 2017 considered the official starting point for statistics purposes. By January, the system was more or less in its current iteration and mostly free of extraneous testing data. We moved our team off of the listserv completely, and all researchers are now redirected to the contact form page. Overall, the response from staff has been positive, and the reference process, from beginning to end, has improved in all areas we targeted: communication, statistics-gathering, monitoring, and accountability. Team members who were overwhelmed by the influx of information on the prior listserv have appreciated being notified only when necessary. Those who liked to look over every question as it came in have similarly appreciated that Asana retains the ability to openly view all transactions. It is far easier to pick up on a question already in progress, review the conversation thread, and respond appropriately.
Statistics-gathering has been consolidated so that all reference data is compiled from an Asana export and reported by the Research Services Archivist in a detailed Qualtrics survey, which the Associate Dean utilizes to gather ARL statistics. These statistics allow us to monitor and evaluate day-to-day tasks, behaviors, and communications, supporting a data-driven decision-making model built on thorough and meaningful internal assessment of services. Although our sample data is still relatively small, we are already getting a clearer picture of our average monthly reference workload, question origin, and turnaround time.
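Two of those measures, monthly workload and turnaround time, can be computed in a few lines from the CSV export. This is a minimal sketch, assuming the export contains “Created At” and “Completed At” date columns (column names may differ across Asana export versions, so adjust to match your file):

```python
import pandas as pd

# Load the Asana CSV export, parsing the creation and completion dates.
tasks = pd.read_csv("asana_export.csv", parse_dates=["Created At", "Completed At"])

# Monthly reference workload: tasks created per calendar month.
workload = tasks.groupby(tasks["Created At"].dt.to_period("M")).size()

# Turnaround time in days, for completed transactions only.
closed = tasks.dropna(subset=["Completed At"])
turnaround = (closed["Completed At"] - closed["Created At"]).dt.days

print(workload)
print(f"Average turnaround: {turnaround.mean():.1f} days")
```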
The success of any implementation largely depends on buy-in from colleagues and administration. Luckily, the SCRC was ready to try something new and eager to test an alternative approach. The organizational culture is not only open and collaborative but supportive of experimentation, all of which was necessary for the success of this project. Early on, it occurred to us that Asana’s “assign a task or pass it on” model might not work as well in a more hierarchically rigid organization. Whether due to our team approach to reference, which involves staff at all levels of the organization, the realities of having to navigate the prior listserv structure, or even the concerted efforts to flatten the organizational hierarchy, team members were already comfortable passing off questions to their colleagues or supervisors. Team members understand that task assignment functions more as a “heads-up” than a direct call to complete a task. In initial training, we stressed that being assigned a task did not obligate you to complete it and that at any time you could assign it to another colleague. This served as an important step toward keeping the workflow fluid and the communication collegial, as well as a fine example of how technological form can follow, as well as bolster and improve, organizational function.
Because a gap always exists between what first goes on the whiteboard and what happens in production, it remains important to take an “always in beta” approach to project implementation. Team members were made aware from the beginning that what we were attempting was experimental, that there would be bumps in the road, and that implementers needed honest feedback to find and correct kinks in the system. The flexibility of our set-up allowed us to respond immediately to any issues that arose once we began using the form in regular practice. With specific regard to Jotform, its embedded nature allowed us to make and roll out changes instantaneously in response to questions and suggestions, without having to put ourselves in an IT queue. During the soft rollout period, we kept the listservs available in case what we set up was not working and we needed to backtrack and explore other options. Although we never had to, knowing that we could easily fall back to our prior system gave us the security to run with the experiment.
Lingering Questions and Future Options
Despite the successful rollout, there are of course still ways to improve the system and a few lingering questions. For instance, inquiries from the digital libraries currently bypass the contact page and are directed into the Asana project alongside our regular contact form requests. While the consolidated data stream is an improvement, we do not get researcher demographic data from these two sources at the same level of granularity as we get from our form data. In February and March of 2017, the digital libraries accounted for around 25% of our reference inquiries, so any patron demographic findings would have to be reconciled with that noticeable absence. While we are already gaining greater statistical insight than we had prior to implementation, this remains an area we need to refine further, and we need to explore how we might use our data to more directly drive decision-making. We have closely followed the development of the SAA-ACRL/RBMS Joint Task Force on the Development of Standardized Statistical Measures for Public Services in Archival Repositories and Special Collections Libraries and plan to align future data gathering methods with these standards. There are also plans to better utilize Jotform’s analytics functionality, which is currently in beta, as well as user experience testing to gain insight into who exactly uses our online services, how they are using those services, and how we can best meet their needs.
Our most pressing concerns going forward entail long-term data management and team scalability. The free version of Asana, unfortunately, does not allow filtered data export. Every month, all cumulative data is exported as a CSV and any prior data is deleted manually. While this has not yet been an issue, if we continue at our current transaction average, this could eventually produce an unwieldy export. Simply deleting completed transactions could be an option, but it would mean wiping out the searchable knowledge base, a major benefit of the system. Ideally, we would like to preserve that knowledge base but strip it of patron personal data and limit the exports to our current reporting month, as sketched below. As for scalability, Asana’s free version caps team size at 15, and we are already hitting that limit. A similar concern exists with the free version of Jotform, which caps form submissions at 100 per month. Paid versions of Asana and Jotform would remove the team-size and submission caps and provide additional services such as task dependencies, more analytics tools, and advanced search.
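A short script could automate that trimming and scrubbing. The following is a minimal sketch under the same assumptions as the reporting example above: the column names are guesses at the export schema, and “Notes” stands in for wherever pasted patron correspondence lives in a given export.

```python
import pandas as pd

# Load the cumulative Asana export and keep only the reporting month.
tasks = pd.read_csv("asana_export.csv", parse_dates=["Created At"])
report_month = pd.Period("2017-03", freq="M")
monthly = tasks[tasks["Created At"].dt.to_period("M") == report_month].copy()

# Drop columns likely to contain patron personal data before archiving.
# (Assumed column names; adjust to the actual export schema.)
for column in ("Notes", "Assignee Email"):
    if column in monthly.columns:
        monthly = monthly.drop(columns=column)

monthly.to_csv(f"reference_report_{report_month}.csv", index=False)
```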
Luckily, because of the newfound simplicity and flexibility of our streamlined research services workflows, we are better able to anticipate and respond to needs, problems, and changes in our research services model. Although the complexity of the services we provide has increased, the labor and time once required to perform routine tasks has decreased dramatically, allowing us the space to experiment and refine our strategies and workflows toward achieving ever greater efficiency and providing better public service.
Notes:
This work is licensed under a Creative Commons Attribution 4.0 International License.