The Antelope Valley College Library’s Transition to Desk Tracker

by Scott Lee, Antelope Valley College

Before 2012, the AVC Library used standard paper tic sheets to record reference transactions.  As others have demonstrated, this form of data collection provided little detail or depth about reference transactions and had questionable accuracy.  Having used it myself at two previous libraries, I had always questioned its value.

In 2011 we began to investigate transitioning to an electronic system of data collection to replace the tic sheets.  From my exploration, I was attracted to the Desk Tracker (DT) product from Compendium Library Services.  At the urging of our dean at that time, we had also looked at a system called S.A.R.S.  It was designed for counseling departments to schedule and record information about appointments, however, and we found it significantly incompatible with the needs of a reference desk.  We were aware of homemade products that other CCLs had developed, but found either that we lacked the expertise and time to create one or that what was created would not provide the level of depth we were looking for.  That is, ultimately, what drove us to select Desk Tracker.

Main Input Form

[Image: Desk Tracker main input form]

DT uses input forms on which you can put checkboxes or textboxes.  They can be optional or required, and some checkboxes, if selected, can lead to follow-up forms with additional checkboxes or textboxes.  The first step was to decide what to put on our input forms.  To begin the process, we used a textbox form on SurveyMonkey and asked reference librarians to write short narratives of their reference transactions.  I used this data, the categories that previously existed on our tic sheets, and discussions with full-time and adjunct librarians to develop the main input form (see image).

We fully committed to getting as much information out of DT as possible, so our form is somewhat extensive.  The form has a list of options to select from that identify what was done in the transaction.  Librarians can select more than one item, as multiple activities often happen in a single reference session.  We also ask what type of user was served (student, staff, etc.), how they contacted the librarian (in person, phone, email), and approximately how long it took to complete the transaction.  Length of time was something all of us felt was lacking in traditional reference data collection, and we knew it was important to have here.  We also decided to use follow-up forms for some important categories, such as Library Catalog or EDS (EBSCO Discovery Service), to gather even more data on how those tools are used in reference service.
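
Desk Tracker’s internal configuration is not something we see directly, but as a rough sketch of how a form like ours is organized, the structure can be imagined like this (all field names and options here are hypothetical stand-ins, not DT’s actual schema):

```python
# Illustrative sketch of a DT-style input form.  Field names, options,
# and follow-up form names are hypothetical, not Desk Tracker's schema.
MAIN_FORM = {
    "activities": {             # checkboxes; more than one may be selected
        "type": "checkbox",
        "required": True,
        "multiple": True,
        "options": ["Library Catalog", "EDS", "Periodical Databases",
                    "Reference Books", "Directional", "Reserves", "Other"],
        # selecting some options opens a follow-up form with more detail
        "follow_ups": {"Library Catalog": "catalog_follow_up",
                       "EDS": "eds_follow_up"},
    },
    "user_type": {              # who was served
        "type": "checkbox",
        "required": True,
        "options": ["Student", "Staff", "Faculty", "Other"],
    },
    "contact_method": {         # how the user reached the librarian
        "type": "checkbox",
        "required": True,
        "options": ["In person", "Phone", "Email"],
    },
    "duration_minutes": {       # approximate length of the transaction
        "type": "textbox",
        "required": False,
    },
    "other_notes": {            # free text for uncategorized activities
        "type": "textbox",
        "required": False,
    },
}
```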

Library Catalog Follow-up Form

[Image: library catalog follow-up form]

In the spring semester of 2012, we introduced our Desk Tracker forms.  However, we still used tic sheets to collect our official statistics and asked librarians to use DT in addition to the tic sheets as part of a breaking-in period.  They were all told that DT would be the only form of data collection by the fall semester and that the tic sheets would be done away with at that time.  But we wanted them to have time to get familiar with the DT interface and system while still being able to use the tic sheets for actual data collection.  This was going to be a significant change in process for all reference librarians, and the new process was going to take more time and effort than they were used to.  I also sought input from librarians on how best to design the DT interface, as well as suggestions about the transition.

At the start of the fall 2012 semester we officially moved to DT for all reference transaction statistics.  As the data collection form was complex, we assured librarians that we were not expecting perfection at the start and that they should not panic if they forgot to enter data into the form.  The goal was to make sure they were comfortable with the change, but also to communicate that this was the new normal and they would have to adjust.  I also sat with some librarians during their desk time to talk with them about DT and how they were adapting to it.  Looking back, I would have made sure to sit with all librarians and not just a few.  I would have received better feedback, and a few librarians who still had transitioning issues could have been helped earlier.

One significant advantage of DT is how easy it is to make changes to the input forms.  This lets me get feedback from librarians and implement their ideas, sometimes immediately.  In that first semester, I moved things around and added new follow-up forms based on feedback from librarians as well as analysis of some of the textbox data in DT.  Most of the forms have an “Other” textbox to collect information on activities for which there are no checkbox categories.  Looking through this helps me identify trends and patterns that need their own checkboxes.  Generally, if changes are recommended or identified in the first two or three weeks, I implement them immediately.  After that, I often wait until the following semester, as I need to change the spreadsheets I use to collect, organize, and analyze the DT data.  These spreadsheet changes can be extensive and take quite a bit of time.  Once I make changes for a semester, I avoid making them again until the next semester.
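
As a minimal sketch of how those “Other” entries can be scanned for recurring themes, here is one way to do it in Python; the file name and column name are assumptions, since they depend on your own export:

```python
import csv
from collections import Counter

# Tally the most common words in the "Other" textbox entries of a
# Desk Tracker CSV export, to surface activities that may deserve
# their own checkbox.  File and column names are hypothetical.
word_counts = Counter()
with open("dt_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        note = (row.get("Other") or "").strip().lower()
        for word in note.split():
            if len(word) > 3:        # skip short filler words
                word_counts[word] += 1

for word, count in word_counts.most_common(20):
    print(f"{word}: {count}")
```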

For analysis, I download DT data as a comma-delimited (CSV) file that I import into Microsoft Excel.  From there I import it into SPSS, a professional-level data analysis tool.  SPSS is not something many CCLs can afford to buy or have the expertise to use.  I was taught to use it as part of my doctoral studies and talked my college into purchasing a copy for me.  You do not need Excel or SPSS, however, to get statistical value out of DT.  Because I have the tool and the ability to use it, I want to get the maximum value out of the data that I can.
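
Free tools can take you a long way with the same export.  As a minimal sketch, Python’s pandas library (one free alternative to SPSS for this kind of work) can produce basic summaries from the CSV; the column names here are hypothetical, since they depend on your form fields:

```python
import pandas as pd

# Load a Desk Tracker CSV export.  Column names are hypothetical;
# your export will reflect your own form fields.
df = pd.read_csv("dt_export.csv", parse_dates=["Timestamp"])

# Summaries comparable to what we build in Excel/SPSS:
print(df["ContactMethod"].value_counts())           # in person / phone / email
print(df["UserType"].value_counts(normalize=True))  # share of each user type
print(df["DurationMinutes"].describe())             # time-per-transaction stats
```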

So, what have we learned from DT?  At this point we have five semesters’ worth of data (not including intersessions and summer sessions).  In some areas we have learned nothing new, but have confirmed what we already knew from experience.  However, do not discount how important this is.  The experience and knowledge of educators carries much less weight as data has become the icon of the modern education religion.  Although we knew these things, now that we can demonstrate them with data, they have reality.

As an example, we can see that the number of questions peaks in the first week of the semester and then declines sharply by week two or three.  After that there is a much slower, steady decline through the end of the semester.  However, the average time per question is usually at its lowest in the first week and steadily increases, tending to peak in week nine.  We were aware that we had more questions in the first few weeks but spent more time per question in later weeks, and the data has confirmed it.  As to why we have more questions at the start of the semester: we have a large collection of reserve textbooks, and our students report delays of three to four weeks before getting financial aid money for textbooks.  As such, most of the questions we answer in the first two weeks are related to textbook reserves.  By week three or four, students have their own textbooks and we are moving on to other topics.  This was more experiential knowledge that there is now data to support.  By tracking the number of questions by category across weeks, we can see that questions related to reserves are the most asked in week one, drop by more than 50% over the next two weeks, and virtually disappear by week four.
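
For those curious how such a week-by-week breakdown might be computed, here is a hedged sketch along the lines of the pandas example above; the semester start date and all column names are invented for illustration:

```python
import pandas as pd

# Break transactions out by week of the semester.  SEMESTER_START
# and all column names are hypothetical.
df = pd.read_csv("dt_export.csv", parse_dates=["Timestamp"])
SEMESTER_START = pd.Timestamp("2014-01-13")

df["Week"] = (df["Timestamp"] - SEMESTER_START).dt.days // 7 + 1
weekly = df.groupby("Week").agg(
    questions=("Timestamp", "size"),          # transactions per week
    avg_minutes=("DurationMinutes", "mean"),  # average time per question
)
print(weekly)

# If a boolean "Reserves" column marks transactions with that activity
# checked, its weekly mean is the share of reserve-related questions.
print(df.groupby("Week")["Reserves"].mean())
```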

Not everything we have seen has been confirmation of what we already knew.  We use reference books more than we thought.  In some weeks, reference book use represents ten percent of all questions.  While this is certainly not the bulk of our questions, it was more than I, and others, expected.  Additionally, while library literature is full of discussions of how the reference desk mostly answers directional questions, our data shows that directional questions are only about 7% of the questions asked.  I believe part of the reason for that impression in the literature is that tic sheets tend to oversimplify question categories, so questions get categorized as directional that may not actually be.

One concern to be aware of, should you choose to make the transition we did, is that tic sheets can actually encourage overcounting.  For example, if a student approached the reference desk seeking help with using your periodicals database, using the catalog, and finding out what time the library closes, with tic sheets this would probably be counted as three transactions.  With DT, it is counted as one reference transaction with multiple components.  As such, the raw numbers for our transactions plummeted: comparing spring 2009 to spring 2014, there was over a 50% drop in reference transactions.  This was quite scary as I was writing our fall 2014 program review report, unsure of how to explain the situation.  However, I decided to count the multiple components of transactions, in addition to the transaction counts, and compare that to the tic sheet numbers.  Done this way, the combined transaction-and-component counts came to roughly 90% of the tic sheet counts.  This showed that we were seeing about the same number of transactions but were counting them differently.  By the next program review we should have only Desk Tracker data, so this conversion will not need to happen again.
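
To make the difference between the two counting methods concrete, here is a toy example with invented data:

```python
# Tic sheets tally every component as its own transaction, while DT
# records one transaction that can contain several components.
transactions = [
    ["Periodical Databases", "Library Catalog", "Directional"],  # one visit
    ["Reserves"],
    ["EDS", "Library Catalog"],
]

dt_transactions = len(transactions)                    # DT count: 3
tic_style_tallies = sum(len(t) for t in transactions)  # tic sheets: 6
print(dt_transactions, tic_style_tallies)
```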

At this point, we have had Desk Tracker for about three years.  We have primarily used DT to analyze what we currently do and to provide data for institutional reports - such as program review and accreditation - and budget requests.  However, it is also useful for our decision-making processes.  The specificity and depth of our reference data collection is breathtaking compared to what we used to have, and it makes the idea of going back to tic sheets feel like being partially blinded.