Research

A Measure of Success? Utilizing Citation Analysis to Evaluate Consultation Strategies in Oral Communication Courses

Authors:

Cori L. Biddle, Penn State Altoona; Bridgewater College Library, US

Vickie Montigaud-Green, Bridgewater College Library, US

Abstract

This study sought to use citation analysis to gauge the effectiveness of library instruction, in the form of research consultation, for 12 sections of a 100 level Oral Communication course at a small private liberal arts institution. Librarians analyzed the source bibliographies of informative and persuasive speeches in order to determine if students were able to apply strategies from their informative speech consultation to later assignments. This article outlines the strategy used to analyze student sources, issues encountered with citation analysis as a research strategy, and impacts of librarian consultation and faculty influence on the sources that students chose to cite in assignments.

How to Cite: Biddle, C.L. and Montigaud-Green, V., 2020. A Measure of Success? Utilizing Citation Analysis to Evaluate Consultation Strategies in Oral Communication Courses. Virginia Libraries, 64(1), p.3. DOI: http://doi.org/10.21061/valib.v64i1.597
Published on 08 Jun 2020. Accepted on 12 Mar 2019. Submitted on 12 Jul 2018.

Introduction

When the librarian leaves a class after a one-shot session, or the student leaves a librarian’s office after a research consultation, often the librarian’s interaction with that student has ended. Very rarely is there any feedback as to the result of that session, and whether the student left with any information literacy skills or just a list of sources. Such questions are important to librarians and their instruction objectives, but the opportunities to meaningfully assess student research and information literacy skills can be few and far between.

Information literacy instruction can be piecemeal on a campus, especially when it is only encouraged, but not required, in many foundational courses. Starting in 2012, though, the librarians in this study were able to build a relationship with three of the faculty who taught a majority of the 100-level Oral Communications sections. The course is typically completed during a student’s first year, but can enroll students from a variety of class years. By 2014, a one-shot model had evolved into a hybrid one-shot/consultation model in which the faculty and one or two librarians met with individual students or small groups of students during their class period for 10–20-minute consultations. A week of the class schedule in each section was set aside for these meetings.

Though the librarians in this study had used surveys in the past to assess student satisfaction with instruction sessions, the surveys often gave an incomplete picture of student engagement and the transference of information literacy skills. The librarians felt that it would be worth the extra effort to evaluate their work with the Oral Communications courses, and the hybrid one-shot/research consultation model they developed, through an analysis of student bibliographies. Though each faculty member personalized the consultation structure, the assignment criteria remained consistent across each professor’s sections. Since the consultations happened at the same time of the semester for each section and could take up a majority of the librarians’ week (28 class sessions in Fall of 2016), it was important to understand whether this was time well spent. The faculty and librarians wanted to see whether they could find quantitative data, beyond student satisfaction surveys, to indicate how helpful the model was for students.

In Fall of 2016, with IRB (Institutional Review Board) approval, the librarians undertook a citation analysis project with 12 sections of the Oral Communications course. The project aimed to answer three questions: 1) How successful were the instruction interactions with students? 2) Did students retain information literacy skills throughout the rest of the course? 3) Was there one method of instruction that worked better for Oral Communication courses overall? In addition, a student satisfaction survey was administered to gauge student reaction to the library instruction.

Literature Review

Citation Analysis

There are few ways for academic librarians to measure the success of their interactions with students. They are typically able to record student satisfaction with library one-shot instruction sessions, research consultations, or reference desk visits with surveys or anecdotes; however, the fact that students register interactions as valuable does not always indicate that they have gained knowledge that can translate to information literacy skills. Surveys provide an incomplete picture at best; they measure student feelings at a particular time, typically before the research is completed and the final project is turned in. In an effort to assess the transference of information literacy and research skills, librarians are starting to capitalize on opportunities to access students’ final projects, specifically their bibliographies. They hope, through citation analysis, to see a more complete picture of student skills in practice and determine, by the sources used, whether their interactions with these students have successfully achieved their objectives for student learning.

Kohn & Gordon and Ursin, Blakesley Lindsay & Johnson provide detailed literature reviews that describe the history of the citation analysis research strategy, both as a resource for collection development and as an assessment of information literacy instruction.1 A sample of undergraduate citation analysis projects published within the past ten to fifteen years shows that analysis commonly occurs either at the first-year level, with freshman seminar or English composition courses,2 or with a senior capstone or thesis.3 Some variance in the level of students can be seen, such as in Rosenblatt’s analysis of research from an upper-division Sociology class, Knight-Davis and Sung’s analysis of work from an undergraduate student writing portfolio, and Mill’s evaluation of bibliographies from a variety of disciplines and divisions.4 At other times, citation analysis can be used to answer a very specific question; for example, Ursin, Blakesley Lindsay, & Johnson studied whether students utilized resource guides created for different sections of a course.5

Once librarians have access to student bibliographies, they must then decide the best way to extract meaning from the list of citations and interpret the information they provide. Often they do so by analyzing the number of sources and the source type: recording the number of journals, books, magazines, websites, or other source types in the bibliographies and/or recording their publication dates.6 Or, they rate the “value” of the sources by creating an evaluation rubric.7 Relying on the citation lists to identify a source can be problematic, and research is frequently hampered by students who have not mastered the required citation style, forcing librarians to make assumptions about a citation or exclude it entirely.8 The citation style also may not identify whether the source (journal articles especially) was found via a library’s database or an online repository or publisher’s website.9 McClure, Cooke, & Carlin and Knight-Davis & Sung found evidence in their studies that student in-text and bibliographic citations often do not match up, with sources used in the paper not showing up in the bibliography and vice versa.10

In addition, the librarian must consider the assignment requirements when analyzing the citations. If the faculty member set limits on specific types of sources, or required a certain source for the paper, that can skew the results, and limit the reliability of the citation list to provide answers for the librarian related to instruction.11

Once the sources have been analyzed, meaning must be put to their number and type. Even without a rubric, research studies may choose to emphasize scholarly sources, either with first-year students, to show that they retained some of the strategies or resources introduced in the instruction,12 or with upper-level courses, as a way to see whether students are utilizing discipline-specific sources.13 In other studies the variety of sources is also seen as a positive indicator, with a number of authors commenting on it explicitly when discussing their research results.14

Lists of citations can be limiting when the librarian does not have access to, or is unable to include in the study, the full text of the student project. Appropriateness of a source, whether based on the source type or its value, cannot be completely determined until its context within the student’s work is considered as well. Being able to assess how students utilize a source within a project allows the librarian to see the full circle of information literacy. Carlozzi’s research article on student synthesis15 and Rosenblatt’s article, which considered the student’s paper topic when analyzing the citations,16 both attempted to analyze how students utilized their sources.

Research Consultations

Though citation analysis is common when evaluating in-class information literacy instruction, it is less so when assessing an individual research consultation program. In one example, Reinsfelder described a citation analysis study based on the drafts and final papers of students who completed consultations.17 The study utilized a rubric to evaluate cited sources, in addition to analyzing the synthesis of sources by reviewing the student’s entire paper. Reinsfelder’s results indicated that students who completed consultations used higher quality sources in their final papers than those who did not. In addition, the faculty felt that the consultations improved student work. Reinsfelder encountered limitations with citation analysis similar to those found in the other studies reviewed, including a lack of correct citations and a correlation between faculty source guidelines and the type and number of sources used.18

More often, research consultation assessments utilize surveys to analyze feedback from both the students19 and the librarians involved in the consultations.20 It is important that researchers understand student perceptions of the research consultation service along with the librarian’s satisfaction with the reference model.21 Overall, these studies found similar benefits and concerns related to research consultations. Student surveys highlighted positive experiences with consultations. Students valued the consultation’s face-to-face interaction22 and the ability to build relationships with the librarian.23 For librarians, consultations offer an opportunity to provide personalized instruction which fits the needs of the student24 and builds relationships,25 especially with first-year students.26 Scheduled formal meetings allow the librarian more time to prepare, especially if the research questions are more complex.27 In all, studies showed both librarians and students felt that the consultations were a valuable experience.28

Concerns with research consultations mainly came from the librarian side. They included the time spent preparing and meeting with the students29 and the challenges of trying to scale the service to reach a greater percentage of the student population.30 Another concern was ensuring that patrons receive appropriate information literacy instruction during the meeting and that they were not just given a list of resources.31 Interestingly, student surveys also indicated that students could feel overwhelmed with the information provided in the consultation, and often felt less confident with their ability to use the library resources for additional research.32 Faix, MacDonald, & Taxakis also found that first year students who completed the consultation earned lower grades, which could be related to feeling overwhelmed by the number of sources and not being able to effectively evaluate and utilize them.33

Libraries are still experimenting with the optimal way to provide research consultations. Some libraries offered consultations in addition to an instruction session,34 instead of an instruction session,35 or as an off-shoot of the reference desk service.36 Vilelle compared the use of written feedback on a bibliography assignment (exchanged via email) to an in-person research consultation.37 None of the literature reviewed treated the consultation in the same manner as the current study, where the consultations were part of a hybrid instruction session and fit within the class time, sometimes in the assigned classroom.

Methods

Speech Assignment Source Requirements and Consultation Instruction Models

For this study, the librarians worked with three faculty members who had each developed a unique approach to library instruction for their informative speech assignments. The librarians’ meetings with students were scheduled in the first half of the semester, during the last two weeks of September. The speech presentations began a couple of weeks later and lasted through October. The exact timing varied among class sections.

The models below outline the source requirements for each professor and their preferred approach to the consultations. Consistent among all three was a prohibition on using encyclopedias, dictionaries, and other “ready-reference” materials as sources. Also consistent was a requirement that citations be presented in APA format.

Model A

Professor A’s informative speech assignment required five sources, only two of which could be websites. To assist students, Professor A also provided a list of approved journals and magazines for them to peruse for topics. The students were introduced to the speech assignment the week before the consultation and were given the task of finding an article of interest from the approved source list to bring with them to the consultations.

On consultation days, the entire class was required to be present in the classroom during their normally scheduled meeting time. Both librarians were present as well. Students met individually with either one of the librarians or the professor to discuss their topic. Based on their initial article, students received instruction on finding related resources using the library catalog and/or databases. Each consultation lasted approximately six minutes. If there was time before the end of the class, students were able to meet again with the professor or librarians if they had any follow-up questions. Professor A’s informative speech presentations began about two weeks after the consultations in mid-October.

Professor A’s persuasive speech assignment requirements were almost identical to those for the informative speech assignment. Students needed at least five sources, with a maximum of two websites. As with the informative speech assignment, a magazine or newspaper article served as inspiration for the speech topic. The persuasive speech presentations were scheduled for the end of November through the end of the semester, a little over a month after the informative speech presentations.

Model B

Professor B’s informative speech assignment required four of the five sources to have a physical version available (i.e. a print book or magazine/journal/newspaper) and explicitly encouraged students to use a variety of source types. The assignment also provided themes to help students narrow their topics. The assignment was introduced to students the previous week during class, and students signed up for a preferred consultation date/time. Students were also required to submit topic sentences via the learning management system before their meeting. Professor B reviewed each topic, providing suggested search terms and databases before meeting with the librarian. Consultations were held during the regular class meeting time that week.

On the consultation days, Professor B, one librarian, and groups of three or four students met in a library conference room. Professor B led a group discussion first, demonstrating searches within the library catalog using one of the students’ topics, and asked the librarian to demonstrate finding a book on the shelves. Afterwards, the students continued researching their topics on laptops as the professor and librarian met one-on-one with them to provide further guidance. Each group meeting lasted about 20 minutes; afterwards, a different group of students convened, and the process repeated. The informative speech presentations began about three weeks after the consultations in mid-October.

Professor B’s persuasive speech assignment was based on a non-profit organization found through the website Charity Navigator. For this speech, the number of required sources increased to seven, only two of which could be websites. As with Professor A, the persuasive speech presentations also occurred in November, about a month after the informative speech presentations.

Model C

Professor C’s informative assignment required students to pick a non-profit organization and educate the audience on a related topic. The required sources were more prescriptive than in the other models: students needed at least five sources, which included the non-profit’s website, one newspaper or magazine article, and three journal articles or books. As with Professor B, students received the speech assignment the week before the consultation and signed up for preferred consultation dates/times, which would take place during their normal class meetings in the coming week. Unlike the others, this professor did not require students to submit a topic before their meeting or bring an article to the consultation.

As with Professor B, on the consultation days Professor C, a librarian, and a group of three or four students met in the library. A majority of the consultation time was spent discussing and refining topics. As time allowed, the librarian used sample topics from the group to demonstrate searches within the library catalog and databases. Very little one-on-one researching was done with the students. Each group meeting lasted about 20 minutes. These informative speech presentations began at the beginning of October, two weeks after the consultation meetings.

Professor C’s persuasive speech assignment requirements were also very specific. The five sources included a personal story (gathered through an interview or found in an article), two magazine or newspaper articles, and two academic journals or government sources. The persuasive speech presentations occurred in the middle of November, a little over a month after the informative speeches.

Structure of the Citation Analysis

The librarians received IRB (Institutional Review Board) approval to extract bibliographies from the informative and persuasive speech outlines provided by the faculty members. Any student in the twelve sections was able to withdraw from the study if they wished. Participation in the study was high. Of Professor A’s 70 students, the librarians received 68 informative bibliographies (97%) and 65 persuasive bibliographies (92%). From Professor B’s 84 students, they received bibliographies from 75 (89%) of the informative and 74 (88%) of the persuasive speeches. Finally, Professor C’s sections had a total of 76 students and submitted 60 (78%) informative bibliographies and 65 (85%) persuasive bibliographies. The librarians are not sure whether the missing bibliographies are due to students opting out of the study or not completing the assignment.

The individual bibliographies were aggregated to create a list of all citations from assignments in each professor’s combined sections, separated by the type of speech assignment. Then, the librarians analyzed each citation and attempted to identify and label the source using the following labels (a brief sketch of this tallying step follows the list):

  • Book: included books, book chapters, and ebooks.
  • Journal: included peer-reviewed journals and materials written for an academic audience.
  • Magazine: included magazines, trade magazines, and newsletters created for a more general audience.
  • News: included newspapers and broadcast/online news sources.
  • Websites: divided into four categories: Commercial, Educational, Governmental, or Organizational.
  • Other: included YouTube videos, interviews, DVDs, and personal websites and blogs.
  • Unidentifiable: designated sources that could not be identified based on the citation.
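
Because this labeling step is essentially a tallying exercise, the short sketch below shows one way such a tally could be kept. It is illustrative only: the label set mirrors the categories above, but the citation records, field names, and function are hypothetical, since the coding in this study was done by hand.

    from collections import Counter

    LABELS = {"Book", "Journal", "Magazine", "News", "Website", "Other"}
    WEBSITE_SUBTYPES = {"Commercial", "Educational", "Governmental", "Organizational"}

    def tally_citations(citations):
        """Count hand-coded citations by source-type label.

        `citations` is assumed to be a list of dicts with a 'label' key and,
        for websites, a 'subtype' key; these field names are illustrative only.
        """
        counts = Counter()
        for citation in citations:
            label = citation.get("label")
            if label not in LABELS:
                label = "Unidentifiable"  # sources that could not be identified
            counts[label] += 1
            if label == "Website" and citation.get("subtype") in WEBSITE_SUBTYPES:
                counts["Website: " + citation["subtype"]] += 1
        return counts

    # Example: one journal article and one organizational website.
    print(tally_citations([{"label": "Journal"},
                           {"label": "Website", "subtype": "Organizational"}]))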

During this process the librarians removed general information sources such as encyclopedias or dictionaries, since each faculty member had specifically identified these sources as inappropriate for the assignments. Also removed from the study was Charity Navigator, since Professor B required the source for the persuasive speech and it was not an original source for the students.

After the source types were categorized, the librarians attempted to determine whether the source was obtained using the library’s resources or was publicly available online. Many of the citations had clues as to where the students found the resource: either a URL or the name of an originating website. If the citation listed a database, or a database web address, it was easy to identify as coming from a library resource. If the citation was an APA citation for a magazine or journal without a source URL, it was marked as coming from the library’s databases (with the assumption that the students would use the electronic copy of an article instead of finding it in a paper journal). Print books were assumed to be retrieved from the library collection or through interlibrary loan. Ebooks were distinguished by their web addresses and marked as either database or online, depending on the URL. Though attempts were made to find all cited sources (by searching online and in our databases/library), some sources remained unidentifiable and were not categorized.
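
The decision rules described above amount to a small heuristic, sketched below as a rough illustration. The function, the field names, and the example database hosts are hypothetical and simplified; the actual classification in this study was done by hand and involved judgment calls that a few rules cannot capture.

    def classify_origin(citation):
        """Guess whether a cited source came from library resources or the open web.

        `citation` is assumed to be a dict with a 'source_type' key and optional
        'url' and 'database' keys; the field names and database hosts below are
        illustrative only.
        """
        url = (citation.get("url") or "").lower()
        source_type = citation["source_type"]
        # A named database, or a database web address, points to a library resource.
        if citation.get("database") or any(host in url for host in ("ebscohost", "proquest")):
            return "library"
        # APA journal/magazine citations without a URL were assumed to come from the
        # databases, and print books from the library collection or interlibrary loan.
        if source_type in ("journal", "magazine", "book") and not url:
            return "library"
        # Ebooks and other sources with a non-database URL were treated as found online.
        if url:
            return "online"
        return "unidentifiable"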

Results

The above process resulted in six pools of data that could be compared (three professors with two speech types each). Table 1 shows the compiled data for each speech type by source type. Data was further delineated by whether the sources were accessed online through a search engine or through the library catalog and/or databases. Since the total number of citations varied between speech type and faculty member, percentages were calculated to help provide context: the number of citations in a category was divided by the total number of citations for that speech/professor.
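
As a worked example of this calculation, Professor A’s informative speech bibliographies included 124 journal articles retrieved from the databases out of 389 total citations; 124 divided by 389 is roughly 32%, the figure reported in Table 1.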

Table 1

Source Total Count and Percentage by Speech Type and Professor.

Professor A Professor B Professor C

Speech Type Informative Persuasive Informative Persuasive Informative Persuasive

Total % Total % Total % Total % Total % Total %

Source Types
Book/Chapter
    Database/Library 15 4% 15 4% 79 19% 2 0% 10 3% 2 1%
    Online 2 1% 0 0% 0 0% 5 1% 1 0% 2 1%
    Total 17 4% 15 4% 79 19% 7 1% 11 3% 4 1%
Journal Article
    Database/Library 124 32% 145 39% 95 23% 124 25% 80 24% 75 19%
    Online 9 2% 16 4% 6 1% 16 3% 17 5% 10 3%
    Total 133 34% 161 43% 101 24% 140 28% 97 29% 85 21%
Magazine Article
    Database/Library 60 15% 36 10% 47 11% 26 5% 42 12% 34 9%
    Online 23 6% 23 6% 28 7% 14 3% 16 5% 19 5%
    Total 83 23% 59 16% 75 18% 40 8% 58 17% 53 13%
News Source
    Database/Library 13 3% 2 1% 2 0% 12 2% 12 4% 9 2%
    Online 30 8% 33 9% 46 11% 42 9% 30 9% 48 12%
    Total 43 11% 35 9% 48 12% 54 11% 42 12% 57 14%
Websites
    Commercial 32 8% 18 5% 35 8% 14 3% 10 3% 32 8%
    Educational 12 3% 6 2% 4 1% 7 1% 3 1% 22 6%
    Government 18 5% 18 5% 14 3% 43 9% 8 2% 68 17%
    Organizational 27 7% 47 13% 46 11% 172 35% 98 29% 60 15%
    Total 89 23% 89 24% 99 24% 236 48% 119 35% 182 46%
Other 19 5% 5 1% 10 2% 14 3% 7 2% 11 3%
Unidentifiable 5 1% 7 2% 5 1% 3 1% 3 1% 4 1%
Total Citations 389 371 417 494 337 396
Average per Speech 5.64 5.8 5.42 6.59 5.27 6

Since each professor had specific source requirements for the speeches, it may also be informative to compare the source totals to the minimums outlined by the assignments. Table 1 starts the analysis, providing the average number of combined sources per speech. Students stuck close to the minimums: for example, all of the informative speeches required a minimum of five sources, and the bibliographies averaged close to that number. The persuasive speeches of Professors A and C are interesting in that although students were only required to use five sources, the bibliographies averaged almost six. Sources cited in Professor B’s persuasive speeches also increased, but the minimum source requirement for those speeches was seven, so the increase was to be expected.

Table 2 further explores the relationship between the source numbers and the requirements by showing the average number of each source type per bibliography. For example, Professor A’s informative speeches averaged only about one-third of a book per speech, while Professor B’s informative speeches averaged just over one book per speech. The total number of citations, the average number of total citations per speech, and the source requirements were added to the table for easy reference. These numbers will be discussed later in the paper.

Table 2

Citations by type compared to source requirements.

Professor A Professor B Professor C

Speech Type Informative Persuasive Informative Persuasive Informative Persuasive

No. of speeches 68 65 75 74 60 65

Total Avg per speech Total Avg per speech Total Avg per speech Total Avg per speech Total Avg per speech Total Avg per speech

Source Types
    Book/Chapter 17 0.30 15 0.24 79 1.05 7 0.09 11 0.18 4 0.06
    Journal Article 133 2.10 161 2.50 101 1.34 140 1.89 97 1.61 85 1.26
    Magazine Article 83 1.33 59 0.95 75 1 40 0.54 58 0.96 53 0.84
    News Source 43 0.69 35 0.56 48 0.64 54 0.73 42 0.7 57 0.88
    Websites 89 1.43 89 1.43 99 1.32 236 3.19 119 1.98 182 2.8
Total number of citations* 389 5.64 371 5.80 417 5.42 494 6.59 337 5.27 396 6
Source Requirements Min of 5 sources, most from databases, 2 websites max Same as informative speech Min of 5 high quality sources, 4 with physical versions available Min of 7 high quality sources, 5 with physical versions available Min of 5 sources: 3 journal/books, 1 newspaper/magazine, one nonprofit website Min of 5 sources: 2 academic articles/.gov websites, 2 magazines/newspapers, 1 personal story/interview

* Total number of sources includes the categories of “other” and “unidentifiable,” which are not included in this table. See Table 1.

The librarians framed their analysis within the context of their original research questions.

Research Question 1: How successful were instruction interactions with students?

The success of the instruction interactions was judged largely on the source types cited and the origin of these sources. The percentage of journal articles was consistently high throughout the six data sets. It was highest in Professor A’s speeches. Most of the bibliographies in the study averaged around two journal articles per speech, except for Professor B’s informative speech and Professor C’s persuasive speech. In addition, for Professor C’s persuasive speeches, journal articles were only 21% of the total sources used for that assignment. Websites had consistently high usage as well, with 48% of Professor B’s persuasive speech sources and 46% of Professor C’s persuasive speech sources coming from websites. Websites averaged over one per bibliography throughout all the data sets, even though only Professor B’s persuasive speech requirements allowed for more than one website to be used. The use of magazine articles was lower than expected, with the highest amount being 23% of Professor A’s informative speech sources. News sources and book sources also had very low usage throughout.

Whether the sources came from library resources or from other avenues was analyzed as well. A large majority of journal articles and books were found using library resources (databases or the catalog). The split was more even for magazine articles, and a majority of the news sources actually came from online searches. When the informative sources are aggregated, 54% of Professor A’s sources were from library resources, 53% of Professor B’s sources were from library resources, and 43% of Professor C’s sources were from library resources.
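
Assuming the aggregate sums the database/library counts for books, journal articles, magazine articles, and news sources, Professor A’s informative figure works out to 15 + 124 + 60 + 13 = 212 of 389 citations, or about 54%, consistent with the reported percentage.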

Research Question 2: Did students retain information literacy skills throughout the rest of the course?

Turning to the second research question’s concern with skill retention, the percentages of source types in the informative and persuasive speeches were compared. The assumption was that if students used library sources for their second speech assignment, it might indicate that they retained research skills demonstrated during the previous consultation meeting. For Professor A, usage of source types was similar between assignments, with a slight increase in the percentage of journal articles from library resources used in the second assignment (32% to 39%). A similar consistency was seen in the other classes, though Professor B’s students’ use of books/chapters sharply decreased between the two speech types (19% to 1%), and there was a large increase in the percentage of websites used (24% to 48%). One point of concern regarding retention of information literacy skills, though, is the overall increase in websites used. For Professor A’s bibliographies the average remained consistent at 1.43 per speech, but for the other two faculty members the average per speech increased, from 1.32 to 3.19 for Professor B and from 1.98 to 2.8 for Professor C. Even with the change in requirements, the increase seemed meaningful.

Outlying numbers, like Professor B’s book and website usage, led to a more detailed analysis of the sources in comparison to the assignment requirements. Professor B’s persuasive speech assignment focused on non-profit organizations, which may account for the large jump in organizational websites used, going from 11% to 35%. This may also account for the lack of book sources, which may not have been appropriate for the persuasive assignment. A similar effect can be seen in Professor C’s informative speech sources: organizational websites make up 29% of the sources, reflecting the requirement that students focus their informative speech on a non-profit organization. Another notable difference in Professor C’s persuasive speech sources is a higher use of government sources, with 17% of sources being government websites. This could be attributed to the wording in the assignment; students were specifically required to use two journal articles or government sources. For another example, both of Professor A’s assignments utilized a magazine, journal, or newspaper article as an inspirational source, which may explain why 68% of both speeches’ citations came from those three source types, a higher percentage than in the other faculty members’ classes.

Research Question 3: Is there one method of instruction that worked better overall?

The data among all six sets did not lead to any obvious conclusions regarding the instruction methods, the third research question for the study. The one exception seems to be the number of books/chapters used in Professor B’s informative speeches. This faculty member was the only one who had students specifically look up a book in the catalog, and had the librarian demonstrate finding a book on the shelves. These bibliographies included 79 books, comprising 19% of the sources used. None of the other data sets had more than 5% of their sources from books. There appears to be a correlation between having the students practice finding books in the consultation and a higher usage of books in the speeches. The sharp decrease of books used in the persuasive speeches by the same students may indicate a lack of comfort finding book materials without assistance. Or, it could be due to the difference between the informative and persuasive assignments.

Survey Results

After the consultations were completed, professors were given paper surveys for students to provide feedback on the consultations. Surveys were anonymous, and delivered to the librarians by the faculty member. A total of 164 surveys were received from the 12 sections. Tables 3 and 4 outline the aggregated survey results.

Table 3

Student survey aggregated results.

Did you find the one-on-one consultation helpful? Yes: 161; No: 2

I was familiar with library resources before the consultation. Agree: 80; Disagree: 82

Table 4

Student survey aggregated results continued.

Survey Statement Strongly Agree Agree Neutral Disagree Strongly Disagree

I learned research techniques that will help me find useful resources for my topic 65 84 15 0 0
I feel comfortable contacting the library staff for additional help 71 70 20 2 2

The first two questions of the survey covered the students’ overall satisfaction with the consultation. An overwhelming majority of students, 98%, found the consultation helpful. The second question helped gauge student familiarity with the library and its resources. Since it was a 100-level course and a majority of the students were first-years, it was not surprising that only 49% were familiar with library resources. The consultations occurred towards the beginning of the semester, whereas the library instruction in other first-year courses can be spread throughout the semester.

The final questions from the survey were included to gather feedback on students’ comfort and confidence with the research process. Ninety percent of students either strongly agreed or agreed that they learned techniques from the consultations. Comfort soliciting additional help from the library staff was slightly lower, with 85% agreeing or strongly agreeing with the statement. It was interesting to note that the four students who disagreed or strongly disagreed with that statement still felt that the session was helpful overall.
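
These percentages follow directly from the raw counts in Tables 3 and 4: 161 of the 163 responses to the helpfulness question is roughly 98%, 80 of the 162 responses to the familiarity statement is roughly 49%, and 149 of the 164 surveys (65 strongly agreeing plus 84 agreeing) is roughly 90%.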

Of the 164 surveys received, 22 provided additional comments related to the consultations. A majority of the comments were positive, thanking the librarians for the assistance. As an example, one student wrote: “Very helpful to meet with the librarians, helped to find resources and ask questions otherwise left unanswered. Thank you.” Other comments expressed a desire for longer consultations or asked for improvements to the website to make identifying useful resources and databases easier.

Discussion

This citation analysis study was designed to consider three research questions: 1) How successful were the instruction interactions with students? 2) Did students retain information literacy skills throughout the rest of the course? 3) Was there one method of instruction that worked better for Oral Communication courses overall? Limitations encountered with the research methodology, however, ultimately meant that the librarians were unable to find definitive answers. These issues and limitations were consistent with those encountered by other librarians who have attempted citation analysis studies. Despite these limitations, the study allowed the librarians to draw some conclusions and identify some trends that may benefit from further investigation.

As with the other projects, the varying quality of citations created a hurdle for verifying sources. Students were asked to use APA citations, but often did not seem to understand how or why to use a citation style. They seemed especially to have difficulty identifying journal or magazine articles found online as articles rather than web pages. In addition, some citations did not include enough information (like a URL or a database name) to identify where the item was found. This forced an uncomfortable amount of assumption, where the librarians had to speculate whether a source was gathered through library databases or found through an internet search engine. Incomplete or incorrect citations also required additional time to identify sources, either through URLs provided in the citation or by searching on the web for a source title or keyword in order to categorize the source type.

Categorizing web resources was also problematic. This study’s categories of commercial, organizational, educational, and governmental websites were more detailed than in some studies cited,38 but consistent with others.39 The nature of web content is ever evolving, so it can be hard to categorize sites, which led to much discussion among the librarians in this study. If a site is a commercial website that aggregates content from other sites (such as lifenews.com), how should it be categorized? What about a commercial site that creates original content with the aim of driving up advertising revenue? For example, students cited puresight.com, an online child safety monitor, for topics on bullying. Categorizing the web resources seemed to require an understanding of the intention of the site and its goals. This became very close to evaluating the resources, which was not part of the scope of this study.

This study purposefully chose to avoid judging the reliability or appropriateness of the sources, mainly because the integration of these sources could not be identified based on the limited information provided in the outlines. The full text of the speeches, in either written or video format, was not available for the study. For the persuasive speeches specifically, it was hard to say whether a source was appropriate since it was not seen in context. Understanding the context of the sources and the students’ synthesis of the information, as Carlozzi or Rosenblatt did,40 would perhaps have led to more definitive answers. Such an analysis was beyond the scope of the material provided in this study and will need to be saved for another research project.

The analysis did highlight the importance of faculty assignments and grading rubrics and their effect on the students’ overall sources. In some instances, like Professor C’s mention of government documents or Professor B’s focus on books, the professors’ expectations can be seen in the sources used. But in other cases, as with the use of websites over scholarly articles, the students’ old habits seem to be harder to replace. It is important to see the bibliographies within the confines of the assignments when attempting a citation analysis. Some requirements, like the focus on non-profits for Professor C’s informative speech and Professor B’s persuasive speech, can cause an artificial inflation of a particular source type. A requirement to use a specific source, like Charity Navigator, should be clearly shared with the researcher, who may want to remove it from the data.

When the librarians tried to draw conclusions from the findings based on their initial research questions, they found imperfect, but promising, results.

Question 1: How successful were the instruction interactions with students?

To answer this question, the librarians reviewed sources from the informative speeches, which were the assignments tied to the research consultation sessions. In all, the students used a higher percentage of journal articles and books than was expected, which led the librarians to conclude that the students may have felt comfortable using the sources identified during the consultations, and/or may have even continued to look in the library’s catalog and databases for additional sources. The survey results from the students also indicated a high satisfaction with the consultations. Students appeared to identify the value of consultations, which is consistent with other research studies looking at consultations, and thanked the librarians for their help. Though the librarians did not have any comparable data for informative speeches from classes that did not participate in library instruction, the use of library resources and the affirmation of the value of research consultation lead to the conclusion that there was some initial success with the program.

Question 2: Did the students retain information literacy skills throughout the rest of the course?

When comparing the sources used in the informative and persuasive speeches, it is important to remember that the appropriate sources for each speech type could vary, and that the information provided did not allow for comment on how the student utilized each source. While the inability to comment on the appropriateness of the sources limits the study’s ability to judge students’ information literacy skills, the source types used still offer some information. The librarians ultimately did see consistent numbers of library resources used in both the informative and persuasive speeches, which suggests that students were at least able to retain some familiarity with the library resources.

We do not have any data indicating the number of students who voluntarily worked with a librarian for their persuasive speeches, another area for future research. Anecdotally, based on librarian recollection and reference desk analytics, the number was low. This could mean that students felt comfortable researching alone for their second speech, that they felt uncomfortable seeking help from a librarian outside of class, or that they did not manage their time well enough to allow for a visit to the library.

Question 3: Is there one method of instruction that works better for Oral Communications?

The citation analysis did not show any major difference between the instruction styles utilized by the three faculty members. The study did, however, illustrate the impact of the way that faculty frame an assignment on the sources that students ultimately cite. For example, the faculty member who focused more on books saw an increase in book sources in the citations, and the faculty member who specifically mentioned government sources as a potential source saw an increase in citations of that type. The dynamic of having the faculty member participate in the consultations, and prepare the students for the consultations, seemed to help focus the students on the information provided. While these findings will not necessarily change the way that librarians interact with students during the consultations, it is good information to share with the faculty as they develop assignments and balance the class time they spend preparing the students prior to instruction.

Conclusion

Overall, the findings of this project were positive, but inconclusive. Citation analysis can be a safe, inconspicuous window into a student’s research habits, but this research project highlighted some of the limitations of the methodology. Outside factors beyond the librarians’ control, such as assignment requirements or students’ inability to follow correct citation guidelines, seemed to affect the sources used and thus the citation analysis. A different research strategy, such as a source synthesis analysis, might have been a better fit to answer the research questions, but the citation analysis was a good first step in attempting a formal review of instruction efforts.

One major achievement of the citation analysis study was, as Kohn and Gordon discussed, that such a project provided an avenue for discussion with faculty.41 Librarians and faculty are not always on the same page when it comes to research assignments, and faculty may not analyze student citations to be sure that students are using the discipline-specific sources that faculty prefer.42 Citation analysis can provide evidence that will allow librarians to point to areas of improvement, and illustrate to faculty the impact their assignment requirements have on the students. The will of the faculty, and their assignment design, can eclipse the library’s instructional goal of balanced and thoughtful research.

The data did provide some insights. There was a consistent use of library materials throughout the informative and persuasive speeches, which may indicate that students were comfortable using them. In addition, an overwhelming majority of the students found the sessions helpful, so the consultations were a positive experience for the students. The librarians can conclude, and share with the faculty, that the time spent inside and outside the classroom is valuable and has some positive impact on students’ research skills.

Notes

1Karen C. Kohn and Larissa Gordon, “Citation Analysis as a Tool for Collection Development and Instruction,” Collection Management 39, no. 4 (2014): 275–296, https://doi.org/10.1080/01462679.2014.935904; Lara Ursin, Elizabeth Blakesley Lindsay, and Corey M. Johnson, “Assessing Library Instruction in the Freshman Seminar: a Citation Analysis Study,” Reference Services Review 32, no. 3 (2004): 284–292, https://doi.org/10.1108/00907320410553696. 

2Alan Carbery and Sean Leahy, “Evidence-Based Instruction: Assessing Student Work Using Rubrics and Citation Analysis to Inform Instructional Design,” Journal of Information Literacy 9, no. 1 (2015): 74–90. https://doi.org/10.11645/9.1.1980; Michael J. Carlozzi, “They Found it – Now Do They Bother? An Analysis of First-Year Synthesis,” College & Research Libraries 79, no. 1 (2018): 659–670. https://doi.org/10.5860/crl.79.5.659; Rachel Cooke and Danielle Rosenthal, “Students Use More Books After Library Instruction: An Analysis of Undergraduate Paper Citations,” College & Research Libraries 72, no. 4 (July 2011): 332–343. https://doi.org/10.5860/crl-90; Randall McClure, Rachel Cooke, and Anna Carlin, “The Search for the Skunk Ape: Studying the Impact of an Online Information Literacy Tutorial on Student Writing,” Journal of Information Literacy 5, no. 2 (2011): 26–45. https://doi.org/10.11645/5.2.1638; Ursin, Blakesley Lindsay, and Johnson, “Assessing Library Instruction,” 284–292.

3Elizabeth Gadd, Andrew Baldwin, and Michael Norris, “The Citation Behaviour of Civil Engineering Students,” Journal of Information Literacy 4, no. 2 (2010): 37–49, https://doi.org/10.11645/4.2.1483; Kohn and Gordon, “Citation Analysis,” 275–296. 

4Stephanie Rosenblatt, “They Can Find It, but They Don’t Know What to do with It: Describing the Use of Literature by Undergraduate Students,” Journal of Information Literacy 4, no. 2 (2010): 50–61. https://doi.org/10.11645/4.2.1486; Stacey Knight-Davis and Jan S. Sung, “Analysis of Citations in Undergraduate Papers,” College & Research Libraries 69, no. 5 (September 2008): 447–458. https://doi.org/10.5860/crl.69.5.447; David H. Mill, “Undergraduate Information Resource Choices,” College & Research Libraries 69, no. 4 (July 2008): 342–355. https://doi.org/10.5860/crl.69.4.342.

5Ursin, Blakesley Lindsay, and Johnson, “Assessing Library Instruction,” 284–292. 

6Cooke and Rosenthal, “Students Use More Books,” 332–343; Mill, “Undergraduate Information Resource Choices,” 342–355; Knight-Davis and Sung, “Analysis of Citations,” 447–458; McClure, Cooke and Carlin, “The Search for Skunk Ape,” 26–45; Gadd, Baldwin, and Norris, “The Citation Behaviour,” 37–49.

7Carbery and Leahy, “Evidence-Based Instruction,” 74–90; Ursin, Blakesley Lindsay, and Johnson, “Assessing Library Instruction,” 284–292.

8Gadd, Baldwin, and Norris, “The Citation Behaviour,” 47; Knight-Davis and Sung, “Analysis of Citations,” 449–450; Ursin, Blakesley Lindsay, and Johnson, “Assessing Library Instruction,” 290.

9Mill, “Undergraduate Information Resource Choices,” 349; Knight-Davis and Sung, “Analysis of Citations,” 454; Kohn and Gordon, “Citation Analysis,” 282.

10Knight-Davis and Sung, “Analysis of Citations,” 449–450; McClure, Cooke and Carlin, “The Search for Skunk Ape,” 38.

11Knight-Davis and Sung, “Analysis of Citations,” 457; McClure, Cooke and Carlin, “The Search for Skunk Ape,” 36; Cooke and Rosenthal, “Students Use More Books,” 339; Thomas Reinsfelder, “Citation Analysis as a Tool to Measure the Impact of Individual Research Consultations,” College & Research Libraries 73, no. 3 (May 2012): 275. https://doi.org/10.5860/crl-261.

12McClure, Cooke and Carlin, “The Search for Skunk Ape,” 26–45; Cooke and Rosenthal, “Students Use More Books,” 332–343.

13Kohn and Gordon, “Citation Analysis,” 275–296. 

14Gadd, Baldwin, and Norris, “The Citation Behaviour,” 37–49; Carbery and Leahy, “Evidence-Based Instruction,” 74–90; McClure, Cooke and Carlin, “The Search for Skunk Ape,” 26–45; Knight-Davis and Sung, “Analysis of Citations,” 447–458.

15Carlozzi, “They Found it,” 659–670. 

16Rosenblatt, “They Can Find It,” 50–61. 

17Reinsfelder, “Citation Analysis as a Tool,” 263–277. 

18Reinsfelder, “Citation Analysis as a Tool,” 271. 

19Crystal D. Gale and Betty S. Evans, “Face-to-Face: The Implementation and Analysis of a Research Consultation Service,” College & Undergraduate Libraries 14, no. 3 (2007): 85–101. https://doi.org/10.1300/J106v14n03_06; Luke Vilelle, “Personalizing Instruction and Reference: A Personal Attempt to Close the Public Service Gap,” College & Undergraduate Libraries 21, no. 1 (2014): 37–55. https://doi.org/10.1080/10691316.2014.877734; Trina J. Magi and Patricia Mardeusz, “Why Some Students Continue to Value Individual, Face-to-Face Research Consultations in a Technology-Rich World,” College & Research Libraries 74, no. 6 (November 2013): 605–618. https://doi.org/10.5860/crl12-363; Allison Faix, Amanda MacDonald, and Brooke Taxakis, “Research Consultation Effectiveness for Freshman and Senior Undergraduate Students,” Reference Services Review 42, no. 1 (2014): 4–15. https://doi.org/10.1108/RSR-05-2013-0024.

20Gale and Evans, “Face-to-Face,” 85–101. 

21Ibid. 

22Gale and Evans, “Face-to-Face,” 92; Vilelle, “Personalizing Instruction,” 47. 

23Magi and Mardeusz, “Why Some Students,” 612. 

24Gale and Evans, “Face-to-Face,” 89–90; Faix, MacDonald, and Taxakis, “Research Consultation Effectiveness,” 6; Vilelle, “Personalizing Instruction,” 47.

25Reinsfelder, “Citation Analysis as a Tool,” 275. 

26Vilelle, “Personalizing Instruction,” 38. 

27Gale and Evans, “Face-to-Face,” 90. 

28Gale and Evans, “Face-to-Face,” 95. 

29Gale and Evans, “Face-to-Face,” 96; Vilelle, “Personalizing Instruction,” 48. 

30Gale and Evans, “Face-to-Face,” 88. 

31Gale and Evans, “Face-to-Face,” 89; Faix, MacDonald, and Taxakis, “Research Consultation Effectiveness,” 11.

32Gale and Evans, “Face-to-Face,” 92. 

33Faix, MacDonald, and Taxakis, “Research Consultation Effectiveness,” 11.

34Vilelle, “Personalizing Instruction,” 37–55. 

35Vilelle, “Personalizing Instruction,” 37–55; Faix, MacDonald, and Taxakis, “Research Consultation Effectiveness,” 4–15.

36Gale and Evans, “Face-to-Face,” 85–101; Magi and Mardeusz, “Why Some Students,” 605–618. 

37Vilelle, “Personalizing Instruction,” 37–55. 

38Gadd, Baldwin, and Norris, “The Citation Behaviour,” 37–49; Knight-Davis and Sung, “Analysis of Citations,” 447–458; Kohn and Gordon, “Citation Analysis,” 275–296; Mill, “Undergraduate Information Resource Choices,” 342–355.

39Cooke and Rosenthal, “Students Use More Books,” 332–343; McClure, Cooke and Carlin, “The Search for Skunk Ape,” 26–45.

40Carlozzi, “They Found it,” 659–670; Rosenblatt, “They Can Find It,” 50–61.

41Kohn and Gordon, “Citation Analysis,” 293–294. 

42Ibid. 

Competing Interests

The authors have no competing interests to declare.
