Published in College & Research Libraries, Volume 77, Issue 6, November 1, 2016, pages 682-702.
The definitive version is available at https://doi.org/10.5860/crl.77.6.682.
Expertise in searching and evaluating scientific literature is a requisite skill of trained scientists and science students, yet information literacy instruction varies greatly among institutions and programs. To ensure that science students acquire information literacy skills, robust methods of assessment are needed. Here, we describe a novel tool for longitudinal, crossover assessment of literature-searching skills in science students and apply it to a cross-sectional assessment of literature-searching performance in 145 first-year and 43 senior biology majors. Subjects were given an open-ended prompt requiring them to find multiple sources of information addressing a particular scientific topic. A blinded scorer used a rubric to score the resources identified by the subjects and generate numerical scores for source quality, source relevance, and citation quality. Two versions of the assessment prompt were given to facilitate eventual longitudinal study of individual students in a crossover design. Seniors were significantly more likely to find relevant, peer-reviewed journal articles, provide appropriate citations, and provide correct answers to other questions about scientific literature. This assessment tool accommodates large numbers of students and can be modified easily for use in other disciplines or at other levels of education.
Discipline: Library and Information Science