Scholars develop method to improve students' evaluation of information credibility

LAWRENCE — We live in an era in which misinformation is readily available and seemingly credible sources can quickly spread unreliable information. With dubious information so prevalent, it is pressing for students, particularly those who will work in media and mass communications, to be able to thoroughly evaluate sources of information. Two University of Kansas scholars have developed an assessment, adaptable across disciplines, that shows the extent to which students improve their ability to evaluate source credibility.

Journalists have frequently been accused of spreading fake news, and fake news does circulate readily on social media. Yet students often arrive at college without rigorous training in how to verify the credibility of sources. Peter Bobkowski, associate professor of journalism & mass communications, and Karna Younger, assistant librarian and open pedagogy librarian in KU Libraries, adapted a source evaluation rubric to measure whether students improved both the number and the quality of the evaluative steps they take to verify a source's credibility. Their study was published in the journal College & Research Libraries.

“Misinformation and information accuracy have been on people’s minds more since 2016, but information science and library scholars have studied these concepts for a long time, so it was natural for us to partner with them,” Bobkowski said. “Most of us in our daily lives are satisfied with how superficially we evaluate information. We don’t always have time to evaluate everything we read. So when we train students to stop and think about the cues that contribute to how credible something is, we are asking them to develop a counterintuitive habit. They often start at a pretty basic level, but our assessment shows us that they improve over the course of the semester.”

Bobkowski and Younger adapted an existing evaluation assessment, originally designed in library science, for students in a journalism class. In the assessment, students read an article and explain whether it is a credible source of information. Later in the semester, after receiving instruction on the evaluation process and practicing it in different contexts, students read a similar article and again examine its credibility. Students' responses are scored for the number of credibility cues they use and the depth of their analysis.

Bobkowski and Younger’s article discusses results from the first semester they deployed the assessment. Early in the semester, students tended to rely on ritualized language, such as declaring an author biased or a source not credible, without providing evidence to support that judgment. At the end of the semester, students’ evaluation breadth, or the number of credibility cues they cited, stayed roughly the same, but their evaluation depth improved dramatically. In other words, after learning the importance of looking further into an article's author, publication, sources and language, students who used the technique explained their reasoning, were better able to identify credible information and supported their claims with evidence.

“Being able to evaluate the credibility of information is a marker of information literacy, an educational reform movement in library science,” Younger said. “Information literacy aims to equip students with a set of interconnected concepts to think critically and ethically about the information they consume and create. In other words, librarians don’t want to simply show students how to click in a library database and trust whatever they find. We want to cultivate students’ abilities to evaluate, use and create information well after they graduate. This project demonstrates how such concepts can be integrated into courses at KU.”

The authors’ adoption of a credibility evaluation assessment was part of a larger revision of an information-gathering journalism class. Students have long been evaluated on the credibility of their information through the sources they list in the bibliographies of research papers. Most journalism and mass communications assignments, however, do not include dedicated source sections; journalists instead cite their sources in the body of their work. Given this difference, the authors had to redesign the established rubric to work with assignments that lack bibliography sections.

It is easy today to end up in a “filter bubble,” Bobkowski said, in which the information people receive is curated for them by online algorithms based on their interests and the people with whom they connect online. This can lead people to assume that information aligning with their opinions is automatically credible and to dismiss information that challenges them. That tendency cuts across professions and demographics, making the ability to evaluate sources and share accurate information especially important. People in nearly every profession communicate information in one way or another, and an ability to discern credible sources is beneficial across academic disciplines.

Information literacy is a set of interrelated threshold concepts, as defined by the Association of College and Research Libraries. Like many threshold concepts, information literacy is a framework designed for, but not bound to, library science. To that end, Bobkowski and Younger cite engineering and business scholars who have successfully used information literacy threshold concepts, similar to those in the assessment, to help students think in new ways about a concept or discipline. In doing so, students gain an understanding of interrelated concepts that allows them to consider and evaluate more deeply the information they use in their work. The assessment could easily be adapted to social science, English, business or other classes.

“Information literacy is an incredibly broad framework adaptable across disciplines,” Younger said. “The integration of information literacy into an assignment, a course or the curriculum has been proven to improve student performance and retention. KU Libraries have a number of programs, such as our Research Sprints and mini-grant programs, to partner with instructors to adapt information literacy to their field. Of course, we are always happy to chat with folks about integrating information literacy into their teaching.”

Bobkowski and Younger have used the assessment over several semesters, and it has consistently shown that students improve the depth and breadth of their information evaluation. They also wrote “Be Credible: Information Literacy for Journalism, Public Relations, Advertising and Marketing Students,” an open textbook that includes the lessons applied in the assessment, along with others, to help students better evaluate information sources and create reliable information. The open educational resource was created with support from KU Libraries' Research Sprints initiative and its OER grant program, and it is openly accessible at no cost.

“These are skills that have to be practiced for students to be more efficient with them and for them to use in non-school or professional settings,” Bobkowski said. “I think the assessment has shown that our students recognize the benefit of these skills.”

Image credit: Pexels.com

Tue, 09/01/2020

Mike Krings
