
Feature

Teaching Information Evaluation with the Five Ws
An Elementary Method, an Instructional Scaffold, and the Effect on Student Recall and Application
Rachel Radom and Rachel W. Gammons

Rachel Radom is Instructional Services Librarian for Undergraduate Programs, University of Tennessee Libraries, Knoxville, Tennessee. Rachel W. Gammons is Learning Design Librarian, McNairy Library and Learning Forum, Millersville University, Millersville, Pennsylvania.

Reference & User Services Quarterly, vol. 53, no. 4, pp. 334–47
© 2014 American Library Association. All rights reserved.
Permission granted to reproduce for nonprofit, educational use.

Researchers developed an information evaluation activity used in one-shot library instruction for English composition classes. The activity guided students through evaluation using the “Five Ws” method of inquiry (who, what, when, etc.). A summative assessment determined student recall and application of the method. Findings, consistent over two semesters, include that 66.0 percent of students applied or recalled at least one of the Five Ws, and 20.8 percent of students applied or recalled more than one of its six criteria. Instructors were also surveyed, with 100 percent finding value in the method and 83.3 percent using or planning to use it in their own teaching.

Undergraduate instruction librarians face the common challenge of addressing a wide variety of information literacy competencies in sessions that follow short, one-shot, guest lecturer formats. Of these competencies, one of the most complicated and time-consuming to teach is the evaluation of information sources. It can also be one of the most difficult competencies for students to effectively learn.1 In this study, the researchers aimed to find or develop a framework that would efficiently assist students in the acquisition and application of information evaluation skills. The desired framework would be memorable, familiar to students, scalable (used in face-to-face sessions or asynchronous, online instruction), and valuable to course instructors.

The following study introduces an information evaluation method based on a well-known framework of inquiry—the “Five Ws,” or who, what, when, where, why, and how. Researchers modified the Five Ws to create a formative assessment that introduced evaluation skills to students and piloted it in fall 2011 during one-shot library instruction sessions for English composition classes. Full implementation followed in fall 2012. In both the pilot and formal study, a summative assessment was sent to students an average of three weeks after the library session to assess recall and application of the evaluation method. Composition instructors were also surveyed to assess their responses to the Five Ws evaluation method and determine whether they had added, or would consider adding, the method to their own instruction. The findings of these assessments may be relevant to


instruction librarians and composition instructors, as well as those interested in the connections between information literacy competencies and student learning outcomes in general education.

Literature Review

In 2000, the Association of College and Research Libraries (ACRL) published the “Information Literacy Competency Standards for Higher Education.”2 Intended to facilitate the development of lifelong learners, the standards outline the skills needed for students to identify an information need and then locate, evaluate, and utilize resources to fulfill that need.3 For more than a decade, the ACRL guidelines have directed the library profession’s approach to instruction, shaping the ways that librarians conceptualize, design, provide, and assess library instruction. Corresponding to the widespread adoption of these standards, there has been an increase in research investigating students’ skills (or lack thereof) in critical thinking, and more specifically, information evaluation. The majority of these research studies, however, are based on the evaluation of web and print sources as separate materials. As the numbers of online and open access publications increase and the boundaries between formats of information recede, the depiction of print and electronic resources as existing in distinct and separate categories does not accurately reflect the modern search experience.4 It is also misleading to students who are used to accessing a variety of media and information sources in multiple formats.

Student confusion about the format and quality of information sources is substantiated by recent research. In a 2009 report for the United Kingdom’s Joint Information Systems Committee (JISC), researchers identified a dissonance between college and university students’ expectations of published research and the realities of those bodies of work.5 When asked what types of information a student would recognize as “research,” an overwhelming majority (97 percent) identified traditional formats such as books and articles. When confronted with less well-known formats, such as posters or dissertations, the number of students willing to identify the documents as “research” greatly decreased.6 Additional qualitative results describing student confusion were obtained in small focus group sessions. While the majority of students “distrusted” the Internet, they widely accepted “all published materials” as appropriate for academic use.7 This inaccurate distinction between the credibility of print and electronic resources was also reported in research by Biddix et al., who found that students view the information available from an academic library as “vetted” or “pre-accepted.”8 Students have oversimplified relationships between publication format, library resources, and credibility, a situation that has been further complicated by the increase in federated search tools. Although federated searching may simplify the research experience, it also increases the quantity of unfamiliar materials to which students are exposed, while simultaneously making distinctions between information sources less discrete.

As the information landscape undergoes radical shifts, librarians’ approaches to teaching information literacy and information evaluation have remained relatively static. Approximately ten years ago, two information evaluation methods associated with different mnemonic devices were shared in the library literature and were subsequently incorporated into many library instruction sessions. In 2004, Blakeslee described the motivation behind designing California State University Chico’s CRAAP Test as a desire to create a memorable acronym because of its “associative powers.”9 Intended to guide users through evaluating the Currency, Relevance, Authority, Accuracy, and Purpose of a document, the method’s accompanying checklist and questions can be applied to both print and online resources; however, its emphasis on the evaluation of electronic materials has resulted in a loose categorization of the method as a website evaluation tool.10 In contrast, the CRITIC method was incorporated into library instruction as a tool to be utilized in the evaluation of print resources.11 In a presentation on the method at a 2004 conference, Matthies and Helmke describe CRITIC as a “practical system of applied critical thought”; repurposing the steps of the scientific method, it encourages users to approach evaluation as an iterative process and to interrogate the Claim, Role of the Claimant, Testing, Independent Verification, and Conclusion of a given document.12

Both the CRAAP Test and CRITIC method attempt to simplify the evaluation process by breaking down complex ideas into a set of accessible criteria, but little research has been conducted on the effectiveness of the methods themselves. However, one recently published study on the advantages of formative assessment in information literacy instruction includes a series of anecdotal observations that may provide insight into the effectiveness of the CRAAP Test.13 Following an instruction workshop in which the test was taught, many students self-reported a persisting difficulty with “determining the quality of different sources.”14 The authors found that some students continued to have trouble “distinguishing between popular magazines and scholarly journals” and “finding authoritative websites” even after follow-up consultations.15 Their findings suggest that the CRAAP Test may not effectively bridge the gap between determining easily identifiable qualities, such as date of publication, and those that require a greater level of independent judgment and critical thinking, such as authority, especially if used in only a single instruction session.

Meola contends that it is problematic to use models such as CRAAP and CRITIC to teach information evaluation because of their structural dependence on linear processes and checklists.16 He describes such checklist-based models as “question-begging” and criticizes them for offering “slim guidance” as to how the questions should be answered.17 Meola also argues that a linear organization encourages students to view evaluation as a “mechanical and algorithmic” process, thereby separating “higher level judgment and intuition” from the evaluation process.18 Bowles-Terry et al. expand on

Meola’s ideas, writing that the checklist approach “reduces critical thinking about the value of information to easily memorized and superficial criteria.”19 The solution, the authors suggest, is to reconceptualize the evaluation of information as a meaningful process rather than a “look up skill.”20 Librarians can support this by broadening the evaluation methods they teach to include contextualizing a document within a student’s “wider social experience.”21

Bowles-Terry et al. also encourage information literacy instructors to enhance their teaching efforts by incorporating aspects of social constructivist theory, developed in large part by Lev Vygotsky.22 In his preeminent writings on child psychology, Vygotsky made highly influential contributions not only to sociological but also educational theory, including the concept of the “zone of proximal development,” or ZPD, which he describes as the distance between what a learner can accomplish independently and what he or she can accomplish under the “guidance of an adult or in collaboration with more capable peers.”23 According to Vygotsky, a learner’s transition to a more advanced skill set or level of thinking is facilitated in collaboration with a person or group of people at a higher developmental level than the learner.24

Related to the ZPD is the educational theory of instructional scaffolding, a process by which a tutor or instructor helps a learner successfully achieve a task that the learner would be unable to accomplish alone, thus spanning the ZPD. Scaffolding processes assist learners by building on behaviors and tasks they have already mastered to achieve those that require higher levels of thought. In a seminal work on scaffolding, Wood, Bruner, and Ross write that scaffolding begins when a tutor actively interacts with learners and controls the “elements of a task initially beyond the learner’s capacity.”25 According to Bruner, responsive tutors gradually remove their support (the scaffold) as learners develop skills and need less assistance.26 By working with instructors or more competent peers, learners who successfully negotiate skill development are then able to build on their accomplishments by achieving the component steps of a process individually and then progressing to skills of greater intellectual complexity.

Vygotsky theorized that learners may surpass their developmental level by working with others more capable, while Wood, Bruner, and Ross found that learners are capable of recognizing good solutions to a task or problem before they are capable of completing the steps needed to reach that solution by themselves.27 These theories are useful to consider in the design of information literacy instruction and formative learning assessments. Integrating group work into instruction sessions may help learners achieve more success together than if they were to work alone. Utilizing instructional scaffolds may also assist learners in the development of new skills. Furthermore, if the scaffold helps students accomplish goals that they recognize as purposeful and relevant to their near-future success, they may be more invested in developing the skills and learning the process being taught. Based on these criteria, a useful evaluation method in library instruction would be associated with something already familiar to students and valued by course instructors to the extent that they would incorporate the method into their own classes after the library session. An evaluation method that met these ideal qualities would then have the potential to be more fully integrated into a student’s greater learning process by surpassing the limitations of one-shot instruction sessions.

Methods

At the University of Tennessee Knoxville, the first-year composition program includes two sequential courses, English 101 and 102. Although the common syllabus for English 101 includes three standardized composition assignments, only one of these, the argumentative paper, requires students to cite outside sources. Despite the applicability of library instruction to the composition curriculum, not all composition sections attend a library instruction session. In fall 2011 and 2012, an average of 24 percent of all English 101 sections requested library instruction, while 70 percent of instructors for English 118 (an Honors course that combines English 101 and 102) requested library instruction for a similar assignment.

Although the argumentative assignment does not require scholarly sources, many composition instructors encourage their students to cite sources with differing points of view. As a result, librarians dedicate a significant portion of the corresponding library instruction session to the development of information evaluation skills. To facilitate this process, an instructional services librarian and a graduate teaching assistant (both hereafter referred to as “the researchers”) sought to employ an in-class evaluation activity that could be consistently used in each 101/118 library session, and would accomplish two aims. First, the activity should effectively introduce students to an information evaluation method. Second, the evaluation method itself should be conducive to student recall and application after the library session.

The researchers first identified an evaluation method and created the in-class evaluation activity, which was completed in small groups during the instruction session and served as a formative assessment. A post-session summative assessment measured student application and recall of the evaluation method. To determine composition instructors’ responses to the session and, in particular, if those instructors found the evaluation method valuable or would consider adding it to their own teaching repertoire, the researchers also created a follow-up survey for composition instructors. With approval from the Institutional Review Board, the researchers piloted the assessments in fall 2011 and implemented them with post-pilot improvements in fall 2012.

When selecting an information evaluation method, researchers searched for a tool that would serve as an instructional scaffold.28 Rather than introducing students to a new evaluation method, the researchers hypothesized that introducing students to a method based on a concept with which they were already familiar would have several benefits: It


might allow students to grasp the evaluation criteria more quickly, interpret the steps involved more effectively, and reduce the number of clarifying questions necessary before launching into the activity and applying the method. If such benefits were actualized, the instructional scaffold would also facilitate an efficient use of time for library instructors, who were operating under the time constraints of either a fifty- or seventy-five-minute session.

Between CRAAP and CRITIC, the two methods popular in library instruction, only CRITIC is associated with a concept first-year university students might have encountered in previous learning experiences, as its steps are based in the scientific method, a process taught in most elementary and secondary schools.29 However, while the method’s guiding questions may seem familiar, terms associated with the scientific method are not mirrored in the words of the acronym, thereby making it appear new to users. To facilitate the effectiveness of the scaffold, researchers also wanted to teach a “catchy” evaluation method, that is, one easily remembered and effectively recalled. Though this specific study did not compare student recall of different evaluation methods, anecdotal conversations between library colleagues revealed that the CRAAP and CRITIC criteria were difficult for library instructors to remember. While many of the researchers’ colleagues had utilized the methods more than once in previous information literacy sessions, few were able to recall the components of either acronym.

Therefore, in the interests of familiarity and memorability, the researchers looked outside of library literature. They selected what is colloquially known as the “Five Ws” method of inquiry as a foundation for the activity and subsequent study. The method is composed of six guiding questions: who, what, when, where, why, and how. Frequently taught in primary schools as an introduction to basic rhetoric, the Five Ws method is often associated with journalistic investigations and authorship. The likelihood that students would have been introduced to the Five Ws criteria at an early age satisfied the desire of the researchers to present a method with which students were already accustomed, while the guiding questions provided a framework of interrogation on which the researchers could build a more complex activity.

Using its six basic questions as the foundation for the in-class evaluation activity, researchers supplemented each main Five Ws question with more extensive questions to create an activity appropriate for university students. The “who” question, for example, asked students not only to identify the author, but also to investigate the author’s credentials, including where the author worked, if the author had been published more than once, and if the author had research or work experience that contributed to his or her authority. The resulting Five Ws activity served as a formative assessment that measured students’ existing abilities in comprehending and evaluating documents. Students had the opportunity to improve these skills by working through the Five Ws evaluation method in small groups, with a librarian available to direct or correct students’ progress.

During the instruction session, the Five Ws activity was presented to students as an online worksheet, managed and maintained in the UT Libraries’ SurveyMonkey account (appendix A). A link to the activity, as well as a PDF of the document that students evaluated, was available on all library computers used in instruction sessions. The evaluated document was a column by Nicholas Kristof about the 2011 Tōhoku earthquake, tsunami, and Fukushima nuclear radiation leaks in Japan, which appeared in PDF as a full page from The New York Times opinion section.30 The decisions to have all students evaluate the same document, and for them to analyze a column rather than an article, were deliberate, based on observations from and results of the pilot study. Analyzing an opinion piece challenged students without making the exercise aggravating and, consequently, presented the best opportunity for student learning.31

In the library session, students were directed to skim Kristof’s column, which was referred to by the researchers as neither a “column” nor an “article,” but simply the “document.” After skimming the document, students were asked to work in small groups of two to five to evaluate it using the Five Ws criteria via the online worksheet. They were also directed and encouraged to use Internet search engines to help them complete the evaluation, for example, to find more information about the author, his work, and his previous publications. After completing the activity, researchers asked each group to explain to the class how each of the Ws contributed to their group’s final decision of whether they would or would not cite the column in a college research paper.

During the fall 2011 pilot, researchers tested the Five Ws activity with an estimated 682 students.32 Results of the pilot study prompted researchers to make several minor adjustments to the Five Ws activity, including simplifying the phrasing of some questions, choosing to evaluate a single document rather than multiple types in one section, and adding links to definitions for several terms, such as methodology, with which students had struggled. After the pilot project, the improved Five Ws activity was incorporated into many 101 and 118 library instruction sessions. An estimated 391 students in small groups participated in the fall 2012 research study.33

The pilot study also included a post-session survey, designed in SurveyMonkey and distributed to students in the last quarter of the semester. This twelve-question summative assessment was intended to determine whether several student learning outcomes had been met; namely, whether students found and used library resources after the library session and whether students recalled and used the Five Ws method for evaluating an information source for authority, credibility, and bias. Except for minor clarifications to phrasing, the post-session assessment sent to students in the fall 2012 study was nearly identical to the one distributed during the pilot project.

The post-session summative assessment was distributed to students via their respective composition instructors. During the fall 2011 pilot, sixteen composition instructors taught the thirty composition sessions in which the Five Ws activity

was trialed. During the formal study in fall 2012, this number fell to eleven composition instructors for seventeen sections. In each iteration of the study, librarians sent course instructors an email containing an invitation to and directions for completing the 12-question follow-up survey, which they were asked to forward to their students. The emailed invitations were sent to instructors an average of three weeks after the library instruction session. Composition instructors were also sent at least one email reminder to forward to students before the last day of classes.

A separate, qualitative survey was distributed to the same sixteen composition instructors in fall 2011 and eleven composition instructors in fall 2012. This twenty-one-question survey was distributed two to five weeks after the library session and was intended to gather composition instructors’ feedback about the library instruction session. Among other questions, instructors were asked whether or not they found the Five Ws evaluation method valuable and if they had used it or planned to use it in their own classes. The follow-up survey sent to instructors in fall 2012 was nearly identical to the fall 2011 pilot survey, with very minor clarifications to wording in some questions.

In both semesters, students were offered an incentive for participation in the post-session summative assessment. During the pilot project, participants were entered into a drawing for a single $30 gift certificate to the university bookstore. In fall 2012, the incentive was increased and participants were entered into a drawing for one of four $50 gift certificates to the university bookstore. Composition instructors received no incentive in either semester.

Results

Responses are summarized below in an order that matches the question order as presented to participants in the assessments/surveys, with several responses included in table format. The results refer to responses gathered in the fall 2012 study, with comparisons to the pilot project results provided only at the end of each section.

Formative Assessment: Five Ws Activity

With an average of six small groups per section working together to complete the Five Ws activity, an expected number of 102 groups (six groups in each of the seventeen sections) would have submitted online worksheets in fall 2012; however, 180 groups started the Five Ws activity. Of these, 99 submitted worksheets and are included in this analysis. The high number of worksheets not submitted is likely due to the nature of group activities; researchers observed many students reviewing the activity on their own computers to read through the questions and help their group finish the worksheet, though only one group member submitted each group’s collective response. The number of submitted responses includes 44 incomplete responses, in which students submitted the activity by visiting the last page of the worksheet without providing answers to each individual question.

Figure 1. Student Responses to “What is the Document?” (N = 97)

The first criterion, the “what” of the Five Ws, consisted of questions about the document type and the overall tone the author used throughout the document. The vast majority of student groups incorrectly identified the document as a popular article. Less than 10 percent correctly identified the document as a column (figure 1). When asked about the author’s writing tone (n = 96), all but one group agreed that the tone was conversational rather than technical.

Students were next asked to investigate the author of the document (“who”). Student groups agreed that the author had qualifications that made him an authority in 98.9 percent of cases (n = 94). In an open-ended question asking respondents to identify any credentials that contributed to the author’s authority, the most commonly listed were that the author had earned a law degree, attended Magdalen College/Oxford, was a Rhodes Scholar, had been awarded Pulitzer Prizes, or had graduated from Harvard University. Two student groups specifically referred to the author’s work as a journalist in Asia as contributing to his authority. Of 94 groups, most reported finding information about the author from Wikipedia’s entry about him (60, or 64.5 percent). Some checked The New York Times website for his biography (18, or 19.4 percent), and a relatively small number referred to both websites (5, or 5.4 percent). The remaining groups claimed to find author information from Google or from other sources, such as the website for the Public Broadcasting System (PBS).

The “why” criterion was made up of five questions to help determine the author’s primary purpose for writing, one of which asked students to provide a quote from the document as justification for their choice. Most groups decided that the author’s main purpose was to convince readers of something (as befits a column), but one quarter of groups indicated that the author’s purpose was to inform readers. A majority agreed that the author’s point of view was interested and opinionated, and thought that he favored emotional language (table 1). Over 90 percent of groups (91 of 98) correctly identified the author’s main audience as “the general public,” while 7.1 percent thought his main audience was “an educated audience interested in a specific topic (i.e., a marketing professional


Table 1. Student Responses to Questions in the “Why” Criterion

Question: What Was the Author’s . . . | Correct Responses: An Opinion Piece | Incorrect Responses: A Non-Opinion Piece
Main Purpose? (n = 99)                | Convince Readers: 70 (70.7%)        | Inform Readers: 25 (25.3%); Other: 4 (4.0%)
Point of View? (n = 97)               | Opinionated: 87 (89.7%)             | Objective: 10 (10.3%)
Language? (n = 98)                    | Emotional: 72 (73.5%)               | Factual: 26 (26.5%)

addressing others in the marketing field).”

Though the “when” questions were fairly straightforward—all but 4 of 96 respondent groups correctly identified the publication date—students consistently demonstrated difficulty in identifying when the “event or research being discussed in the document occurred.” Of 95 short answer responses, fewer than half (43, or 45.3 percent) referred in some way to the 2011 earthquake, tsunami, or Fukushima nuclear radiation leaks that were the impetus for the columnist’s writing. The majority of the remaining 52 groups identified the Japanese earthquakes in 1923 and 1995 to which the columnist referred but failed to identify a connection to more recent natural disasters.

The subsequent “where” criterion focused on the publication in which the document appeared. Of 95 responding groups, all stated that the document was published in The New York Times, except for 2 who referred to the publication as “The Sunday Opinion” and 6 others who referred to it as “The New York Times Sunday Opinion.” It is unclear if those six understood this was the newspaper’s opinion section, or if they incorrectly believed it was a publication distinct from The New York Times. Of the 94 groups that identified the type of publication, 91 groups (96.8 percent) described it as a “newspaper,” with the remaining groups identifying the publication as an academic or scholarly journal, a magazine, or a website. Another question asked students to provide contact information for the author and/or publication. Most groups (72 of 79, or 91.1 percent) provided the newspaper’s phone number or address, or stated that a message could be sent to either the author or The New York Times company via email, Facebook, Twitter, or GooglePlus. Seven groups (8.9 percent) were unable to locate any contact information.

Of all the Five Ws criteria, the questions relating to “how” Kristof gathered and presented information received the fewest responses. One question asked if and how the author cited outside sources (the column included one quote attributed to a Japanese shop owner). Of 82 submitted responses, 1 group stated that references were cited throughout the document in a scholarly style; 16 stated that references were cited throughout the document in a popular style (19.5 percent), i.e., there were in-text quotes and attributions but no bibliography at the end of the document; and 65 stated that references were not listed (79.3 percent).

Figure 2. Student Responses to How the Author Gathered Data (N = 63)

When asked how the author gathered data to reach his conclusions, a question to which multiple answers were permitted and 63 groups responded, over half of student groups (57.1 percent) inaccurately claimed that the author gathered data from a research study he conducted. Several groups (22, or 34.9 percent) opted to write in additional answers. Of these, one quarter of all respondents (16 of 63) stated that the author gathered data from his personal experience (figure 2).

The final question in the “how” category asked students to identify the document’s elements or component parts (i.e., how the information was presented). Almost 34 percent of groups incorrectly stated that the document contained an abstract and almost 18 percent stated that it contained a methodology (figure 3). It should be noted that the text of this question provided a link to “What is an abstract?” next to the word “abstract,” and “What is a methodology?” next to the word “methodology.” Both links took students to definitions of these terms from a website at George Mason University.34

In the concluding questions of the formative in-class assessment, students were asked (1) if the document was scholarly or popular, (2) to list the strengths and weaknesses of the document, and (3) whether they would use it as a source in a college paper. Of 74 groups, 6 stated that the document was scholarly (8.1 percent). Justifications for why it was scholarly included that it was “written by a graduate of Harvard” or “written by a Rhodes Scholar,” or because it “uses facts” or “has facts in it.” Of these 6 groups, 5 also stated that the article was popular (the survey did not limit respondents to one answer only). Of the groups who stated it was popular (73, or 98.6 percent), their justifications included that the document was published in a newspaper (38, or 52.1 percent), appealed to or was written for the public or used nontechnical language/no jargon (29, or 39.7 percent), included or was mostly opinion (17, or 23.3 percent), or that the author did not cite sources (9, or 12.3 percent). Groups provided one or

volume 53, issue 4 | Summer 2014 339


Feature
• “Yes [because it is] from very credible newspaper and a
well-respected writer.”
• “If I needed the opinion of an American familiar with
Japanese culture and living there I would use Kristof as
a reputable source.”

Of the 55 student groups responding to this question,


37 (67.3 percent) provided what the researchers considered
a reasonable justification for their decision to cite or not cite
the document in a college paper. A total of 27 (49.1 percent)
provided particularly strong or compelling justifications, of
Figure 3. Student Responses to Components of the Document which the four quotations above are indicative.
(N = 56) There was a great degree of similarity between student
responses in both fall 2011 and fall 2012. Comparisons are
more of these explanations in 28.8 percent of cases. provided in table 2, which highlights select questions in each
Student groups listed strengths of the document in a of the Five Ws criteria. Between semesters, one of the biggest
write-in text box (n = 63). Researchers coded responses by as- differences was in responses to how the author presented
signing them to the appropriate Five Ws criteria. Respondents information, including which particular elements the docu-
attributed the document’s strengths to the credentials of the ment contained. This difference may have resulted from the
author (“who,” 35, or 55.6 percent), the positive reputation inclusion of links to definitions of component terminology
of the publication in which it appeared (“where,” 17, or 27.0 (e.g., “What is a methodology?”) in the 2012 assessment,
percent), or that the author included examples from personal which were not included in the 2011 pilot.
experiences (“how,” 14, or 22.2 percent). A total of 27.0
percent of groups provided more than one of these answers.
An additional 17 groups (27.0 percent) provided unclear or Summative Assessment: Follow-up Survey
incomplete responses in describing strengths. After the instruction sessions, a summative assessment mea-
In identifying weaknesses of the document (n = 53), also sured student recall and application of the Five Ws. Though
in a write-in text box, most student groups responded that eleven composition instructors were asked to forward to their
a weakness was in “how” the author gathered his informa- students an invitation to participate in the survey, responses
tion or cited his sources. Student groups wrote that the lack indicate that only nine instructors distributed the invitations
of citations was a weakness (16, or 30.2 percent), the lack to students. Based on this assumption, fifteen sections of
of views other than the author’s was a weakness (5, or 9.4 English 101 and 118, or approximately 345 students, would
percent), or simply wrote that “how” was a weakness with have received an invitation to participate. Of the 55 student
no further explanation (6, or 11.3 percent). Adding these responses received, 53 were usable, making the response rate
responses together, 50.9 percent of student groups identified 15.4 percent when calculated out of fifteen sections (or 13.6
some element of “how” as a weakness of the document. The percent if calculated out of seventeen sections with eleven
bias or opinion in the document was another characteristic instructors).
commonly listed as a weakness (22, or 41.5 percent), which The survey’s twelve questions included several that as-
related to both the “what” criteria (whether the document was sessed student recall of the evaluation method. Among 51
opinion-based or fact-based) and “why” (author’s purpose). respondents, 25 stated that they recalled the method or
One group referred to the source as a weakness because the technique of evaluating sources that was taught in the library
document was not published in a scholarly journal, and three session (49.0 percent). Of these, 3 students identified the
groups (5.7 percent) stated that the “why” was a weakness Five Ws method by name (12.0 percent), 2 indicated using
without providing further explication. A total of 15.1 percent more than one of the Five Ws (e.g., a student wrote that “We
of groups listed more than one of these criteria as weaknesses. looked at the author’s credibility, the style of the article, what
The ultimate question asked groups, “Thinking about the type of article it was, etc.”), and 2 more recalled researching
Five Ws of your source, would you cite this source in a paper? an author to evaluate authority. In total, 7 of the 25 respon-
Why or why not? Might your answer depend on the type of dents who claimed to recall the method were able to recall
paper you’re writing? How so?” Researchers coded responses (in spirit, if not in letter) at least one of the Five Ws criteria
by whether or not the respondents provided a reasonable (28.0 percent).35
justification for their answer. Such rationale included The survey also asked students about their method of
evaluating sources after the library session. Of the 53 re-
• “Yes if the paper was for persuasion. No if it was an in- spondents, 45 stated they had evaluated the credibility and
formative paper.” authority of sources they cited in at least one paper completed
• “Wouldn’t site [sic] it as evidence, but could use it to dem- in the semester (84.9 percent). Of the 44 respondents who
onstrate an opinion.” described their evaluation techniques, nearly three quarters

340 Reference & User Services Quarterly



Table 2. Select Responses to the Five Ws Criteria: Comparison between Fall 2011 Pilot Project and Fall 2012 Study
Criteria Fall 2011 Pilot Fall 2012
What: Type of Document n = 125 n = 97
Popular Article 64.8% 85.6%
Editorial 29.6% 4.1%
Column* (Correct Answer) 0.8% 9.3%
Why: Author’s Purpose n = 117 n = 99
To Convince (Correct Answer) 57.3% 70.7%
To Inform 35.9% 25.3%
When: Occurrence that Precipitated Publication n = 88 n = 95
2011 Events in Japan (Correct Answer) 40.9% 45.3%
Where: Publication Type n = 116 n = 94
Newspaper (Correct Answer) 94.8% 96.8%
How: Author’s Method of Gathering Data n = 92 n = 63
Author’s Research Study 52.2% 57.1%
Variety of Outside Sources 35.9% 42.9%
Interviewed Similar People 20.7% 27.0%
Interviewed Variety of People 22.8% 25.4%
Personal Experience (Write-In; Correct Answer) 27.2% 25.4%
How: Author’s Presentation of Information** n = 81 n = 56
Abstract 33.3% 33.9%
Bibliography 12.3% 1.8%
Methodology 44.4% 17.9%
Designs/Illustrations/Cartoons 9.9% 5.4%
Eye-Catching Fonts (Correct Answer) 11.1% 50.0%

*The option of “column” was not one of the multiple choice options offered in the pilot assessment.
**Links to definitions for “abstract” and “methodology” were not provided in the pilot assessment. Links to definitions for these words
were included in the fall 2012 assessment.

described evaluating sources using at least one of the Five Ws criteria. Just over 18 percent recalled two or more of the Five Ws (table 3).

After combining and de-duplicating responses to related questions that asked about recall of the library-taught method and the method of evaluation students actually used, a total of 66.0 percent of all respondents recalled and/or applied at least one of the Five Ws criteria after the session (table 4). The "who," or authority criterion, was "stickiest"; those students who recalled or applied only one of the Five Ws most often described evaluating the author. Approximately 20 percent of students recalled or applied more than one of the Five Ws evaluation criteria, with 7.5 percent of all respondents referring to the Five Ws method by name.

The response rate of the fall 2011 pilot summative assessment was too low (5.1 percent) to justify any in-depth comparisons. It may still be of interest to report that responses from the pilot study were similar to those from fall 2012. Of the fifteen completed surveys, nine students (60.0 percent) recalled and/or applied at least one of the Five Ws criteria an average of three weeks after the Five Ws library instruction session.

Instructor Survey

Eleven instructors were sent a follow-up survey after the library session in fall 2012. Six instructors completed the survey for a response rate of 55 percent. All respondents thought the Five Ws had value for their students. One instructor reported the Five Ws method to be a "quick, efficient, and easy-to-remember tool to help students evaluate a source." Another stated, "I like that it reminded them of 'the W's' they learned in high school (several, I noticed, expressed recognition), while moving them forward into new territory/information."

Instructors were also asked if they might use the Five Ws method of evaluation in their own instruction. Four of six stated that, at the time of the study, they had already incorporated some form of the Five Ws method into their teaching (table 5). Five reported that they intended to utilize the method in the future, and one respondent was unsure about




Table 3. Techniques Students Used to Evaluate Sources: Application of the Five Ws
Evaluation Method Respondents (N = 44)
The Five Ws Exactly 2 (4.5%)
Author (Who) Only 21 (47.7%)
Publication (Where) Only 2 (4.5%)
Author's Purpose (Why) Only 1 (2.2%)
2–4 Ws 6 (13.6%)
At Least 1 W 32 (72.7%)

Table 4. Combined Responses, Recall, and/or Application of the Five Ws Evaluation Method
Evaluation Method Respondents (N = 53)
The Five Ws Exactly 4 (7.5%)
Author (Who) Only 21 (39.6%)
Publication (Where) Only 2 (3.8%)
Author's Purpose (Why) Only 1 (1.9%)
2–4 Ws 7 (13.2%)
At Least 1 W 35 (66.0%)

future use. When asked how they might include the method in their classes in the future, one instructor wrote that they would repeat the activity in another class meeting but may also consider adding it as a homework assignment. Another wrote, "I have already been using it in 102, but will begin stressing it in 101 as soon as we begin talking about research for the source-based paper." These instructors' responses were echoes of the positive responses reported in the fall 2011 pilot project, in which six out of six instructors reported that the Five Ws was valuable for their students, and four of six were considering using the method in their own instruction.

Notably, students who identified as being enrolled in a course in which their instructor had used the Five Ws performed better in recalling and/or applying the Five Ws than those students in a course in which the instructor did not use the Five Ws outside of the library session, or in a course in which the instructors' use of the Five Ws was unknown.36 In sections in which course instructors were known to have used the Five Ws, over half of students self-reported that they recalled the evaluation method taught in the library class (19, or 52.8 percent of 36 respondents). In sections in which the Five Ws were not referred to during regular class times, 40.0 percent of students reported recalling the method (6 of 15 respondents). When asked to explain this library-taught method, 31.6 percent of students recalled at least one of the Five Ws criteria when they were in a section in which the instructor used the Five Ws, as opposed to 16.7 percent of those enrolled in sections in which the instructor did not/was not known to reinforce the Five Ws (table 6).

Additionally, when students were asked if they had evaluated sources that semester, 84.2 percent of students in sections that used the Five Ws outside of the library session stated that they evaluated their sources (32 of 38). Similarly, 80.0 percent of students in sections who did not use the Five Ws outside of the library session stated that they evaluated their sources (12 of 15). Yet, when asked how they evaluated sources, 78.1 percent of students in courses in which the Five Ws were used outside of the library session applied at least one of the Five Ws, while 58.3 percent of students in sections in which the Five Ws were not used outside of the library session did the same (table 7). After combining both recall and application responses, 65.8 percent of those with repeated exposure to the method recalled and/or applied aspects of the Five Ws evaluation, and 46.7 percent of students enrolled in sections in which the Five Ws were not used outside of the library class were able to do so.

Discussion

In assigning the initial in-class, formative assessment the researchers had three intended goals: (1) to introduce students to a systematic information evaluation method that would serve as an instructional scaffold to develop evaluation skills, (2) to measure how many students could accurately characterize features of a given source (for example, determining that a given source was opinionated, popular, and written by a credible author), and (3) to examine if students would be able to present a reasonable argument about why they would or would not cite an opinionated, popular source in a college paper, and if they would use criteria from the library method in their rationales.

On the first point, the use of the Five Ws as an instructional scaffold was successful. Students asked very few questions about the Five Ws method or how to use it. While no formal assessment measured student familiarity with the Five Ws before the library session, more than three quarters of students in each section confirmed by vocal agreement, a head nod, or raised hand that they had heard of the Five Ws before the library session. Because very few students had questions about the evaluation method itself, the scaffold was helpful in using class time efficiently. Most student groups (82, or 82.8 percent) completed at least three-quarters of the activity during class time, and 55 out of 99 student groups (55.6 percent) completed the entire in-class activity.

The effectiveness of the Five Ws as a scaffold was also supported by the summative assessment results. Students in sections where the Five Ws method was reiterated after the library session were better at recalling and applying the evaluation method than those exposed to the Five Ws only once (65.8% versus 46.5%). Scaffolds are tools put in place temporarily to help students master a skill, and learners may need to use a scaffold for some time before they develop or internalize the steps involved in a particular skill. Those students who used the Five Ws method in a class setting more than once were able to apply the skills of source evaluation more




consistently than students who used the scaffold only once in a library instruction setting, suggesting that the former group had made more progress in internalizing these skills.

Table 5. Instructors' Use of the Five Ws: How They Used It Outside of the Library Session
"Yes, in our next class meeting after the session we reviewed the five Ws as a tool for source evaluation."
"I have referenced it in class discussion and particularly in one-on-one conferences."
"Modified: I had my student evaluate sources by doing in-class research on a few of the W's, like 'who,' 'what,' and 'where,' though I didn't call it 'the three Ws,' or anything."
"I do, but not as overtly. I incorporate it into our discussion about the readings as we go—the types of questions I ask them are shaped by the five Ws."

Regarding source evaluation, students were, overall, successful in determining the opinionated and popular nature of the source—nearly 90 percent of student groups described the piece as being written from an opinionated point of view and 70.7 percent of student groups stated that the author's main purpose in writing was to persuade readers to agree with his opinion (table 1). Students accurately determined that the author had multiple degrees, first-hand knowledge of the topic, and was highly regarded by other journalists and authors. In fact, this particular author's credentials made such an impression on students that, if a respondent recalled only one of the Five Ws on the follow-up summative assessment, it was most often the importance of evaluating the author (the "who" criterion) of a document. In addition, the vast majority of student groups correctly identified the document as having been written for the general public (92.9 percent) and published in a popular newspaper (96.8 percent).

Most students performed well in evaluating the "who" (authority of the author), "where" (credibility of the publication), and "why" (author's purpose) characteristics of the document. Because of the emphasis on rhetorical analysis in composition courses, some students may have been attuned to analyzing the tone, language, and purpose of an author. As a result, the number of correct responses to questions about authorial tone and intent is, perhaps, unsurprising. However, the remaining Five Ws and one H—the "what" (document type), "how" (gathering and presentation of data), and "when" (recency/currency/timely impetus for publication)—were criteria with which many, if not most, students struggled.

Of these three criteria, the formative assessment results associated with the "what" and "how" criteria provide insights into what students do not know. These gaps in student knowledge might be classified, in general, as a lack of awareness of publication jargon and processes. In particular, this manifested in students' inability to recognize either types of documents or types of authors. For instance, the majority of student groups (85.6 percent) claimed that the document was a popular article and not a column. This mistake persisted despite the author's identification as a columnist in his biographical statements and student descriptions of the author's opinionated perspective. Furthermore, in the library session wrap-up when groups presented their Five Ws answers to the class, most students were unable to describe to the researchers the differences between a column and an article, or between a columnist and a journalist. Although this provided the researchers with a built-in teachable moment during the session, it also indicates that, though students are capable of identifying an opinion piece when they read it, they do not recognize terminology typically associated with such pieces.

Additionally, student responses demonstrate a lack of understanding of elements included in scholarly publications, as well as ignorance of the jargon used to explain scholarly authors' research processes. The researchers were, frankly, surprised at the number of student groups claiming that the newspaper column included an abstract and a methodology. In the pilot project, 44.4 percent of student groups claimed the column included a methodology and 33.3 percent of student groups claimed it included an abstract. After the pilot project, the researchers added links to definitions of "abstract" and "methodology" in the assessment, and this reduced the number of groups claiming the column included a methodology (17.9 percent) but had little to no impact on the number of student groups claiming that an abstract was presented (33.9 percent). Students were also asked about how the author gathered data for his argument. Most students claimed that the author conducted a research study (52.2 percent in the fall 2011 pilot, 57.1 percent in fall 2012). Approximately one quarter of students in either semester accurately stated that the author gathered data from personal experience.

While the responses to these "how" questions had the fewest number of respondents (probably because time constraints kept some students from reaching these penultimate questions), the findings are notable because they indicate that many students are unfamiliar with scholarly article components. Although the time-constrained library session did not include specific instruction on defining or identifying the parts of scholarly documents (including abstracts and methodologies), students were asked whether the author included such components. They were also directed to ask the library instructor if they had any questions, and the researchers were readily available to provide any necessary assistance. Additionally, students had access to both Internet search engines and links to definitions of these terms. Yet, for the most part, students did not ask for help defining these terms. This is further evidence that students can identify a popular piece in aggregate, but they are largely unaware of the defining characteristics and categories of publications that help knowledgeable readers distinguish between document types and authorial processes.

These findings point to an illiteracy that is important to address. Lower division undergraduate students can clearly distinguish between a scholarly document and a popular one with little instruction. What students are missing, however, is an awareness of distinctions among the processes by




Table 6. Comparison of Student Recall of the Five Ws
Students Who Recalled Learning to Evaluate . . . / Enrolled in Sections in which Instructors Used the Five Ws Outside of the Library Session (n = 19) / Enrolled in Sections in which Instructors Did Not Use the Five Ws Outside of the Library Session, or Instructors' Use of the Five Ws Is Unknown (n = 6)
Five Ws Exactly 2 (10.5%) 1 (16.7%)
Author (Who) Only 2 (10.5%) 0 (0.0%)
2–4 Ws 2 (10.5%) 0 (0.0%)
At Least 1 W 6 (31.6%) 1 (16.7%)

Table 7. Comparison of Student Application of the Five Ws

Students Who Explained Evaluating their Sources by Using . . . / Enrolled in Sections in which Instructors Reiterated the Five Ws Outside of the Library Session (n = 32) / Enrolled in Sections in which Instructors Did Not Use the Five Ws Outside of the Library Session, or Instructors' Use of the Five Ws Was Unknown (n = 12)
The Five Ws Exactly 0 (0.0%) 2 (16.7%)
Author (Who) Only 19 (59.4%) 2 (16.7%)
Publication (Where) Only 1 (3.1%) 1 (8.3%)
Author’s Purpose (Why) Only 0 (0.0%) 1 (8.3%)
2–4 Ws 5 (15.6%) 1 (8.3%)
At Least 1 W 25 (78.1%) 7 (58.3%)

which columnists, journalists, and researchers arrive at their conclusions and an ability to correctly classify or label opinion pieces from factual ones. This is of particular concern in terms of scholarly publications, such as The Lancet, in which letters to the editor often include citations and refer to the letter writers' employment at universities or other research institutes. To a new student, such a letter could easily look like a scholarly research article as opposed to criticism of another researcher's study. Without knowledge of publishing jargon and processes, students may find criticism and opinion pieces, such as book reviews and letters to the editor, indistinguishable from their research-based counterparts in a list of database search results. These critically important abilities were underdeveloped in these first-year students who were at the end of their first semester at the university, and these findings were consistent over a two-semester period.

The other challenging criterion, the "when" questions, proved difficult for students for two reasons. First, one "when" question asked students whether they needed to cite something recently published for their assignment or if a historical piece was suitable for their topic. Because students were not reviewing this document in connection with a particular research assignment, the question was irrelevant and confusing in this context. Second, and more significant, were student difficulties regarding when the events discussed in the document occurred. Though published in March 2011, the same month in which the Tōhoku earthquake and tsunami and Fukushima nuclear disaster struck Japan, the columnist did not explicitly state that his writing was prompted by those disasters. At the time of publication, most readers would have been bombarded with media reports covering those terrible events, but these students were evaluating the document eighteen months after the events (six months after for the pilot group), and they approached the "when" at face value, providing only the dates of earthquakes that were specifically referenced in the column (1995 and 1923). Less than half of student groups approached the question from the angle of a past current event; only 45.3 percent made a connection between the March 2011 disasters in Japan and the March 2011 column. The value of situating a publication in its appropriate context was a discussion point at the end of the library session, after students had submitted their responses via SurveyMonkey and presented their group's findings to the class.

The third and final purpose of the formative assessment was to examine student arguments for why they would, or would not, cite the opinionated, popular source in a college paper. Following examination of the document using the six criteria, the activity concluded by asking students to articulate their overall impressions of the document, both verbally and in the written assessment. These reflections were valuable for both students and researchers in that they not only prompted students to consider the document holistically and for a definitive purpose, but also provided researchers a glimpse into students' decision-making processes. For example, several groups thought that the author was a scholar, but because he was published in a newspaper, addressed a



Teaching Information Evaluation with the Five Ws

popular audience, and provided his opinion, the document was more of a popular source than a scholarly one. Thus, most groups (98.6 percent) capably weighed multiple criteria to accurately describe the popular nature of the document.

This holistic processing was again demonstrated in the final question, in which students were asked to state definitively whether or not they would cite the document in a college paper. The researchers deliberately left this as a "judgment call" to see if responses would speak to their abilities as first-year students to navigate the complexities of the evaluation process. There was no right or wrong answer for whether the source was worthy of citation in a college paper, and some students may have been influenced by the fact that citing popular sources was permitted in their composition assignment. Just over 67 percent of respondents provided a reasonable explanation for their decisions, often referring to the author's credentials or the expression of opinion in the document as reasons for why they would, or would not, cite the material. Nearly half of all respondents provided a comparatively well-synthesized or nuanced justification.

As a result, the Five Ws can be considered an effective instructional scaffold and evaluation method. Results indicate that students were familiar with the Five Ws before attending library instruction, were able to apply it successfully during class, and that instructors unanimously found value in the method. However, one limitation of this study is its lack of comparisons among different types of evaluation methods. At this point, researchers are unable to determine whether the Five Ws is more or less effective or memorable than alternative methods, such as CRITIC. It is also unknown if composition instructors would have preferred or valued a different evaluation method over the Five Ws, though it should be noted that no instructor offered an alternative approach.

Regarding recall, the memorability and application of the Five Ws method was less successful than researchers originally hoped. Summative assessment results demonstrate that few students (7.5 percent) recalled the Five Ws evaluation method by name, indicating that the method is not overly "catchy" as a mnemonic device. In describing their own evaluation processes, however, most student responses (66.0 percent) suggest an internalization of some aspects of the Five Ws activity. After the library session, more than half of students (60.3 percent) reported researching the backgrounds of the authors they cited in papers. The 13.2 percent of students who considered more than one aspect of a source (but not all Five Ws) most often evaluated the reputation and reliability of both the author (who) and the publication (where). Consequently, while students may not have replicated the method in its entirety, many applied aspects of the Five Ws and understood it as part of an evaluation process.

The findings of the summative assessment also point to the value of collaborating with course instructors. The impact of library instruction beyond the one-shot session was enhanced by creating an assessment/activity that served as a skills development scaffold needed for established assignments. At the conclusion of the study, one composition instructor requested a copy of the Five Ws activity to use in her class, with several additional instructors indicating their intentions to incorporate aspects of the activity in future classes. Due to the demonstrated increase in memorability and application among students who received additional instruction on the method outside of the library, the researchers plan to utilize this collaborative scaffold approach as a model for other library instruction. Activities based on this model will be similar to the formative Five Ws assessment in their value to instructors and students, repeatability in a later nonlibrary class session, and ease of incorporation into an existing assignment.

Conclusion

To effectively prepare students for a lifetime of learning, it is essential that information literacy instruction sessions develop skills, such as source evaluation, that transfer beyond classroom walls. Although navigating the complexities of the modern search experience in a one-shot session can be difficult, learners are increasingly likely to encounter information sources, such as online journals, that defy traditional relationships between content and means of access; gone are the days when scholarly output was more likely to be found in print sources than online. As a result, using an evaluation method that works well with any source, regardless of means of access and retrieval, is vital. The information evaluation methods published in the library literature over the past decade tend to privilege distinctions in access—online versus print—over distinctions in document types (e.g., articles versus editorials). In doing so, these methods fail to emphasize the unique characteristics that well-informed readers use to distinguish between information sources, such as the inclusion of a research methodology or the author's affiliation.

In this study, the Five Ws was introduced as a means of evaluating sources found regardless of format or mode of access. The researchers were primarily concerned with testing the memorability of the Five Ws evaluation method, instructors' perceived value of that method, and its effectiveness as a scaffold. Although less than 10% of students recalled the Five Ws in its entirety, a majority used salient evaluation points from the method in their own research later in the semester. In addition, most instructors found value in the method and, in those classes where instructors reiterated the method outside of the library session, student retention and use of the Five Ws increased. These results suggest that instructional scaffolding is an effective way to overcome some of the many limitations of one-shot library instruction—including time restrictions and an abundance of learning outcomes to address—by integrating library instruction into course-level instruction through an information literacy activity based on a concept familiar to students, and easily incorporated into course instruction and assignments. Though these results are promising, still more research is needed in the application of instructional scaffolding to library instruction and in

volume 53, issue 4 | Summer 2014 345


collaborating with academic departments to teach the type of information evaluation students most need in the current information environment.
Although this study was primarily concerned with testing the memorability and value of the Five Ws, several unanticipated findings were discovered via the formative assessment. These include that the majority of students lacked the background knowledge necessary to differentiate among the information-gathering techniques of various types of authors (e.g., journalists versus researchers). Additionally, students' difficulties in explaining differences between scholarly articles, popular articles, and columns may have resulted from their lack of familiarity with the jargon and function of publication components, such as abstracts. As a formative assessment, the Five Ws activity did not address such gaps in knowledge, but instead identified their existence among students who were close to completing their first semester at a university.
For many lower division undergraduate students, general education courses punctuate the first two years of their college careers. Introducing evaluation and research skills in these general, interdisciplinary, required courses may help to equip students with the critical thinking skills needed to succeed in advanced and specialized courses. Undergraduates who are able to acquire and internalize skills for evaluating information at both the source and document level may be more prepared for upper division courses, in which evaluation becomes deeper, involving the comparing and contrasting of methodologies and scholarly findings within a particular field. The findings from this study suggest a need for increased attention to developing these skills in general education courses. In particular, there is a need to ensure that students look at documents not only from a narrow, disciplinary view, but also contextually, in an attempt to understand the greater forces at play in their creation. There is lifelong value in ensuring that students not only summarize competently, but also analyze competently; that they not only understand what they read, but also recognize that there are people and processes involved in creating what they read. If that is indeed the case, there may also be value in assessing students' knowledge of publication processes prior to entry into upper division courses.
Despite the low recall of the Five Ws in its entirety or by name, the overall effectiveness of the method has led researchers to continue to use, revise, and improve the Five Ws formative assessment for English 101/118 instruction sessions. Researchers have also adjusted the activity for use with high school library instruction sessions and plan to adapt the method for use in an online tutorial. In the near future, the researchers plan to share results of this study with the First-Year Composition department in an effort to support lower division undergraduate student learning. In the long term, the researchers hope that these findings will encourage more studies on information evaluation instruction and the role it might play in the development of information literate citizens and scholars.

References and Notes
1. J. Patrick Biddix, Chung Joo Chung, and Han Woo Park, "Convenience or Credibility? A Study of College Student Online Research Behaviors," Internet & Higher Education 14, no. 3 (2011): 175–82; Lea Currie et al., "Undergraduate Search Strategies and Evaluation Criteria: Searching for Credible Sources," New Library World 111, no. 3/4 (2010): 113–24.
2. Association of College and Research Libraries, Information Literacy Competency Standards for Higher Education (Chicago: ACRL, 2001), accessed July 18, 2013, www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/standards.pdf.
3. Ibid.
4. Mikael Laakso et al., "The Development of Open Access Journal Publishing from 1993–2009," PLoS ONE 6, no. 6 (June 13, 2011), accessed July 24, 2013, www.plosone.org/article/info:doi/10.1371/journal.pone.0020961.
5. Stuart Hampton-Reeves et al., Students' Use of Research Content in Teaching and Learning, Report for the Joint Information Systems Council (University of Central Lancashire: Center for Research-Informed Teaching, 2009), accessed July 15, 2013, www.jisc.ac.uk/media/documents/aboutus/workinggroups/studentsuseresearchcontent.pdf.
6. Ibid., 26.
7. Ibid., I, 47.
8. Biddix, Chung, and Park, "Convenience or Credibility?" 180.
9. Sarah Blakeslee, "The CRAAP Test," LOEX Quarterly 31, no. 3 (2004): 6, accessed July 24, 2013, http://commons.emich.edu/cgi/viewcontent.cgi?article=1009&context=loexquarterly.
10. Meriam Library, California State University, Chico, "Evaluating Information—Applying the CRAAP Test," September 17, 2010, accessed July 18, 2013, www.csuchico.edu/lins/handouts/eval_websites.pdf; Andrew B. Pachtman, "Developing Critical Thinking for the Internet," Research & Teaching in Developmental Education 29, no. 1 (2012): 39–47.
11. Brad Matthies and Jonathan Helmke, "Using the CRITIC Acronym to Teach Information Evaluation," in Library Instruction: Restating the Need, Refocusing the Response: Papers and Session Materials Presented at the Thirty-Second National LOEX Library Instruction Conference Held in Ypsilanti, Michigan 6 to 8 May 2004, ed. D. B. Thomas, Randal Baier, Eric Owen, and Theresa Valko, 65–70 (Ann Arbor, MI: Pierian Press, 2005), accessed July 25, 2013, http://works.bepress.com/brad_matthies/28.
12. Wayne R. Bartz, "Teaching Skepticism via the CRITIC Acronym and the Skeptical Inquirer," Skeptical Inquirer 26 (September 2002): 42–44.
13. Sara Seely, Sara Fry, and Margie Ruppel, "Information Literacy Follow-Through: Enhancing Pre-Service Teachers' Information Evaluation Skills Through Formative Assessment," Behavioral & Social Sciences Librarian 30, no. 2 (2012): 72–84.
14. Ibid., 78.
15. Ibid., 83.
16. Marc Meola, "Chucking the Checklist: A Contextual Approach to Teaching Undergraduates Web-Site Evaluation," portal: Libraries and the Academy 4, no. 3 (2004): 331–44.
17. Ibid., 336.
18. Ibid., 337.
19. Melissa Bowles-Terry, Erin Davis, and Wendy Holliday, "'Writing Information Literacy' Revisited: Application of Theory to Practice in the Classroom," Reference & User Services Quarterly 49, no. 3 (2010): 225–30.
20. Ibid., 229.
21. Ibid., 230.
22. Ibid., 226.
23. Lev Semyonovich Vygotsky, "Interaction Between Learning and Development," in Mind and Society: The Development of Higher Psychological Process, ed. Michael Cole, Vera John-Steiner, Sylvia

346 Reference & User Services Quarterly



Scribner, and Ellen Souberman (Cambridge, MA: Harvard University Press, 1978), 79–91.
24. Ibid.
25. David Wood, Jerome S. Bruner, and Gail Ross, "The Role of Tutoring in Problem Solving," Journal of Child Psychology & Psychiatry 17 (1974): 89–100.
26. Jerome S. Bruner, "The Ontogenesis of Speech Acts," Journal of Child Language 2, no. 1 (1975): 1–19.
27. Vygotsky, "Interaction Between Learning and Development"; Wood, Bruner, and Ross, "The Role of Tutoring in Problem Solving."
28. Bruner, "The Ontogenesis of Speech Acts"; Wood, Bruner, and Ross, "The Role of Tutoring in Problem Solving."
29. Bartz, "Teaching Skepticism via the CRITIC Acronym and the Skeptical Inquirer."
30. Nicholas Kristof, "The Japanese Could Teach Us a Thing or Two," New York Times, March 19, 2011, accessed July 29, 2013, www.nytimes.com/2011/03/20/opinion/20kristof.html.
31. During the pilot study, students attempted the in-class Five Ws activity with one of three separate documents: a report from the World Health Organization (WHO), a scholarly article from a geography journal, and the aforementioned newspaper page that included Kristof's column. These documents were assigned randomly to groups and provided researchers an opportunity to observe student experiences evaluating different document types. Most students were able to identify the scholarly article right away. The unambiguous nature of the document presented little challenge in terms of conducting a nuanced evaluation and, as such, was of minimal value to students. The WHO report led to some confusion and difficulty (e.g., finding information about the authors of the report), and first-year composition students who became "stuck" on a question were unable to complete the assessment in the time allotted. The column, on the other hand, presented an appropriate balance of difficulty and accessibility. The material was familiar in that most students easily identified the New York Times as a newspaper, but Kristof was unfamiliar to most of them, and his academic achievements helped students question their assumptions about scholarly versus popular authors.
32. Though the exact number of student participants is unknown, the pilot group consisted of 30 first-year composition sections, including eight English 118 sections and 22 sections of English 101. In fall 2011, each English 101 section was capped at 23 students and each English 118 section was capped at 22 students.
33. In 2012, both English 101 and English 118 sections were capped at 23 students, and researchers taught 17 101/118 sections in which the Five Ws learning activity was used.
34. Jennifer Morse et al., "A Guide to Writing in the Biological Sciences: The Scientific Paper: Abstract," George Mason University Department of Biology, accessed July 29, 2013, http://wac.gmu.edu/supporting/guides/BIOL/Abstract.htm; Jennifer Morse et al., "A Guide to Writing in the Sciences: The Scientific Paper: Methods," George Mason University Department of Biology, accessed July 29, 2013, http://wac.gmu.edu/supporting/guides/BIOL/Methods.htm.
35. It should be noted that even if a student did not use a Five Ws term to describe their evaluation method (e.g., a student did not say "I evaluated 'who' wrote the document"), as long as a student's comments and explanations clearly referred to a Five Ws criterion, the comment was coded for the corresponding criterion. For example, one student's response to how he or she evaluated a source was, "I researched their degree level, literary accomplishments, and involvement in the field I was writing in." This response was coded as an application of the "who," or author, criterion.
36. The four instructors who used the Five Ws in some way in their own instruction outside of the library session taught eight of the fifteen sections whose students participated in the student survey. Two instructors who participated in the instructors' survey did not use the Five Ws in their own instruction and taught three sections of 101/118. Three instructors did not participate in the follow-up survey, but their students participated in the student survey. These three instructors taught four sections of 101/118, and their use of the Five Ws outside of the library session remains unknown.

