

Clickers in the Classroom: An Active Learning Approach

Further research will determine whether clickers complement or surpass other active learning approaches in improving learning outcomes.

By Margie Martyn

Current research describes the benefits of active learning approaches. Clickers, or student response systems, are a technology used to promote active learning. Most research on the benefits of using clickers in the classroom has shown that students become engaged and enjoy using them. However, research on learning outcomes has only compared the use of clickers to traditional lecture methods. Although learning outcomes are higher when using clickers, the question is whether the clickers or the active learning pedagogies are the cause. For this reason, I conducted a study that compared learning outcomes resulting from the use of clickers versus another active learning method: class discussion. Even though both techniques employ active learning, would using clickers increase learning outcomes more than another active learning approach? Two key features distinguish clicker use:

- Clickers provide a mechanism for students to participate anonymously.
- Clickers integrate a game approach that may engage students more than traditional class discussion.

The study also investigated students' perceptions of their learning using clickers versus classroom discussion.

Active Learning
The benefits of active learning are
widely acclaimed in higher education.
According to Guthrie and Carlin,1
modern students are primarily active
learners, and lecture courses may be

increasingly out of touch with how students engage their world. Chickering and Gamson,2 early proponents of active learning, designated "encourage active learning" as one of seven principles of good practice in higher education.
A relatively new technology, clickers
offer one approach to employing active
learning in the classroom. They are more
formally denoted as student response
systems (SRS), audience response systems (ARS), or personal response systems (PRS).3
Johnson4 described how clickers address three of Chickering and Gamson's seven principles for good practice in undergraduate education. Clickers help instructors

- actively engage students during the entire class period,
- gauge their level of understanding of the material being presented, and
- provide prompt feedback to student questions.
Beatty explained why clickers help
students actively engage in the learning
process. He wrote that this engagement
helps students
develop a more solid, integrated,
useful understanding of concepts
and their interrelationships and
applicability. A concerted focus on
understanding rather than recall,
and on reasoning rather than
answers, bolsters the effect.5
With clickers, students have an input
device that lets them express their views
in complete anonymity, and the cumulative view of the class appears on a
public screen. Each input device is numbered, however, so the instructor can
download responses for recordkeeping
after the class session ends.
Although these systems are becoming increasingly popular in higher
education, most research has targeted
their affective benefits, which include
greater student engagement, increased
student interest, and heightened discussion and interactivity. According to
West,6 however, past studies on learning
outcomes suggest that better learning
outcomes result from changes in pedagogical focus (from passive to active learning) and not from use of a specific technology or technique.


Clickers can provide added value,
however, when compared to some active
learning methods such as class discussion. In a normal class discussion situation, only one or two students have the
opportunity to answer a question. Even
if the answer is correct, the instructor
has no way to gauge if the other students
knew the correct answer. A student who
is unsure of the correct answer may be
unwilling to take the public risk of being
incorrect. One of the best features of an
SRS is that it allows students to provide
input without fear of public humiliation and without having to worry about
more vocal students dominating the
discussion. Even in small-enrollment
classes, many students are reluctant to
respond to faculty questions; the anonymity of responding with a clicker
guarantees near or total participation.
Johnson described this benefit:

First, many students are hesitant to respond to an answer until they know how others will respond. We have all observed students glancing around the room when a question is asked, gauging the number of hands that have been raised until a safe number are in the air for them to add their own. Therefore, the anonymity that an electronic system provides allows students to respond in a safe manner, which encourages them to take risks with their responses. Second, it is difficult, if not impossible, to ask multi-answer questions with a simple show of hands. You can imagine yourself saying "Okay, put up your right hand for A, left hand for B, both hands for C, and stand up for D."7
Another benefit of clickers over traditional active learning methods is that
they follow the principles of game-based
learning. Students of the twenty-first
century have grown up using computer
games for learning and entertainment.
My study isolated the effects of clickers by comparing two classes that used
clickers (n = 45) with two classes that used
class discussion (n = 47). Although both
methods involved active learning, clickers had the additional benefits described
above. The study investigated whether
these additional benefits resulted in
higher learning outcomes.
Learning outcomes were measured using scores on the comprehensive final exam at the end of the semester. In addition, a pretest was given to determine whether any statistically significant differences existed between the groups at the beginning of the study.
The study also compared student
perceptions about their learning after
using one of the two active learning
techniques: clickers or class discussion. All four classes were taught in the
same semester (fall 2006), had the same
instructor, and used the same textbooks,
learning materials, and assessments.
The study took place at a small,
liberal arts college in the Midwestern
United States. Enrollment at the college
includes approximately 2,900 full-time
undergraduate day students, 800 evening and weekend adult nontraditional
learners, and 600 graduate students from
across the United States and more than
20 foreign countries.
Participants in this study included 92
students in four sections of an introductory computer information systems
class. The course involved general computer literacy and appealed to a wide
range of majors. Two sections used clickers, and two sections used class discussion.

Table 1
Study Participants

  Used clickers:          Class 1, n = 22; Class 2, n = 23; Total Using Clickers, n = 45
  Used class discussion:  Class 3, n = 24; Class 4, n = 23; Total Using Discussion, n = 47

The majority of participants were traditional-aged learners (18 to 22 years old) who attended the institution on a full-time basis. See Table 1.
During the first class session, I
administered the pretest. On the last
day of class, I administered the final
exam and the perception survey to the
students.
As Table 2 illustrates, the pretest scores for all the groups had similar means of approximately 50 percent. An analysis of variance between the pretest scores of students who used clickers and those who did not showed no statistically significant difference between the two groups: F(1, 90) = 1.647, p = .203.
All the classes met twice a week
for 75 minutes for the course lecture
and question sessions. Turning Point
Technologies software was used to
collect clicker responses. The same
PowerPoint presentation, including
the same questions, was used for the
sections participating in class discussion. The only difference was that the
students using class discussion needed
to raise their hands to respond, so
their responses were not anonymous.
Clickers were implemented based on
recommendations for best practices
from Robertson, Duncan, and Turning
Point Technologies8 (see the sidebar).

Evaluation
Evaluation of the study results focused on student learning outcomes and students' perceptions of them.

Student Learning Outcomes

The mean for the group using clickers was 85.80 (SD = 8.98). For the group using discussion, the mean was 87.19 (SD = 7.58). An analysis of variance between the posttest scores of the clicker group and the discussion group showed no statistically significant difference: F(1, 90) = 0.634, p = .428.
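For instructors who want to run this kind of comparison on their own class data, the sketch below shows how a one-way ANOVA like the F(1, 90) tests reported here could be computed in Python with SciPy. It is not part of the original study: the score arrays are hypothetical placeholders generated to match the reported group sizes, means, and standard deviations, since the study's raw data are not published.

# Hypothetical sketch: a one-way ANOVA comparing final exam scores for a
# clicker group and a discussion group, in the style of the tests reported
# in this study. The scores below are simulated placeholders, not real data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# Simulated final exam scores matching the reported group sizes (45 and 47)
# and approximate means/standard deviations (85.80 +/- 8.98, 87.19 +/- 7.58).
clicker_scores = rng.normal(loc=85.80, scale=8.98, size=45)
discussion_scores = rng.normal(loc=87.19, scale=7.58, size=47)

# One-way ANOVA; with two groups this is equivalent to an independent t-test.
f_stat, p_value = stats.f_oneway(clicker_scores, discussion_scores)

df_between = 1                                                 # k - 1 groups
df_within = len(clicker_scores) + len(discussion_scores) - 2   # N - k
print(f"F({df_between}, {df_within}) = {f_stat:.3f}, p = {p_value:.3f}")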

Table 2
Average Pretest Score

  Used Clickers (n = 45):          49.18
  Used Class Discussion (n = 47):  51.72

Best Practices for Implementing Clickers in the Classroom*

1. Keep slides short to optimize legibility.
2. Keep the number of answer options to five.
3. Do not make the questions overly complex.
4. Keep voting straightforward: systems allow complex branching, but keep it simple.
5. Allow sufficient time for students to answer questions. Some general guidelines:
   - Classes of fewer than 30 students: 15 to 20 seconds per question
   - Classes of 30 to 100 students: 30 seconds per question
   - Classes of more than 100 students: 1 minute per question
6. Allow time for discussion between questions.
7. Encourage active discussion with the audience.
8. Do not ask too many questions; use them for the key points.
9. Position the questions at periodic intervals throughout the presentation.
10. Include an "answer now" prompt to differentiate between lecture slides and interactive polling slides.
11. Use a correct-answer indicator to visually identify the appropriate answer.
12. Include a response grid so that students know their responses have registered.
13. Increase responsiveness by using a countdown timer that will close polling after a set amount of time.
14. Test the system in the proposed location to identify technical issues (lighting, signal interference, etc.).
15. On the actual day of the session, allow time to set out the clickers and start the system.
16. Rehearse the actual presentation to make sure it will run smoothly.
17. Give the audience clear instructions on how to use the clickers.
18. Do not overuse the system or it will lose its engagement potential.

*Tips 1–5, 14–16, and 18 came from Robertson; tips 6–9 and 17 from Duncan; and tips 10–13 from Turning Point Technologies.


Table 3
Perception Survey Results*
(mean for Used Clickers group, n = 45 / mean for Used Class Discussion group, n = 47)

  Participation with clickers (or class discussion) improved my grade in the course: 3.60 / 3.20
  Participation with clickers (or class discussion) improved my understanding of the subject content: 4.03 / 3.61
  Participation with clickers (or class discussion) increased my feeling of belonging in this course: 3.78 / 3.48
  Participation with clickers (or class discussion) increased my interaction with the instructor: 4.15 / 3.62
  Participation with clickers (or class discussion) increased my interaction with other students: 3.45 / 3.17
  I enjoyed participation with clickers (or class discussion): 4.14 / 3.93
  I would recommend using clickers (or class discussion) again in this course: 4.12 / 4.05

*Strongly Disagree = 1; Disagree = 2; Unsure = 3; Agree = 4; Strongly Agree = 5

Perceptions of Student Learning Outcomes
Based on the survey results, student perceptions of using clickers or class discussion appear in Table 3. The seven-question perception survey, which used
a scale from 1 (strongly disagree) to 5
(strongly agree), was completed by all 92
participants. Although no statistically
significant differences occurred, the
mean scores were consistently higher
for students who had used clickers.

Recommendations for Further Research
Despite the lack of statistically significant results in this study, the perception
survey data show that students perceive
value in the use of clickers and would
recommend their use in future classes.
Contrary to expectations, learning outcomes for students using clickers did not improve more than those for students using the traditional active learning approach of class discussion. Perhaps the value of the active learning pedagogy overshadowed the benefit of using clickers.
Another explanation might be the instructor's inexperience: this was the first time I had used clickers in the classroom. More research is needed to discover whether clicker technology can enhance
the benefit of using traditional active
learning approaches. As the body of
research grows, the list of best practices
will also expand as instructors develop
new strategies to integrate clicker technology into their teaching practices.
The best way to help instructors is to provide mentoring and support from other instructors using clicker systems. I plan to share the clicker-based lecture presentations designed for this study with other instructors who teach the same course. Other faculty members can then improve upon the learning materials rather than starting from scratch. According to Beatty,

Sharing questions between instructors, or even providing a library or model curriculum of predesigned question sets, can make a big difference to a new instructor trying to climb a steep learning curve.9

This type of collaboration will expedite future improvements, and further research will determine the value of clickers in active learning.

Endnotes

1. R. W. Guthrie and A. Carlin, "Waking the Dead: Using Interactive Technology to Engage Passive Listeners in the Classroom," Proceedings of the Tenth Americas Conference on Information Systems, New York, August 2004.
2. A. Chickering and Z. Gamson, "Seven Principles for Good Practice in Undergraduate Education," AAHE Bulletin, No. 39, 1987, pp. 3–7.
3. C. Johnson, "Clickers in Your Classroom," Wakonse-Arizona E-Newsletter, Vol. 3, No. 1, 2004, <http://clte.asu.edu/wakonse/ENewsletter/studentresponse_idea.htm> (retrieved January 24, 2007).
4. Ibid.
5. I. Beatty, "Transforming Student Learning with Classroom Communication Systems" (Boulder, Colo.: EDUCAUSE Center for Applied Research, Research Bulletin, Issue 3, 2004), p. 5, <http://www.educause.edu/LibraryDetailPage/666?ID=ERB0403>.
6. J. West, "Learning Outcomes Related to the Use of Personal Response Systems in Large Science Courses," Academic Commons, December 9, 2005, <http://www.academiccommons.org/commons/review/west-polling-technology> (retrieved January 24, 2007).
7. Johnson, op. cit., para. 8.
8. L. J. Robertson, "Twelve Tips for Using a Computerized Interactive Audience Response System," Medical Teacher, Vol. 22, No. 3, 2000, pp. 237–239; D. Duncan, Clickers in the Classroom (Upper Saddle River, N.J.: Addison-Wesley, 2005); and Turning Technologies Audience Response Systems, "Higher Education Best Practices," <http://www.turningtechnologies.com/highereducationinteractivelearning/bestpractices.cfm> (retrieved January 24, 2007).
9. Beatty, op. cit., pp. 6–7.

Margie Martyn ([email protected]) is Assistant Professor in the Mathematics and Computer Science Department at Baldwin-Wallace College in Berea, Ohio.
