A Research Framework To Study The Effect
Editors
Michael Simonson
Professor
Instructional Technology and Distance Education
Nova Southeastern University
Fischler Graduate School of Education and Human Services
North Miami Beach, FL
and
Margaret Crawford
Information Specialist
Mason City Public Schools
Mason City, IA
Previous Proceedings Published in ERIC
Year Location ED Number
1979 New Orleans 171329
1980 Denver 194061
1981 Philadelphia 207487
1982 Dallas 223191 – 223326
1983 New Orleans 231337
1984 Dallas 243411
1985 Anaheim 256301
1986 Las Vegas 267753
1987 Atlanta 285518
1988 New Orleans 295621
1989 Dallas 308805
1990 Anaheim 323912
1991 Orlando 334969
1992 Washington, D.C. 347970 – 348041
1993 New Orleans 362144
1994 Nashville 373774
1995 Anaheim 383284
1996 Indianapolis 397772
1997 Albuquerque 409832
1998 St. Louis 423819
1999 Houston 436128
1999 Long Beach 444595
2000 Denver 455756
2001 Atlanta 470066
2002 Dallas Submitted to ERIC
2003 Anaheim Submitted to ERIC
2004 Chicago Submitted to ERIC
Preface
For the twenty-eighth year, the Research and Theory Division of the Association for Educational
Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers
published in this volume were presented at the National AECT Convention in Orlando, FL. A limited
quantity of these Proceedings was printed and sold in both hardcopy and electronic versions. Copies of both
volumes were distributed to Convention attendees on compact disk. Volume #1 will be available on
microfiche through the Educational Resources Information Center (ERIC) system.
The Proceedings of AECT’s Convention are published in two volumes. Volume #1 contains papers dealing
primarily with research and development topics. Papers dealing with instruction and training issues are
contained in Volume #2, which also contains over 100 papers.
REFEREEING PROCESS: Papers selected for presentation at the AECT Convention and included in these
Proceedings were subjected to a reviewing process. All references to authorship were removed from
proposals before they were submitted to referees for review. Approximately sixty percent of the
manuscripts submitted for consideration were selected for presentation at the convention and for
publication in these Proceedings. The papers contained in this document represent some of the most current
thinking in educational communications and technology.
M. R. Simonson
Editor
2005 AECT Conference RTD Reviewers
Tonya Amankwatia Krista Glazewski Al P. Mizell
Gerald Burgess Michael Grant Gary Morrison
M. J. Bishop Janette Hill Zane Olina
Marcie Bober Brad Hokanson Gamze Ozogul
Jonathan Brinkerhoff Ann Igoe Andrea Peach
Abbie Brown Kathleen Ingram Robert Reiser
Shirley Campbell Paul Kirschner Willi Savenye
Susan Colaric James Klein Rebecca Scheckler
Marcy Driscoll Dave Knowlton Michael Simonson
Jared Danielson Theodore Kopcha Andrew Smith
Peg Ertmer Tiffany Koszalka Michael Spector
Deniz Eseryel Kathryn Ley Howard Sullivan
Branda Friedan Nancy Maushak Ellen Taricani
Xun Ge Trey Martindale Lucinda Willis
Andrew Gibbons Joan Mazur
Order Form
Name:________________________________________________________________________
Affiliation:____________________________________________________________________
Address:______________________________________________________________________
Shipping Address:______________________________________________________________
Shipping City, State, Zip: ________________________________________________________
Email Address: ________________________________________________________________
Phone Number: ________________________________________________________________
Additional Phone Number:________________________________________________________
Orders:
Please fill in and return, with payment, to Michael Simonson, Nova Southeastern University
1750 NE 167th Street, North Miami Beach, FL 33162
Make checks payable to ‘Proceedings’.
VOLUME #1: RESEARCH PAPERS
Printed Version: Number of Copies_____________ @$80.00 each, Total __________
HOW CAN WE FACILITATE STUDENTS’ IN-DEPTH THINKING AND INTERACTION IN AN
ASYNCHRONOUS ONLINE DISCUSSION ENVIRONMENT? A CASE STUDY
WING SUM CHEUNG, KHE FOON HEW ............................................................................................................113
A NOVICE ONLINE INSTRUCTOR’S ONLINE TEACHING EXPERIENCE: A CASE STUDY
HEE JUN CHOI, JI-HYE PARK...........................................................................................................................121
HOW CAN TECHNOLOGY HELP IMPROVE THE QUALITY OF BLACKBOARD FACULTY
TRAINING AND ENCOURAGE FACULTY TO USE BLACKBOARD?
DORIS CHOY, JUDY XIAO, JOHN ILIFF .............................................................................................................130
TRENDS, ISSUES, AND GUIDELINES FOR THE QUARTERLY REVIEW OF DISTANCE
EDUCATION (QRDE)
DANIEL EASTMOND, CHARLES SCHLOSSER, MICHAEL SIMONSON ..................................................................135
EXEMPLARY TECHNOLOGY USE: TEACHERS' PERCEPTIONS OF CRITICAL FACTORS
PEGGY A. ERTMER, ANNE OTTENBREIT-LEFTWICH, CINDY S. YORK .............................................................142
IMPACT AND PERCEIVED VALUE OF PEER FEEDBACK IN ONLINE LEARNING
ENVIRONMENTS
PEGGY A. ERTMER, JENNIFER C RICHARDSON, BRIAN BELLAND, DENISE CAMIN, PATRICK CONNOLLY, GLEN
COULTHARD, KIMFONG (JASON) LEI, CHRISTOPHER MONG............................................................................149
TOWARD IMPROVING INFORMATION TECHNOLOGY ACCEPTANCE
IOAN GELU IONAS ...........................................................................................................................................191
ONLINE FOLLOW-UP TO PROFESSIONAL DEVELOPMENT FOR TEXAS SCHOOL
LIBRARIANS: THE VALUE OF A COLLABORATIVE LEARNING ENVIRONMENT
MARYBETH GREEN AND LAUREN CIFUENTES .................................................................................................160
A CROSS-MEDIA AND CROSS-CULTURAL STUDY ON THE EFFECT OF STUDENT-
GENERATED VISUALIZATION AS A STUDY STRATEGY
YI-CHUAN JANE HSIEH, LAUREN CIFUENTES..................................................................................................166
SELF-REGULATION IN WEB-BASED DISTANCE EDUCATION: FROM A REQUIREMENT TO AN
ACCOMPLISHMENT
HAIHONG HU ..................................................................................................................................................172
WILL SELF-REGULATED LEARNING STRATEGY TRAINING BE USEFUL IN A WEB-BASED
DISTANCE EDUCATION COURSE?
HAIHONG HU ..................................................................................................................................................199
A RESEARCH FRAMEWORK TO STUDY THE EFFECTS OF CONTEXTUALIZED
IOAN GELU IONAS ...........................................................................................................................................201
JOURNAL CONTENT ANALYSIS OF ETR&D AND THE JOURNAL OF EDUCATIONAL
TECHNOLOGY OF KOREA
IL-HYUN JO .....................................................................................................................................................211
EXPLORING THE NATURE OF TRAINING DISTANCE EDUCATION FACULTY
HAIJUN KANG .................................................................................................................................................218
EFFECTS OF ANIMATION ON MULTI-LEVEL LEARNING OUTCOMES: A META-ANALYTIC
ASSESSMENT AND INTERPRETATION
FENGFENG KE, HUIFEN LIN, YU-HUI CHING, FRANCIS DWYER ......................................................................225
EFFECTS OF INTEGRATED MOTIVATIONAL AND VOLITIONAL TACTICS ON STUDY
HABITS, ATTITUDES, AND PERFORMANCE
JOHN M. KELLER, MARKUS DEIMANN, ZHU LIU .............................................................................................234
THE EFFECTS OF CRITICAL REFLECTION THINKING ON LEARNER'S METACOGNITION
AARON KIM ....................................................................................................................................................242
USING MOTIVATIONAL AND VOLITIONAL MESSAGES TO PROMOTE UNDERGRADUATE
STUDENTS’ MOTIVATION, STUDY HABITS AND ACHIEVEMENT
CHANMIN KIM, JOHN M. KELLER, HUEI YU CHEN .........................................................................................247
THE RELATIONSHIP BETWEEN PRESERVICE TEACHERS’ PERCEPTIONS OF FACULTY
MODELING OF COMPUTER-BASED TECHNOLOGY AND THEIR INTENT TO USE COMPUTER-
BASED TECHNOLOGY
KIOH KIM, LANDRA REZABEK, GUY WESTHOFF, JOHN COCHENOUR .............................................................253
EXPLORING THE VISION OF TECHNOLOGY INTEGRATION RESEARCH: SCHOLARS’
THOUGHTS ON DEFINITIONS, THEORIES, AND METHODOLOGIES
TIFFANY A. KOSZALKA, KERSTIN MUKERJI ....................................................................................................264
SCAFFOLDING REFLECTION ON EVERYDAY EXPERIENCES: USING DIGITAL IMAGES AS
ARTIFACTS
SUSAN M. LAND, BRIAN K SMITH, BRIAN BEABOUT, SUNGHYUN PARK, KYOUNG NA KIM ...........................268
INFLUENCE OF ELECTRONIC TEXT PRESENTATION MODE ON STUDENT UNDERSTANDING
AND SATISFACTION; AN EMPIRICAL STUDY IN KOREAN CONTEXT
HYE-JUNG LEE, JEONG-WON WOO .................................................................................................................275
LEARNING TO COLLABORATE, COLLABORATIVELY: AN ONLINE COMMUNITY BUILDING
AND KNOWLEDGE CONSTRUCTION APPROACH TO TEACHING COMPUTER SUPPORTED
COLLABORATIVE WORK AT AN AUSTRALIAN UNIVERSITY
MARK J.W. LEE, KEN EUSTACE, LYN HAY, GEOFF FELLOWS .........................................................................286
IMPACT OF TECHNOLOGY-MEDIATED PEER ASSESSMENT ON STUDENT PROJECT
QUALITY
LAN LI AND ALLEN L. STECKELBERG .............................................................................................................307
FACTORS THAT INFLUENCE LEARNERS’ PERFORMANCE
KYU YON LIM, HYEON WOO LEE, IL-HYUN JO ...............................................................................................314
FOSTERING COMMUNITIES OF PRACTICE
WEI-YING LIM, JOHN G HEDBERG, JENNIFER AIR-CHOO YEO, DAVID HUNG.................................................320
LEARNING “PRAGMATICS” ON-LINE THROUGH PARTNERSHIP: A CROSS-CULTURAL
STUDY BETWEEN TAIWANESE COLLEGE STUDENTS AND THEIR TEXAN TUTORS
CHIA-NING JENNY LIU, YI-CHUAN JANE HSIEH, ZOHREH ESLAMI-RASEKH ...................................................328
DIFFERENT DOES NOT MEAN WRONG
MELISSA MOHAMMED AND ELEANOR ENNIS ..................................................................................................336
USER SUPPORT DESIGN TO PROVIDE A CHANCE TO LEARN
MOMOKO NAKATANI, MASARU MIYAMOTO, SHUNICHI YONEMURA ..............................................................340
THE INTERPLAY BETWEEN INSTRUCTOR BELIEFS ABOUT “BEST PRACTICES” IN
TEACHING AND ACTUAL PRACTICES IN ONLINE LEARNING: A CASE STUDY IN HIGHER
EDUCATION
BESSIE NKONGE ..............................................................................................................................................346
THE NEW GLOBAL KNOWLEDGE SOCIETY AND...............................................................................353
EUNHEE J. O’NEILL.........................................................................................................................................353
PRE-SERVICE TEACHERS: DEVELOPMENT OF PERCEPTIONS AND SKILLS PERTAINING TO
TECHNOLOGY INTEGRATION .................................................................................................................364
ANNE T. OTTENBREIT-LEFTWICH, CINDY S YORK, JENNIFER C RICHARDSON, TIMOTHY J NEWBY ...............364
EVALUATION OF A PRE-SERVICE EDUCATIONAL TECHNOLOGY PROGRAM ........................371
CHENG-CHANG PAN, MICHAEL SULLIVAN, JUAN HINOJOSA ..........................................................................371
EXAMINING BARRIERS MIDDLE SCHOOL TEACHERS ENCOUNTERED IN TECHNOLOGY-
ENHANCED PROBLEM-BASED LEARNING ...........................................................................................377
SUNG HEE PARK, MONICA (EUN HWA) LEE, JAY BLACKMAN, PEG ERTMER, SCOTT SCHAFFER, KRISTA
SIMONS, BRIAN BELLAND ...............................................................................................................................377
CHAOS THEORY AND THE SCIENCES OF COMPLEXITY:................................................................384
CHARLES M. REIGELUTH ................................................................................................................................384
USING ASP FOR WEB SURVEY DATA COLLECTION AND AN EMPIRICAL ASSESSMENT OF
THE WEB DATA .............................................................................................................................................390
XUEJUN SHEN, HOPE RUIFANG ADAMS ..........................................................................................................390
TAKING STATISTICS DOESN’T HAVE TO BE SCARY: KEEPING THE HEARTRATE DOWN...396
HEEGYOUNG SONG, JOHN R SLATE ................................................................................................................396
CONFIGURING GRAPHIC ORGANIZERS TO SUPPORT HIGHER-ORDER THINKING SKILLS
............................................................................................................................................................................407
CAMERON SPEARS, GREGORY MOTES, WILLIAM A KEALY ............................................................................407
INVESTIGATING HISPANIC PRE-SERVICE TEACHERS’ SELF-EFFICACY FOR TECHNOLOGY
INTEGRATION ON THE SUBSCALE LEVEL ..........................................................................................415
MICHAEL SULLIVAN, CHENG-CHANG PAN, JUAN HINOJOSA ..........................................................................415
A PRELIMINARY STUDY OF THE USES AND EFFECTS OF MOBILE COMPUTING DEVICES IN
K-8 CLASSROOMS.........................................................................................................................................419
KAREN SWAN, MARK VAN 'T HOOFT, ANNETTE KRATCOSKI, DARLENE UNGER .............................................419
THE CHANGING NATURE OF LEARNING IN A UBIQUITOUS COMPUTING CLASSROOM .....425
KAREN SWAN, YI MEI LIN, JASON SCHENKER, MARK VAN 'T HOOFT.............................................................425
ONLINE LEARNING PROGRAMS AS LEARNING ORGANIZATIONS: A CASE STUDY OF
INFORMATION MANAGEMENT PROGRAMS AT ANADOLU UNIVERSITY, TURKEY...............436
DENIZ TASCI, CENGIZ HAKAN AYDIN .............................................................................................................436
EFFECTS OF COMPUTER INTEGRATION TRAINING AND COMPUTER LITERACY TRAINING
ON PRESERVICE TEACHERS' CONFIDENCE AND KNOWLEDGE RELATED TO TECHNOLOGY
USE ....................................................................................................................................................................442
JEREMY I. TUTTY, JAMES D KLEIN, HOWARD SULLIVAN ................................................................................442
DEVELOPING AND EVALUATING AN INTERACTIVE MULTIMEDIA INSTRUCTIONAL TOOL:
HOW IS THE LEARNING QUALITY OF OPTOMETRY STUDENTS IMPACTED?..........................451
LING WANG, BAI-CHUAN JIANG .....................................................................................................................451
INTERTWINING THE FABRICS OF LEARNING STYLES, PERSONALITY TYPES, BLOOM’S
TAXONOMY, AND MULTIPLE INTELLIGENCES .................................................................................458
ROSALIE CARTER WARD .................................................................................................................................458
TABLET PC INITIATIVE: IMPACT ON STUDENTS AND LEARNING .............................................469
DOUGLAS C. WILLIAMS, DENISE BENTON, SUSAN PEDERSEN ........................................................................469
UNDERSTANDING COMPUTER ANXIETY AND COMPUTER RELATED EXPERIENCE: THE
MODEL AND PRACTICES ...........................................................................................................................477
HARRISON HAO YANG ....................................................................................................................................477
EVALUATION OF LITERACY LOG AND DISCUSSION BOARD POSTINGS IN ONLINE
LEARNING ......................................................................................................................................................484
YUANMING YAO, YEDONG TAO, VASSILIKI ZYGOURIS-COE, DONNA BAUMBACH .........................................484
A CASE STUDY OF FACILITATORS’ ATTITUDES TOWARD EFFECTIVENESS OF DIFFERENT
MEDIA USED IN ONLINE DEGREE PROGRAMS...................................................................................492
ULKU YILMAZ, CENGIZ HAKAN AYDIN ..........................................................................................................492
AN EXAMINATION OF CLASSROOM COMMUNITY SCALE: RELIABILITY AND FACTOR
STRUCTURE ...................................................................................................................................................497
HUANG-CHIH YU, FENGFENG KE ....................................................................................................................497
Expository and Discovery Learning Compared: Their Effects on Learning
Outcomes of Online Students
Omur Akdemir
Tiffany A. Koszalka
Syracuse University
Abstract
Researchers compared the effects of expository and discovery types of instructional strategies on the learning
outcomes of adult online students. No statistically significant differences were found between the reported perceived
learning outcomes, or the effort and involvement, of adult online students completing the expository and discovery course
modules. Implications of this research for online course designers are presented in this paper.
Introduction
The availability of communication technologies has generated growing interest in the use of distance
education methods to reach larger student populations. Numerous universities and school districts have started to
offer online courses to meet the growing need for flexible learning
environments (Gunawardena & McIsaac, 2004). Online courses provide opportunities for individuals who would
otherwise not have opportunities for learning (Deal, 2002). Offering courses on the Internet, however, has brought
many challenges for instructors and instructional designers since designing meaningful and effective learning
environments on the Internet is a challenging task (Hill, Wiley, Nelson, & Han, 2004).
Educators and instructional designers use different types of instructional strategies to help learners acquire
knowledge in the most efficient and effective way. By definition, instructional strategies describe the general
components of a set of instructional materials and a set of decisions that result in a plan, method, or series of activities
aimed at obtaining a specific goal (Dick & Carey, 1978; Witkin et al., 1977). Implementation of online courses
should be achieved through careful analysis of instructional methods used in online courses (McCarron, 2000). "If
an instructional experience or environment does not include the instructional strategies required for the acquisition
of the desired knowledge or skill, then effective, efficient, and appealing learning of the desired outcome will not
occur" (Merrill, Drake, & Lacy, 1996).
The primary purpose of this exploratory study was to investigate effects of expository and discovery types
of instructional strategies on learning outcomes for online students. Two research questions and accompanying
experimental hypotheses tested in this study were:
1. Is there a difference between the reported perceived learning outcome of adult students completing
expository online course module and the reported perceived learning outcome of adult students completing
discovery online course module?
Ho1: Reported perceived learning outcomes of adult students completing the online course module which
used expository instructional strategies are the same as reported perceived learning outcomes of adult
students completing the online course module, which used discovery instructional strategies.
Ha1: Reported perceived learning outcomes of adult students completing the online course module which
used expository instructional strategies are different from the reported perceived learning outcomes of
adult students completing the online course module, which used discovery instructional strategies.
2. Is there a difference between the reported effort and involvement measure of adult students completing
expository online course module and the reported effort and involvement measure of adult students
completing discovery online course module?
Ho2: Reported effort and involvement measure of adult students completing the online course module
which used expository instructional strategies are the same as reported effort and involvement measure of
adult students completing the online course module, which used discovery instructional strategies.
Ha2: Reported effort and involvement measure of adult students completing the online course module
which used expository instructional strategies are different from the reported effort and involvement
measure of adult students completing the online course module, which used discovery instructional
strategies.
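Stated compactly, with \mu_E and \mu_D denoting the population mean scores under the expository and discovery modules (notation introduced here for convenience rather than taken from the study), each pair of hypotheses has the form

H_0: \mu_E = \mu_D \qquad \text{versus} \qquad H_a: \mu_E \neq \mu_D,

applied once to the perceived learning outcome score and once to the effort and involvement score.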
Investigating the effects of these instructional strategies in online courses is important to identify effective
instructional strategies for adult students who learn online.
When expository types of instructional strategies are used, instruction usually begins with the introduction
of the concept. Then the structure of the material is presented in order to guide the students. Afterwards, students are
given a general orientation to the material, a conceptual framework, and some key ideas to work with (Andrews,
1984). Information is presented within a context. Following the presentation of the information, students are asked
to apply the general concepts in organizing the information.
In contrast to expository learning, the basic assumption of discovery learning instructional strategies is
that students learn better when they are given the opportunity to generate conclusions inductively from ambiguous
materials (Andrews, 1984). Advocates of the discovery method proclaim that the key to effective learning is to
teach individuals how to discover answers themselves. Incorporating realistic scenarios in the instruction
activates individuals’ curiosity and increases their motivation. Discovery types of instructional strategies require
individuals to make decisions and solve problems (Johnson & Aragon, 2003).
Method
Instructional Context
Three graduate courses from the Instructional Technology department of a private university located in the
northeastern United States were used in this study. The Instructional Technology department was one of the pioneers
in the private university offering graduate level online courses. Online courses had been offered in the Instructional
Technology department for years. The professors of the Instructional Technology department had a variety of
experience designing and offering online courses. Two online course management systems, WebCT and
Blackboard, were used to deliver online courses in the department. Using the expository and discovery instructional
strategies, otherwise identical online course modules were designed for the online courses. An experienced instructional designer
reviewed the developed modules to ensure that they represented the characteristics of the two instructional strategies.
Subjects
The study was conducted with thirty-five adult students taking graduate courses from the Instructional
Technology department of a private university located in the United States. All participants were twenty-five years
old or older. The majority of participants were female: females constituted 69% of participants and males 31%.
Instrument
The online module evaluation form was used to measure the perceived learning outcomes, and the effort
and involvement, of adult students. The online module evaluation form was adapted from the Student
Instructional Report II (Centra, 1998). The reported test-retest reliability of the instrument for measuring perceived
learning outcomes ranges from .78 to .93, and the reliability of the instrument for measuring the effort and
involvement of students is .88 (Centra, 1998). A JavaScript routine was written to ensure that students responded to all
questions in the form before submitting it. The results of the online module evaluation form were then automatically
emailed to the researchers once students completed it.
Procedure
Convenience sampling was used to select online courses from the Instructional Technology department of
a private university. After receiving Human Subjects Approval to conduct the study, the researchers used emails and
personal visits to contact the course professors and explain the purpose of the study. Three graduate courses
whose professors showed interest and agreed to integrate the designed modules into their courses were used to conduct the
study.
Adult students from three courses completed the expository and discovery online course modules
successively. After each online course module, students completed the online evaluation form to report their
perceived learning outcomes and their effort and involvement. The results of the online evaluation form were
automatically emailed to the researchers. Only the data of students who gave the researchers permission were used in the
study.
Analysis
A statistical analysis package (SPSS release 12) was used to test the experimental hypotheses. All the data
received through the emails were imported into the statistical analysis package. Paired sample t-tests were conducted to
test the two experimental hypotheses. All statistical analyses reported in this research were conducted at a
significance level of .05.
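For readers who wish to reproduce this kind of analysis, a minimal sketch in Python follows; the expository and discovery arrays hold hypothetical placeholder scores (one per student, n = 35), and the study itself used SPSS release 12, not this code.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical placeholder data: one perceived-learning score per student
# for each module (n = 35); the real study used students' survey scores.
expository = rng.normal(loc=18.85, scale=4.54, size=35)
discovery = rng.normal(loc=20.40, scale=3.28, size=35)

# Paired sample t-test at the .05 significance level, as in the study.
result = stats.ttest_rel(expository, discovery)
print(f"t({len(expository) - 1}) = {result.statistic:.2f}, p = {result.pvalue:.3f}")
print("Reject H0" if result.pvalue < 0.05 else "Fail to reject H0")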
Findings
The first hypothesis stated that the reported perceived learning outcomes of adult students completing the
online course module which used expository instructional strategies were the same as the reported perceived learning
outcomes of adult students completing the discovery online course module. The result of the paired sample t-test
supported this hypothesis. No significant differences were found when adult students’ reported perceived learning
outcomes in the expository and discovery course modules were compared. The null hypothesis was not rejected,
t(34) = -1.78, p > .05 (see Table 1). Table 2 presents the descriptive statistics for the perceived learning outcomes of adult
students in the expository and discovery online course modules. The change in mean scores of participants in the expository
and discovery online course modules is presented in Figure 1.
Table 1. The results of the paired sample t-test for reported perceived learning outcomes of adult students in
expository and discovery online course modules
Pair 1 (Expository - Discovery): paired differences Mean = -1.54, Std. Deviation = 5.11, Std. Error Mean = .86,
95% Confidence Interval of the Difference = [-3.3, .21]; t = -1.78, df = 34, Sig. (2-tailed) = .083
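The t value reported in Table 1 follows directly from the paired-difference statistics shown there:

t = \frac{\bar{d}}{s_d / \sqrt{n}} = \frac{-1.54}{5.11 / \sqrt{35}} \approx \frac{-1.54}{0.86} \approx -1.78, \qquad df = n - 1 = 34.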
Table 2. The descriptive statistics for reported perceived learning outcomes of adult students in expository and
discovery online course modules
            Mean    N    Std. Deviation    Std. Error Mean
Expository  18.85   35   4.54              .76
Discovery   20.4    35   3.28              .55
[Figure 1: line graph; y-axis "Mean Scores" ranging from about 18 to 21; x-axis categories Expository and Discovery.]
Figure 1. The graph of distribution of participants’ mean perceived learning outcome scores by types of instructional
strategy
The second hypothesis stated that the effort and involvement measures of adult students completing the online
course module which used expository instructional strategies were the same as the effort and involvement measures of
adult students completing the online course module which used discovery instructional strategies. A paired
sample t-test was run to test the second hypothesis. No significant differences were found when adult students’
effort and involvement measures in the expository and discovery course modules were compared. The null hypothesis was
not rejected, t(34) = -1.83, p > .05 (see Table 3). Table 4 shows the descriptive statistics for the effort and
involvement measure of adult students in the expository and discovery online course modules. Figure 2 presents the
change in mean scores of participants in the expository and discovery online course modules.
Table 3. The results of the paired sample t-test for reported effort and involvement measure of adult students in
expository and discovery online course modules
Pair 1 (Expository - Discovery): paired differences Mean = -0.91, Std. Deviation = 2.94, Std. Error Mean = .49,
95% Confidence Interval of the Difference = [-1.92, .09]; t = -1.83, df = 34, Sig. (2-tailed) = .075
Table 4. The descriptive statistics for reported effort and involvement measure of adult students in expository and
discovery online course modules
            Mean    N    Std. Deviation    Std. Error Mean
Expository  11.14   35   3.03              .51
Discovery   12.05   35   2.35              .39
[Figure 2: line graph; y-axis "Mean Scores" ranging from about 10 to 13; x-axis categories Expository and Discovery.]
Figure 2. The graph of distribution of participants’ mean effort and involvement scores by types of instructional
strategy
Conclusion
The effectiveness of different instructional strategies in various settings has been studied by researchers
(Andrews, 1984; Hopkins, 2002; MacNeil, 1980). Investigating the effects of expository and discovery formats on
college students taking face-to-face courses, Andrews (1984) discovered that field independent students
outperformed the field dependent students in the discovery format while field dependent students did better than
field independent students in expository format. Using undergraduate students as subjects in the face-to-face
courses, MacNeil (1980) found no difference when the effects of expository and discovery instructional strategies on
the change in learning performance of field dependent and independent subjects were investigated. The findings of these
studies in face-to-face courses are contradictory. Comparing the effectiveness of the expository and discovery formats in
computer-based instruction, Hopkins (2002) found no difference between the two formats for an
undergraduate student population. The present study investigated the effects of the expository and discovery types of
instructional strategies in online courses with adult students and also suggests no difference in perceived
learning between the different instructional strategies.
The lack of understanding of how online environments can be designed to be most effective is problematic for
instructors as well as students. This research addressed this problem through a focused investigation of the effects of
expository and discovery learning on adult online students’ perceived learning outcome measure and their effort
and involvement measure. Although no statistically significant difference was found, this result does shed light on
instructional design and development issues.
The results of this study suggested that using the expository or discovery type of instructional strategy
in online courses did not affect adult students’ perceived learning outcomes. Both the expository and the discovery
format seem to foster student learning in the same manner. If similar results can be achieved for the larger adult
student population taking online courses, there may be implications for designing instruction for online courses.
Since the effects of the expository and discovery course modules on adult students’ perceived learning outcomes did not
differ, online course instructors and instructional designers may use either the expository or the discovery type of
instructional strategy when designing online courses for adult students and expect similar learning benefits. This study
also suggests that the instructional strategy is not necessarily the most important factor for adults in online courses;
because no significant differences were found, instructional designers should focus more on content and on rich
activities that appeal to a wide variety of learners.
Future researchers should consider changing the order of the instructional course modules: in this study,
adult online students completed the expository course module first and then completed the discovery course module,
and changing the order of the instructional strategies may produce different results. Testing the effects of these
instructional strategies with different content may also affect the findings. Moreover, other instructional strategies,
such as problem-based learning, collaborative learning, and generative instructional strategies, should be examined in
similar studies with adult students. In short, other types of instructional strategies should be tested using different
content, and the ordering effects of each instructional strategy should be considered in the research design process.
This study is among the few studies conducted to date in online courses with adult students to identify
effective instructional strategies for online courses. Results of this study and similar studies will guide the
instructional designers in designing effective and appealing online courses. Findings of this study suggested that
instructional strategies in online courses may not be a significant factor affecting adult students’ learning. Therefore
online course instructors and instructional designers should focus on designing instruction where the goal of the
instruction is consistent with the strategies used to teach this goal to achieve optimal learning (Merrill, 1999).
References
Andrews, J. D. W. (1984). Discovery and expository learning compared: Their effects on independent and
dependent students. Journal of Educational Research, 78(2), 80-89.
Centra, J. (1998). The development of the Student Instructional Report II. Educational Testing Service.
Deal, W. F., III. (2002). Distance learning: Teaching technology online. Resources in Technology. Technology
Teacher, 61(8), 21-26.
Dick, W., & Carey, L. (1978). The systematic design of instruction. Glenview, Ill.: Scott Foresman.
Gunawardena, C. N., & McIsaac, M. S. (2004). Distance education. In D. H. Jonassen (Ed.), Handbook of Research
on Educational Communications and Technology (2nd ed., pp. 355-395). Mahwah, N.J.: Lawrence
Erlbaum.
Hill, J. R., Wiley, D., Nelson, L. M., & Han, S. (2004). Exploring research on internet-based learning: From
infrastructure to interactions. In D. H. Jonassen (Ed.), Handbook of Research on Educational
Communications and Technology (2nd ed., pp. 433-460). Mahwah, N.J.: Lawrence Erlbaum.
Hopkins, M. T. (2002). The effects of computer-based expository and discovery methods of instruction on aural
recognition of music concepts. Journal of Research in Music Education, 50(2), 131-144.
Johnson, S. D., & Aragon, S. R. (2003). An instructional strategy framework for online learning environments. New
Directions for Adult and Continuing Education, (100), 31-43.
McCarron, K. R. (2000). Perceiving the artifact within a virtual museum collection: Cognitive styles and online
instructional strategies. DAI, 60(12A), 146.
MacNeil, R. D. (1980). The Relationship of Cognitive Style and Instructional Style to the Learning Performance of
Undergraduate Students. Journal of Educational Research, 73(6), 354-359.
Merrill, M. D., Drake, L., & Lacy, M. J. (1996). Reclaiming instructional design. Educational Technology, 36, 5-7.
Merrill, M. D. (1999). International Forum of Educational Technology & Society: Learning strategies then and
now: Same or different? Retrieved July 15, 2005, from http://ifets.ieee.org/discussions/discuss7.html
Witkin, H. A., Moore, C. A., Goodenough, D. R., & Cox, P. W. (1977). Field-dependent and field-independent
cognitive styles and their educational implications. Review of Educational Research, 47(1), 1-64.
An Experimental Evaluation of Tutorials in Problem Solving (TiPS): A
Remedial Mathematics Tutor
Robert K. Atkinson
Arizona State University
Mary Merrill-Lusk
Louisiana State University
Abstract
Our laboratory conducted an experimental evaluation of Tutorials in Problem Solving (TiPS), a computer
environment representing a schema-based approach to training arithmetic and problem solving skills in remedial
adult populations. Specifically, this project was designed to accomplish several objectives: (1) document the level of
instructional adaptation provided by the TiPS system; (2) provide a basic evaluation of the overall TiPS system; (3)
determine the amount of instructional time involved with the use of the TiPS system; and (4) determine the affective
nature of the TiPS learning experience. Our evaluation found that: (1) TiPS does monitor the learners’
performance and adapt instructional delivery to meet their needs; (2) the average posttest score for participants in the
TiPS group was significantly higher than that of their peers in an untreated control group; (3) the average time for
completing the TiPS computer instruction was 3.84 hours (SD = 0.97) and ranged from 1.83 to 6 hours (excluding
test-taking time); and (4) the learners exposed to TiPS reported feeling that the instructional material, including the
examples and problems, helped them to understand how to approach and solve word problems and that, overall, the
instructional environment was well designed.
The purpose of this project was to conduct an experimental evaluation of TiPS (Tutorials in Problem
Solving), a computer environment representing a schema-based approach (e.g., Marshall, 1995) to training
arithmetic and problem solving skills in remedial adult populations. Specifically, this project was designed to
accomplish several objectives: (1) document the level of instructional adaptation provided by the TiPS system, (2)
provide a basic evaluation of the overall TiPS system, (3) determine the amount of instructional time involved with
the use of the TiPS system, and (4) determine the affective nature of the TiPS learning experience.
To accomplish these objectives, this project involved two phases in which rigorous empirical standards
were applied during each phase. The initial phase, which focused on addressing the first objective, involved an
analytic investigation of how the TiPS system behaves in response to various types of simulated performance. This
analytic investigation was based on iterative “user” trials where graduate students in our lab used the TiPS system
while adopting the behavior of “users” with a wide-range of ability levels. The second phase of our project
addressed the remaining three objectives through the employment of a regression-discontinuity (RD) design, a
pretest-posttest program-comparison group strategy where participants are assigned to program or comparison
groups solely on the basis of a cutoff score on a pre-program measure (i.e., pretest).
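In the standard textbook notation for this design (notation not taken from the TiPS reports), assignment and outcome can be written as

D_i = 1 \text{ if } X_i < c, \text{ and } 0 \text{ otherwise}; \qquad Y_i = \beta_0 + \beta_1 (X_i - c) + \beta_2 D_i + \varepsilon_i,

where X_i is the pretest score, c is the cutoff, Y_i is the posttest score, and \beta_2 estimates the treatment effect at the cutoff.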
Background
On TiPS, students receive instruction within the context of problem solving scenarios, designed to
gradually build their skills and abilities that should enable them to reason about complex real-world problems. The
instructional objectives of TiPS instruction include fostering everyday mathematics and problem-solving skills. In
particular, these objectives include fostering the development of: (a) arithmetic schemas—conceptual structures
shown by cognitive research to underlie human understanding of mathematics; (b) self-monitoring ability—the
tendency and ability to be aware of one’s own level of understanding and to check one’s problem-solving
performance to avoid careless errors; (c) supporting beliefs—the maintenance and use of beliefs associated with
good problem solving; (d) selective encoding ability—the ability to identify important information and exclude
extraneous information from the problem statement or situation; (e) strategic search ability—the ability to recognize
the need for carrying out searches for problem information available through common indexed sources; and (f)
strategic planning ability—the ability to select and organize schemas into solution steps that achieve an overall
conceptualization of a solution to complex problems.
Despite the potential of TiPS to address a wide range of instructional objectives, several controlled field
studies designed to test the pedagogical efficacy of the TiPS learning environment have generated results that are
suggestive at best. One field study, which entailed high school students working with TiPS, identified that one
important issue related to its effectiveness is the appropriate level of ability for students given TiPS instruction.
Students that started with high math ability prior to the TiPS instruction—as evidenced by a high pretest score—did
not appear to benefit from the system. On the other hand, there was evidence to suggest that students that struggled
with the pretest and indicated that they were either in the middle or bottom third of the math classes they took in
high school improved their performance on the posttest instruments. This represents a potential indicator of learning
from the TiPS instruction. It is possible that these students derived greater benefit from the TiPS instruction, although
this could also have been an example of regression toward the mean. Another field study conducted in
conjunction with the Madison Area Technical College (MATC) adult literacy program did not generate significant
gain scores (pretest to posttest differences). However, the results of a post-instruction questionnaire suggested that
the participants enjoyed working with the system. In particular, the participants indicated that they liked the
system’s worked examples, thought the system helped them solve the posttest problems, and would recommend the
system to a friend. Against this background, this project was designed to replicate and extend these initial findings.
Technical Approach
As previously mentioned, this project was designed to accomplish several objectives: (1) document the
level of instructional adaptation provided by the TiPS system, (2) provide a basic evaluation of the overall TiPS
system, (3) determine the amount of instructional time involved with the use of the TiPS system, and (4) determine the
affective nature of the TiPS learning experience. To achieve these objectives, this project involved two phases:
Phase 1 focused on addressing the first objective by relying on an analytic investigation of how the TiPS system
behaves in response to various types of simulated performance; and Phase 2 addressed the remaining three
objectives in the context of a regression-discontinuity (RD) design, a powerful pretest-posttest program-comparison
group design that minimizes regression to the mean as a threat to internal validity. Each of these phases is described
in detail below.
(SPS) interface. The SPS interface is designed to provide users with a small set of conceptually distinct diagrams for
displaying and solving arithmetic problems. This approach represents a natural extension of classic expert/novice
studies, which characterized good problem solvers as those possessing many conceptually rich knowledge
structures, or “schemas,” related to their domain of expertise. Marshall viewed schemas as abstract structures
(instantiated either mentally or externally) that: (1) represent fundamental relational concepts within a domain; (2)
suggest the existence of different problem classes; (3) suggest procedures associated with problem types; and (4)
serve as conceptual building blocks for representing complex problems. The goal of SPS was to help students
construct expert math knowledge by having them solve and analyze math story problems employing schematic
diagrams representing basic semantic concepts.
The TiPS graphical user interface (GUI) supplies five schematic diagrams designed to serve as conceptual
support for problem solving. Similar to SPS, each of the five TiPS core diagrams represents a different basic
mathematics schema. The design for the diagrams was based on empirical evidence showing performance
differences in problems solvable with the same arithmetic operations, but that have different semantics. The group
relation, the comparison type, and change relation form a set of three additive classes found in arithmetic story
problems. In addition, in both TiPS and SPS there are multiplicative situation classes as well, represented by restate
(linear function) and vary (proportions) diagrams.
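As a summary of this taxonomy, the five diagram types and their semantic classes can be collected in a small data structure; the Python sketch below is purely illustrative and does not reflect how TiPS itself represents schemas internally.

from enum import Enum

class SchemaClass(Enum):
    ADDITIVE = "additive"
    MULTIPLICATIVE = "multiplicative"

# The five TiPS core diagrams and the semantic class each represents.
TIPS_SCHEMAS = {
    "Group":   SchemaClass.ADDITIVE,        # combining parts into a whole
    "Compare": SchemaClass.ADDITIVE,        # comparing two quantities
    "Change":  SchemaClass.ADDITIVE,        # a quantity changing over time
    "Restate": SchemaClass.MULTIPLICATIVE,  # linear function relationships
    "Vary":    SchemaClass.MULTIPLICATIVE,  # proportional relationships
}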
TiPS has a series of lessons associated with the goals of learning the five problem types and how to analyze
and solve problems with their associated TiPS schemas. The system consists of two sets of lessons: (1) basic schema
tool lessons and (2) advanced lessons. In the basic schema tool lessons, there is one instructional unit each for the
Change, Group, Compare, Vary and Restate tools. With the exception of the Vary unit, each instructional unit
consists of one lesson. There are four lessons associated with the Vary unit (i.e., proportion, rate, percent, and “slice
of life problems”). Thus, there are a total of eight lessons that make up the basic schema tool lessons. The advanced
lessons consist of two lessons, one that involves a set of mixed one-step problems representing the five problem
types and another that involves multi-step problems. In the mixed one-step advanced lesson, the practice set requires
students to discriminate among all five problem types for tool schema selection. The multi-step advanced lesson
focuses on advanced problem solving, including problem solving practice with complex, multi-step problems.
Within these lessons, students study dynamic worked examples that illustrate expert problem solving on
TiPS, and they complete practice problems that are similar to the worked examples. The worked examples illustrate
desired problem solving performance with didactic audio explanation from a tutor. The worked examples
incorporate the schematic diagrams used in the problem-solving interface. Thus, they are designed to illustrate a
schema-based approach to problem solving. The lessons also include a set of 6-9 regular practice problems plus up
to 4 optional skill building practice problems.
Students receive hints and evaluative feedback designed to help them learn the problem solving skills
associated with each lesson. These hints are provided by a local evaluation component of TiPS. The TiPS system
also has a global evaluation component. These two components of evaluation are independent yet interrelated. The
local evaluation component is concerned with accurate diagnosis and feedback on a particular problem whereas the
global evaluation component is concerned with building a picture of overall problem solving competency and
behaviors of the student over time. The local evaluation component is provided by the cognitive diagnoser
described previously. The heart of the global evaluation component is represented by a Bayesian network. Both the
local and the global evaluation components work together. Based on its integrated Bayesian student model, TiPS is
capable of adapting the system to tailor its actions in several ways. It can adjust hints, evaluation scores, and other
feedback provided during local problem solving; it is capable of adjusting the mastery level communicated to the
student; and it may eliminate (indicated by grayed buttons) or add (indicated by active buttons) recommended
practice problems.
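TiPS's Bayesian network is not specified at the level of equations here, but the general idea of updating a mastery estimate from observed problem outcomes can be illustrated with a simple Bayesian update; the function and parameter values below are hypothetical and are not the TiPS implementation.

def update_mastery(p_mastery, correct, p_guess=0.2, p_slip=0.1):
    """One Bayesian update of the probability that a schema is mastered,
    given whether the latest practice problem was solved correctly.
    The guess and slip parameters are illustrative, not TiPS's."""
    if correct:
        likelihood_mastered = 1.0 - p_slip
        likelihood_unmastered = p_guess
    else:
        likelihood_mastered = p_slip
        likelihood_unmastered = 1.0 - p_guess
    numerator = p_mastery * likelihood_mastered
    denominator = numerator + (1.0 - p_mastery) * likelihood_unmastered
    return numerator / denominator

# Example: correct answers raise the mastery estimate, which could in turn
# gray out (skip) optional practice problems and adjust the mastery level shown.
p = 0.5
for outcome in [True, True, False, True]:
    p = update_mastery(p, outcome)
print(f"estimated mastery = {p:.2f}")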
Procedure
As previously mentioned, the graduate students interacted with the TiPS system while simulating the
behavior of users with a wide-range of ability levels and the progress through the system was systematically
observed. We started the process simulating students on the extremes of the ability continuum and observing how
TiPS adapts its instruction accordingly. For instance, we examined what happens if a “student” performs perfectly—
does that student get asked to solve every problem, get rapidly advanced to the “advanced” lessons or not. We also
examined what happens if a simulated student makes lots of errors, systematic (on a certain type of problem) or not.
We continued this process by introducing simulated users with less extreme ability levels until we documented
under what conditions the Bayesian inference network was able to adjust rapidly to the characteristics of whatever
student it encounters.
Results
The tutorial program was completed under six different circumstances with a focus on different parameters
for the various runs. The first two runs were designed simply to determine how the program reacts to students who
work all problems correctly versus students who make continuous errors. The third run focused on discrepancies in
scoring, primarily due to mislabeling, entering data in the wrong use of hints. The final two runs were conducted by
students and recorded using pcAnywhere software.
In sum, on the positive side, certain aspects of the Bayesian Student Model worked properly. TiPS did
consistently adjust the mastery level communicated to the student and eliminated (as indicated by grayed buttons)
and added (as indicated by active buttons) recommended problems and practice problems. This analysis also
suggested that TiPS was relatively tolerant to solutions that were conceptually accurate but deviated in some
superficial way from the “expert” solution presented by TiPS. On the negative side, the most glaring problem is the
fact that the rate and level of hint use does not appear to inform the Bayesian Student Model. Instead, a student
could use hints to complete the entire tutorial while, along the way, being required to solve the fewest problems
overall and achieving a perfect score, all without having to cogitate on a single problem.
these students was 18.10 (SD = 3.07). Of the 41 participants in this condition, 19 were Caucasian, 21 were
African American, and 1 described himself as “other.” Eleven of the 41 participants classified themselves
as lower-division undergraduates with the rest classifying themselves as upper-division undergraduates.
• 41 students (11 males and 30 females) that were eligible for TiPS but were not randomly selected to receive
TiPS treatment (i.e., eligible/untreated group). The average ACT math score for these students was 18.36
(SD = 3.18). Of the 41 participants in this condition, 19 were Caucasian, 20 were African American, and
2 described themselves as “other.” Thirteen of the 41 participants classified themselves as lower-division
undergraduates with the rest classifying themselves as upper-division undergraduates.
• 116 students (42 males and 73 females) in the comparison or untreated control group. The average ACT
math score for these students was 22.80 (SD = 5.03). Of the 116 participants in this condition, 89 were
Caucasian, 17 were African American, 1 was Hispanic, 2 were Asian American, and 7 described
themselves as “other.” Thirty-seven of the 116 participants classified themselves as lower-division
undergraduates with the rest classifying themselves as upper-division undergraduates.
Materials
The pencil-paper materials included a demographic questionnaire, a pretest, a posttest, and an affective
questionnaire. The demographic questionnaire asked each learner to provide background information.
To determine the effects of TiPS instruction on word problem performance, a pretest and posttest were administered
to each student immediately prior to and following instruction. The pretest and posttest were adapted from a set
developed by Derry and her students to evaluate the system. Based on prior research with TiPS (Wortham, 1996),
adult remedial learners typically enter TiPS instruction already performing well on one-step change, group, and
compare (additive) word problems, but not on one-step vary and function (multiplicative) word problems or on
multi-step problems involving both multiplicative and additive schemas. To obtain an instrument that would allow
one to measure the instructional impact and that could be completed in a reasonable time period, two ten-item tests
were created, each consisting of four one-step multiplicative word problems (involving the vary and function
schemas) and six multi-step word problems involving both multiplicative and additive schemas. These tests were
based on the tests used in two field studies described previously. Two equivalent forms (A and B) of a test were
developed, each consisting of four one-step, three two-step, and three three-step word problems. With respect to
mathematics operations, underlying concepts, sizes of numbers, and basic grammatical features such as sentence
structure, problems on the two forms were structurally isomorphic to one another. Statistical treatments were
designed to assess whether the instruments perform similarly. One single-step and one two-step problem on each test
were obtained from the National Assessment of Educational Progress (NAEP). Test administration was
counterbalanced so that half the students received form A prior to instruction and form B following instruction.
An affective questionnaire was also created that asked each participant to judge the effectiveness of the
instructional program. Specifically, the questionnaire consisted of a set of seven statements to which the participants
responded on a 5-option Likert-type scale from “I disagree” to “I agree”. The statements were: (1) “I have learned
to solve word problems based on this instruction”; (2) “Learning was fun”; (3) “I would prefer learning from TiPS
when I have to study ‘mathematized’ contents next time”; (4) “I felt curious”; (5) “The
examples and problems in TiPS helped me to understand word problems”; (6) “I was interested in learning about
word problems”; and (7) “The instruction in TiPS was well designed”.
Procedure
In order to identify at least thirty students that would fall below the cutoff, and thus be deemed eligible for
TiPS instruction, a large pool of participants was recruited as volunteers from remedial mathematics classes at
Mississippi State University. As previously mentioned, the selection of the cutoff was made on the basis of a pilot
study. All participants received cash payments for their participation based on their level of participation. During the
Pretest Session, the volunteers were requested to provide consent and then asked to complete the pretest.
Based on the results of the Pretest Session, individuals that scored below the cut-off score were identified and asked
to return for subsequent sessions. As previously mentioned, to help ensure that these individuals were sufficiently
motivated, we provided payments for work completed. Also, at this point, every effort was made to ensure that there
were no intervening relevant instructional experiences, inside or outside of the study.
For the TiPS-based treatment sessions, experimenters followed an invariant data collection protocol that
they were trained to employ. The steps in this protocol included: 1. Preliminary preparation (e.g., readying the
computer, ensuring students properly log into TiPS); 2. Administration of TiPS instruction; and 3. Administration of
posttest and affective questionnaire. Since the instruction was self-paced, the actual length of time necessary to
complete the data collection protocol varied across participants. To ensure that the individual sessions during step 2
did not get too long, the participants were not permitted to work with the system for more than an hour a day.
The students in the comparison group—that is, the students with scores above the cutoff on the pretest—
and the students in the eligible/no treatment group that participated in the Pretest Session were asked to return for
the Posttest Session in which the posttest was administered on either an individual or group basis. This session was
timed to coincide with the last session of the participants exposed to the TiPS system.
Scoring
The protocols generated on the pretest and on the posttest were coded for conceptual scores according to a
set of guidelines for analyzing the written problem-solving protocols derived from research by Derry and her
students (Atkinson & Derry, 2000; Derry, Weaver, Liou, Barker, & Salazar, 1991; Tookey & Derry, 1994). These
guidelines were designed to help gauge where the participant fell along a problem-comprehension continuum.
According to these guidelines, each item was awarded a conceptual score, ranging from 0 to 3, depending upon
the degree to which the participant’s solution is conceptually accurate. Thus, the overall score for both the pretest
and the posttest ranged from 0 to 30. The cutoff criterion was 40% or a score of 12 on the pretest.
One research assistant who was unaware or “blind” to the condition independently coded each protocol. To
validate the scoring system, two raters independently scored a random sample of 20% of the problem-solving
protocols and agreed on scoring 97% of the time. Discussion and common consent were used to resolve any
disagreement between coders. Once the pretests and posttests were scored, gain scores were calculated to
capture any pretest-to-posttest differences.
To create an average affective score, the participants’ responses to all of the questionnaire items were
coded on a scale of 1 to 5. The responses were then summed across all seven questions and
divided by seven, thereby generating an average response on the affective measure, with values ranging from 1 to 5.
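For readers who wish to reproduce this scoring scheme, the following sketch (not the authors' code) illustrates the computations just described—item totals, the 40% cutoff, gain scores, the averaged affective measure, and percent agreement between raters—using hypothetical column names for a table of participant responses.

    import pandas as pd

    def score_protocols(df: pd.DataFrame) -> pd.DataFrame:
        # Hypothetical columns: pre_item_1..pre_item_10 and post_item_1..post_item_10
        # hold the 0-3 conceptual scores; q1..q7 hold the 1-5 affective responses.
        pre_items = [f"pre_item_{i}" for i in range(1, 11)]
        post_items = [f"post_item_{i}" for i in range(1, 11)]
        affective_items = [f"q{i}" for i in range(1, 8)]

        scored = df.copy()
        scored["pretest"] = scored[pre_items].sum(axis=1)        # overall pretest score, 0-30
        scored["posttest"] = scored[post_items].sum(axis=1)      # overall posttest score, 0-30
        scored["eligible"] = scored["pretest"] < 12               # 40% cutoff = 12 of 30 points
        scored["gain"] = scored["posttest"] - scored["pretest"]   # pretest-to-posttest gain
        scored["affective"] = scored[affective_items].mean(axis=1)  # mean of the seven items, 1-5
        return scored

    def percent_agreement(rater_a: pd.Series, rater_b: pd.Series) -> float:
        # Simple percent agreement on the double-coded 20% subsample (reported above as 97%).
        return float((rater_a == rater_b).mean() * 100)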
Analysis
The unadjusted pretest and posttest scores for the three conditions appear in Table 1. To determine if there
were learning gains associated with TiPS, the analysis of problem-solving measures consisted of two complementary
alpha-controlled sets of analyses. First, an analysis of covariance (ANCOVA) was used for testing regression
discontinuity effects (Braden & Bryant, 1990; Cook & Campbell, 1979). We entered pretest scores as the covariate,
entered placement (TiPS group or comparison group) as the independent variable, and designated posttest scores as
the dependent variable in the ANCOVA. With this approach, the difference or discontinuity at the cutting point
between the regression surfaces in the two groups can be taken as evidence of a treatment effect. The interaction
between pretest and placement was also entered to test whether there was a difference in slope between the
two groups (i.e., homogeneity of regression lines).
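As an illustration only—the original analyses were run in SPSS—this ANCOVA-based regression discontinuity test could be expressed in Python with statsmodels roughly as follows; the data frame and its pretest, posttest, and group (0 = comparison, 1 = TiPS) columns are assumed.

    import statsmodels.formula.api as smf

    def rd_ancova(data):
        # Test homogeneity of regression slopes via the pretest-by-placement interaction.
        slopes = smf.ols("posttest ~ pretest * C(group)", data=data).fit()
        interaction_p = slopes.pvalues["pretest:C(group)[T.1]"]

        # With a common slope (non-significant interaction), the placement coefficient
        # estimates the discontinuity in posttest scores after adjusting for the pretest.
        ancova = smf.ols("posttest ~ pretest + C(group)", data=data).fit()
        effect = ancova.params["C(group)[T.1]"]
        return interaction_p, effect, ancova.pvalues["C(group)[T.1]"]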
Second, since it was possible for the ANCOVA to be misspecified such that the shape of the regression
surface was not properly modeled—for instance, if there were a curvilinear relationship between the pretest and the
posttest—we attempted to exactly specify the true model. Exactly specifying the true model yields unbiased and
efficient estimates of the treatment effect. Our general strategy was to begin by specifying a model that we were fairly
certain was overspecified. Although the treatment effect estimate for this initial model was likely to be unbiased, it
was also considered inefficient. Through successive analyses, we gradually removed higher-order terms until the
model diagnostics indicated that the model fit poorly. Specifically, the basic model specification analysis for RD
designs involves five steps: (1) transform the pretest, (2) examine the relationship between pretest and posttest visually,
(3) specify higher-order terms and interactions, (4) estimate the initial model, and (5) refine the model (Trochim, 2001).
We also focused our attention on the participants who interacted with the TiPS system in order to address
the remaining two objectives. For instance, we calculated the instructional time and the number of problems the
students solved during instruction. In addition, we examined the posttest questionnaire for evidence that the
participants were affected by the TiPS instruction.
Finally, we also examined the relationship between the students who were eligible—by scoring below the
cutoff criterion—to participate in the treatment portion of the study but were randomly selected not to participate in
the treatment (eligible/non-participants) and the students who were eligible and did participate in the treatment.
Each point in Figure 1 represents an individual student’s pretest and posttest scores. The vertical line that appears at the pretest score of
12 on the x-axis represents the cutoff criterion. The dashed lines that appear through the bivariate distributions on
both sides of the cutoff score are the regression lines associated with the TiPS group (on the left of the figure) and
the control group (on the right of the figure). On the basis of a visual inspection of Figure 1, one can perceive a
“jump” or discontinuity in the regression lines at the cutoff point. Specifically, it appears that—on average—the
points to the left of the cutoff (TiPS treatment group) have been raised by approximately 4 points on the posttest.
Although one might conclude from a visual inspection of Figure 1 that TiPS on average raised posttest performance
by 4 points on our scale, we wanted to confirm it by employing an ANCOVA to statistically test for the presence of a
regression discontinuity effect (Braden & Bryant, 1990; Cook & Campbell, 1979). First, we tested the posttest for
homogeneity of regression and the results were found to be non-significant—F < 1. Thus, we were able to conclude
that there was no difference in slope between the two groups.
According to the results of the ANCOVA, the adjusted mean scores associated with the posttest for
participants in the TiPS group (M = 20.24, SE = 1.05) were statistically significantly higher than those of their peers
in the control group (M = 16.47, SE = 0.50), F(1, 154) = 8.05, MSE = 20.65, p = .005. Cohen's d statistic for these data
yields an effect size estimate of .46, which corresponds to a medium effect. Overall, the results indicate a positive,
practical effect that can be attributed to the TiPS instruction.
[Figure 1 appears here: a scatterplot with PRETEST (0–30) on the x-axis and POSTTEST (0–30) on the y-axis, annotated “Discontinuity … at the cutoff”.]
Figure 1. Graph of the pretest scores and posttest scores of the TiPS and control participants. The TiPS group’s
regression line is represented by the dashed line on the left and the control group’s regression line is represented by
the dashed line on the right.
As previously mentioned, since it was possible for the ANCOVA to be misspecified (e.g., the shape of the
regression surface not being properly modeled due to a curvilinear relationship between the pretest and posttest), we
attempted to exactly specify the true model by following the steps outlined in the analysis section. We pursued this
model specification process because we felt that it would help ensure that we would not erroneously conclude that the
TiPS treatment made a difference when it in fact did not.
First, we regressed the posttest scores on the modified pretest (SPSS variable = “precut”), the treatment
variable (SPSS variable = “group”), the linear interaction (SPSS variable = “linint”), and higher-order transformations
including a quadratic term (SPSS variable = “quad”) and a quadratic interaction (SPSS variable = “quadint”).
The initial treatment effect estimate was 4.35 (SE = 1.95)—very close to our estimated treatment effect of 4
derived from our visual examination of Figure 1. However, there was also evidence that several of the higher-order
terms were not statistically significant and, thus, were not needed in the model. To refine the model, we dropped the
two quadratic terms.
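The variable construction and pruning sequence just described can be sketched as follows (again an illustration under assumed data structures, not the authors' SPSS syntax); centering the pretest at the cutoff lets the group coefficient estimate the discontinuity at the cutoff itself.

    import statsmodels.formula.api as smf

    def specify_rd_model(data, cutoff=12):
        d = data.copy()
        d["precut"] = d["pretest"] - cutoff     # (1) transform: center the pretest at the cutoff
        d["linint"] = d["precut"] * d["group"]  # linear pretest-by-treatment interaction
        d["quad"] = d["precut"] ** 2            # quadratic term
        d["quadint"] = d["quad"] * d["group"]   # quadratic interaction

        # Deliberately over-specified initial model (steps 3 and 4).
        initial = smf.ols("posttest ~ precut + group + linint + quad + quadint", data=d).fit()

        # Step 5: refine by dropping the non-significant quadratic terms.
        final = smf.ols("posttest ~ precut + group + linint", data=d).fit()
        return initial, final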
Final Model
In the final model, the treatment effect and SE were almost identical to the previous model and all of the
terms were statistically significant, indicating that this final model fit the data well and, thus, did not need any
further refinement. This also indicated that there was no evidence of a curvilinear relationship associated with the
bivariate pretest-posttest relationship. Instead, we were able to conclude that a straight-line model, such as the one
assumed in our aforementioned ANCOVA analysis, accurately captured these data. As evidence, this model—like the
other analysis—indicated that the TiPS treatment produced a statistically significant effect, t(154) = 2.837, p = .005.
In fact, the results of our ANCOVA and our model specification process were identical (in a two-group situation, t² =
F; thus, squaring our t-value of 2.837 yields 8.05, the F-value we obtained from our ANCOVA).
Beyond providing a basic evaluation of the overall TiPS system, we wanted to determine the amount of
instructional time involved with the use of the TiPS system. The average time for completing the TiPS computer
instruction was 3.84 hours (SD = 0.97), ranging from 1.83 to 6 hours (excluding test-taking time), during which
the participants completed an average of 64.28 problems (SD = 7.68) on the system (overall range from 54 to 80
problems). This instructional time result diverged from the results of the previous field trial, where it was found that
the average time for completing the TiPS computer instruction was 5.81 hours and ranged from 3.63 to 9.52 hours
(excluding test-taking time). However, unlike the field trial, which permitted students to work on TiPS in sessions
ranging from three to six hours, the participants in the present study were not permitted to work on TiPS for more
than an hour a day. This latter arrangement may have encouraged the students to use their time more
efficiently on the system.
With regard to our final objective, the affective nature of the TiPS learning experience, we examined how
the TiPS students responded to the affective questionnaire. In response to each of the following statements:
• “I have learned to solve word problems based on this instruction”, 21 out of 41 (51.2%) students agreed or
somewhat agreed.
• “Learning was fun”, 27 out of 41 (61%) students agreed or somewhat agreed.
• “I would prefer learning from TiPS when I have to study ‘mathematized’ contents next time”, 27 out of 41
(61%) students agreed or somewhat agreed.
• “I felt curious”, 23 out of 41 (56.1%) students agreed or somewhat agreed.
• “The examples and problems in TiPS helped me to understand word problems”, 32 out of 41 (78.1%)
students agreed or somewhat agreed.
• “I was interested in learning about word problems” 20 out of 41 (48.8%) students agreed or somewhat
agreed.
• “The instruction in TiPS was well designed”, 31 out of 41 (75.7%) students agreed or somewhat agreed.
participants in the TiPS condition and their peers in the eligible/no-treatment group, t(80) = 2.23, p = .029. Cohen's d
statistic for these data yields an effect size estimate of .50, which corresponds to a medium
effect. Again, this result indicates a positive, practical effect that can be attributed to the TiPS instruction, as
opposed to some intervening relevant instructional experiences (inside or outside of the study) to which perhaps all of the
students at Mississippi State who performed below the cutoff criterion were exposed during the course of this
study.
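The comparison reported in this paragraph—a two-sample test with an accompanying Cohen's d—could be computed along the following lines; this is only a sketch, with hypothetical array names for the two groups' scores and the conventional pooled-standard-deviation form of d.

    import numpy as np
    from scipy import stats

    def compare_eligible_groups(tips_scores: np.ndarray, untreated_scores: np.ndarray):
        # Independent-samples t test on the outcome scores of the two eligible groups.
        t, p = stats.ttest_ind(tips_scores, untreated_scores)

        # Cohen's d: mean difference divided by the pooled standard deviation.
        n1, n2 = len(tips_scores), len(untreated_scores)
        pooled_var = ((n1 - 1) * tips_scores.var(ddof=1) +
                      (n2 - 1) * untreated_scores.var(ddof=1)) / (n1 + n2 - 2)
        d = (tips_scores.mean() - untreated_scores.mean()) / np.sqrt(pooled_var)
        return t, p, d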
It is also worth noting that, according to the results of an ANCOVA, the adjusted mean scores associated
with the posttest for participants in the eligible/untreated group (M = 18.09, SE = 0.97) were not statistically
different from those of their peers in the untreated control group (M = 16.47, SE = 0.50), F(1, 154) = 8.05, MSE =
20.65, p = .29. Moreover, our attempts to specify an analytic regression model in this case did not produce
statistically significant results (see below).
Taken together, this implies that none of the 124 students who scored below the cutoff criterion would have
been able to produce (statistically) significantly higher posttest scores, after adjusting for pretest performance, than
their peers in the untreated control group without the targeted intervention provided by TiPS.
Conclusions
In sum, it is apparent from the evidence compiled during the present project that remedial learners
engaged in mathematical thinking can benefit on a variety of cognitive (i.e., transfer) and affective measures by
working within TiPS, a computer-based learning environment designed to develop learners’ problem-solving skills.
In particular, we empirically documented that learners who spent an average of four hours on the
TiPS system were typically rewarded with a 15% improvement—a 1½ letter grade improvement, by conventional
standards—in their problem-solving performance. We attribute this effect to the collection of features inherent to
TiPS, including: (a) instruction within the context of problem-solving scenarios, designed to gradually build the
learners’ skills and abilities that should enable them to reason about complex real-world problems, (b) auxiliary
representations depicted in its problem-solving interface to help learners model and solve problem situations, (c)
worked examples to provide the learners with an expert’s solution, which they can use as a model for their own
problem solving, and (d) the Bayesian student modeler that monitors the learners’ performance and adapts
instructional delivery to meet their needs. In addition to the enhanced problem-solving performance, the learners
exposed to TiPS reported feeling that the instructional material, including the examples and problems, helped them
to understand how to approach and solve word problems and that, overall, the instructional environment was well
designed.
References
Atkinson, R. K., & Derry, S. J. (2000). Computer-based examples designed to encourage optimal example
processing: A study examining the impact of sequentially presented, subgoal-oriented worked examples. In
B. Fishman & S. F. O’Connor-Divelbiss (Eds.), Proceedings of the Fourth International Conference of Learning
Sciences (pp. 132-133). Hillsdale, NJ: Erlbaum.
Braden, J. P., & Bryant, T. J. (1990). Regression discontinuity designs: Application for school psychologists.
School Psychology Review, 19, 232-240.
Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design and analysis issues for field settings.
Chicago, IL: Rand McNally.
Derry, S. J., Tookey, K., Smith, C., Potts, M. K., Wortham, D. W., & Michailidi, A. (1994). Psychological
foundations of the TiPS system: A handbook for system 1.0 (Technical Report). Madison, WI: Wisconsin
Center for Education Research.
Derry, S., Weaver, G., Liou, Y., Barker, J., & Salazar, E. (1991). Inducing flexible transfer in novice problem
solvers: Effects of three instructional approaches. Unpublished manuscript.
Derry, S. J., Wortham, D., Webb, D., & Jiang, N. (1996). Tutorials in problem solving (TiPS): Toward a schema-
based approach to instructional tool design. Paper presented at the annual meeting of the American
Educational Research Association, New York, NY.
Marshall, S. P. (1995). Schemas in problem solving. Cambridge, England: Cambridge University Press.
Tookey, K. R., & Derry, S. J. (1994). Arithmetic schema and strategy instruction: A study of two problem-solving
approaches. (ONR Technical Report 94-2) Madison, WI: Wisconsin Center for Education Research.
Trochim, W. (2001). Research methods knowledge base. Cincinnati, OH: Atomic Dog Publishing.
Wortham, D. W. (1996). Testing a learning environment based on a semantic analysis of math problems. Master's
Thesis, Department of Educational Psychology, University of Wisconsin-Madison.
Blended Pedagogy Research:
Pathways for Developing Problem-Solving Reflective Practitioners
Maria D. Avgerinou
DePaul University
Kathleen Hanlon
Dominican University
Abstract
The purpose of this study was to track and share the journey that teachers make as they evolve as
professionals. Pre-service and in-service teachers were required to participate in reflective practice
activities. Continua of analysis from paper and pencil to face-to-face to online discussions; pre-service entry level
to veteran professionals; loosely structured to highly structured requirements for reflection; and informal to formal
goals for reflective projects are examined via the case studies of four teacher education professionals and the
problem-based field experiences of their students. Research findings indicate that custom design of reflection
opportunities is the best choice for teacher education professors.
Effective teachers engage in reflective practice. The Interstate New Teacher Assessment and Support
Consortium’s (INTASC, 1991, p. 31) ninth core standard for teachers is worded: “The teacher is a reflective
practitioner who continually evaluates the effects of his/her choices and actions on others (students, parents, and
other professionals in the learning community) and who actively seeks out opportunities to grow professionally.”
Mewborn (1999) argued that pre-service teachers need time to learn and practice reflective skills in a non-evaluative
environment. Bullough and Baughman (1997) asserted that the first five to seven years of teaching careers constitute
the novice period; these years should be marked by ongoing reflection, typically in the form of journaling: diaries,
notebooks, dialogues, integrative entries, and evaluative entries (Sileo, Prater, Luckner & Rhine, 1998).
Experienced teachers also benefit from ongoing reflection in similar formats (Bean & Stevens, 2002). For pre-
service or in-service teachers who are reflecting on their teaching practice as they do it, not simply reflecting on a
past experience, reflection typically leads to the solution of specific practical problems (Smith & Hatton, 1993).
Styler and Philleo (2003) recommended the use of technology to enhance reflective journaling. Whipp (2003)
reported on research about teacher candidates engaging in field experiences in urban middle schools in which
teacher candidates engaged in increasingly higher levels of reflection because of online discussions. Rodgers (2002)
proposed four phases in the process of reflection, asserting that reflecting on action becomes practice for the
reflection in action, necessary for teachers who must make decisions and responses on the spot repeatedly
throughout each teaching day.
and in-service teachers, and demonstrated where the intersections along the identified continua occur.
dynamics and communication; their thoughts, ideas and/or suggestions concerning the literature, their research
project, the instructional modus operandi of the class, etc. Those journals were posted in the group Discussion
Board, that is, an open forum that fosters and supports a collaborative community of learners. Student postings
remained published in the Discussion Board area until the end of the course.
Reflection Goals and Associated Levels of Reflection: The instructor identified four levels of reflection
(Cranton & King, 2003; Mezirow, 1991) which subsequently served as the theoretical framework of the reflective
activities: (a) Content Reflection (description of problem/examination of content); (b) Process Reflection
(evaluating the problem-solving strategies one uses toward resolution of a problem); (c) Premise Reflection
(questioning the problem itself); and (d) Critical Self-Reflection (trying to comprehend why we are doing what we are
doing). These levels of reflection evidently shaped and became inextricably linked to the entire instructional design
of the course, with particular emphasis on the identified reflection goals, their implementation and assessment. Through
their engagement in reflective practice activities within the context of this class, students were essentially expected
to demonstrate active and critical engagement with the literature on educational research methodology, its findings,
and applications to their own educational context (Content, Premise, & Process); to establish and maintain commitment
toward improving their own collaborative learning skills (Process, & Critical Self-Reflection); to systematically
expand their understanding of instructional technology applications and their potential for research and
communication purposes (Process, Premise, & Critical Self-Reflection); to begin developing action research skills
(Content, Premise, Process, and Critical Self-Reflection); and, to improve their practice as educators through self-
examination and analysis (Critical Self-Reflection).
Reflection Assessment: The online reflective journal carried a weighting of 30% of the student’s overall
grade. The journal was assessed for clarity, organization, and content (depth of analysis).
Data Analysis: In order to address the research questions of this study, a mixed method triangulation design
was adopted. Content Analysis was performed on data (text) collected through the Online Reflection Journals,
Student Formative Evaluation Reports, the Reflective Piece in Student Final Research Papers, and, Student
Summative Evaluation Reports on Faculty and Course. Emerging response patterns & themes were compared and
cross-checked. Furthermore, Descriptive Statistics were generated from (a) the Course and Faculty Ratings yielded
through Student Summative Evaluation Reports, but also from (b) the Grades students earned for their Online
Reflection Journals, as well as their Final Research Papers. Although Inferential Statistical analysis was seriously
considered, it was finally deemed more appropriate for a later stage of the study.
Results yielded through the data analysis procedures were compared; their common threads are
identified and discussed later in this paper in relation to the research questions posed.
Board: 10% was allotted for the quality of their individual reflective work and 10% for the quality of their peer
interaction. Rubrics were provided to teacher candidates for the assessment of both the quality of their reflective
responses and the quality of their peer responses as were basic guidelines for feedback and use of Blackboard.
Teacher candidates assessed issues/concerns at the mid-point orally in class. Teacher candidates completed a survey
at the end of the course which addressed the use of technology, group interaction online, nature of reflective
responses and ways to improve the project.
graduate and undergraduate groupings to examine similarities and differences of reflective abilities between these
groups. Finally, data was organized along the continuum of scaffolding to determine whether candidate growth was
influenced by the addition or omission of prompts.
reflective level and 11% or fewer of their comments in each month were simply descriptive. #4. How can more
effective strategies be developed, and how can the conditions for encouraging reflective practice be improved? At
least a third of their reflective comments were analytical in nature. This correlated well with the average score of
3.9 (scale of four) on the indicator of “reflects on one’s practice to improve instruction and guide professional
growth” of a special educator standard. The level of support that they offered to one another was also high. They
provided advice and offered suggestions to solve problems as well as praised one another’s efforts. Approximately
half of the analysis reflections were positive evaluations of a colleague’s strategies or tenacity.
focus/problem, questioning the theoretical foundation and applications of action research, questioning aspects of the
instructional design of the class, and; (d) Critical Self-Reflection: questioning their ability to conduct research (early
in the course); seeing and accepting themselves as teacher/researchers in the research class but also in their own
teaching context (as the course progressed), understanding their potential as agents of educational change (end of
course), and questioning their teaching philosophy and methods.
#3. What factors seem to be important in fostering this development? Basic technology knowledge and
skills seemed to be an important factor toward this development. Although at the outset of the course a small
percentage of students (5%) came to experience a steep learning curve technology-wise, at the end they all appeared
to not only have become confident in using Blackboard’s conferences, but they also seemed to endorse the great
educational potential of the platform. A critical factor in changing student attitude about online assignments was the
instructor’s constant help and support toward familiarizing the students with the technology, but also the students’
own perception about the level of their cohort’s (and later on, research group’s) cohesiveness. Since the students
were used to a co-operative learning, face-to-face class format, they wanted to extend this opportunity online through
Blackboard’s discussion board. As a result, when weekly reflections were produced and shared through the online
conferences, the students would experience comfort, affirmation, and reassurance for a variety of reasons, the most
important one being the realization that: (a) the nature of everybody’s project was evolutionary; (b) other students
were experiencing similar situations; (c) their project was on the right track; (d) their group was very supportive; (e)
they could share their newly found interest in research; and (f) it was okay to feel frustrated and overwhelmed.
From the teacher’s perspective, cooperative online learning proved consistent with the constructivist course
framework primarily due to the role of social interaction. According to Eggen and Kauchak (2001) learners co-
construct more powerful understandings than individuals can construct alone. This was particularly true in this case,
as students not only were complete novices with regard to the course content but, as is often the case with teachers and
educational research (Labaree, 2003), had entered the course with negative preconceptions about educational
research, as well as about their ability to engage with any aspect of it. Nevertheless, cooperative learning online did not
allow the student biases and/or sentiments of stress and dejection to grow and further taint student understanding
of research. To the contrary, the students were able to transfer online their sense of belonging to this community of
learners who explored new ground and eventually built together a solid understanding of the theories and concepts
of educational research. And, perhaps most importantly, through this constructive learning environment students
gained confidence and pride in their own research abilities. Besides, as Avgerinou and Carter (2005) point out “In
keeping with the philosophy of action research, collaboration within a learning community is one of the fundamental
skills that need to be developed signifying that the researcher has been fully immersed in the process” (p. 27).
An important aspect of this introductory research course was the requirement that students apply
critical thinking and problem-solving skills in identifying and selecting an action research focus or problem, upon
which they also had to conduct a preliminary literature review. In other words, not only did students need to identify
their focus of investigation, but they also had to seek, find, and evaluate various information sources that related to
it. Daunting as this task may seem, especially when required of novice researchers in such a short period of time, all
students accomplished this goal promptly and successfully. Data analysis results confirmed that students’
explanations of why and in what ways they were able to produce high-quality coursework were
powerfully correlated with their weekly reflections. As stated above, students were strongly encouraged to reflect
upon, and evaluate, their progress throughout the life of the course. Moreover, the fact that weekly reflections were
an integral part of the course assessment increased student motivation to do well in the course. This aspect of the
course set-up apparently allowed them to pace the work accordingly, create realistic deadlines, and meet them
successfully. Students also reported that frequent and constructive feedback provided by the instructor, as well as
her expertise, passion and enthusiasm for action research, were great motivators for them to meet the instructor’s
high expectations particularly as far as practicing self-reflection was concerned. Interestingly, students identified as
one of the course strengths that they were constantly challenged to question assumptions, and to make sound
research decisions.
#4. How can more effective strategies be developed, and how can the conditions for encouraging reflective
practice be improved? A few students commented that online written reflections in combination with oral
reflections at a relaxed, not-class-like setting, would have helped them improve further their reflective ability. This
is an important recommendation especially as far as the identified conditions are concerned. Apparently, a rigid
classroom setting which does not allow for co-operative group work and open discussions, does not lend itself
readily to articulating oral reflections, let alone sharing them with anyone else other than the instructor. Conversely,
a truly constructivist, cooperative, dialogue-based classroom is an ideal environment for student debating and
critiquing content and process aspects of the course, as well as their own role and growth in it.
Professor B: Case Study Conclusions
In conclusion, the preliminary results of this study seem to indicate that the online reflective journals were
successful as a vehicle for propelling, supporting, and enhancing students’ understanding and growth in both their own
teaching practice and their action research skills. All levels of reflection, namely Content, Process, Premise, and Self-
Reflection, were demonstrated by the overwhelming majority of the participating students at various points in the
course. Basic technology literacy and skills, as well as sound pedagogical appreciation of both f2f and online
manifestations of co-operative learning and the overarching constructivist framework of the class, seemed to be the
most important factors in fostering this development. Integration of weekly reflections into the course assessment
policy, but also continuous feedback by the instructor, were also reported by the students as strong motivators
toward practicing and further advancing their reflective skills. Finally, students indicated that online written
reflections, when followed by oral reflections in a relaxed, not-class-like setting, could further improve their
reflective ability as it relates not only to their research project but also to their daily teaching practice.
focus attention during tutoring/observation.
The instructor rated the quality of teacher candidate reflective work as less thoughtful and less critical than
did many of the candidates. Late posting and not posting reflections for the group to respond to was the most
frustrating aspect of the small group activity. One field site where a large number of candidates were assigned was
often critiqued as problematic for a number of reasons. Participants recommended that this site not be used for
tutoring in the future.
Forces that may contribute to strong reflective work and positive change include: increasing comfort with
electronic medium, positive impact of non-judgmental peer support and feedback, models of reflection provided by
group members, structured prompts that provide focus and the practice of reflection over time, strong writing skills.
In similar fashion, forces that may have contributed to consistently weak reflective work include: preference for face-
to-face interaction and discussion, comfort or familiarity with medium, personal access to a computer, lack of
instructor feedback or accountability, mismatch between site context and reflective expectations, group dynamics,
limiting prompts and weak writing skills.
acquisition of professional competencies. One, in fact, failed to complete the student teaching experience.
Those who progressed through the stages of the reflective cycle were able to make the paradigm shift to
seeing their teaching as a response to student learning rather than a cause of student learning. Similarly, those
candidates who grew in reflective abilities as the semester progressed also increased the frequency and depth of their
online communications. While a number of participants indicated the benefits of face-to-face discussions in seminar
sessions, the opportunity to collaborate asynchronously between seminar sessions was seen as providing the
continuing support of a community of learners.
Data indicating that the paper/pencil exercises failed to support candidate progression through the reflective
cycle raises questions regarding the benefit of such a time-consuming exercise. In addition, this same data raises
questions regarding the benefits of scaffolding provided by the supervisor versus the scaffolding provided by peers.
The frequency and depth of small group asynchronous discussions casts doubt regarding the benefits of the grade
level and large group discussion board activities. Finally, exit survey comments regarding the benefits of reflection
indicate a need to provide better instruction and more opportunities for reflection in a community of learners during
this important phase of teacher preparation.
General Conclusions
In addition, there are some overarching conclusions. First, taking teachers seriously at every level of
preparation (in-service and pre-service) and permitting choice as to when to reflect as well as some choice in the
topics or questions for reflection appears to be critical. This is in accord with the work of Day (1999) who noted the
need for teachers to be permitted to develop as professionals on the basis of their concerns, taking into account the
moral purposes of teachers. This is true because teaching is a profession in which feelings and emotions play an
essential role (Hargreaves, 1998). The prompts all provided teachers and teacher candidates with the stimulus to
reflect on and include their emotions about their experiences.
It also appears that it is critical for those who reflect to receive a response. In some of the cases described,
the responses came from the professors. In all of the cases described, responses came from colleagues/classmates in
the context. It seemed important that ideas were “heard” and understood. Further, it appeared that writers tended to
extend their own thinking in response to those who commented on their original postings. Teaching and learning
have a social component and are not isolated acts.
In particular, custom design of reflection opportunities appears to be the best choice for teacher education
professors. This is especially true when those custom designs are based on the instructional design (particularly
objectives) and delivery of the course, strengths as well as specific needs of the teacher candidates and their
instructors, and the ongoing call for thoughtful reflection in a “people-based” profession where infinite variables
continue to influence effectiveness.
References
Avgerinou, M.D., & Carter, C. (2005, April). Blended collaborative learning as a vehicle for developing in-service
teachers’ action research skills. Manuscript submitted for publication.
Bean, T., & Stevens, L. P. (2002). Scaffolding reflection for pre-service and in-service teachers. Reflective
Practice, 3(2), 205-218.
Bullough, R. & Baughman, K. (1997). First year teacher eight years later: An inquiry into teacher development.
New York, NY: Teachers College Press.
Cranton, P., & King, K.P. (2003). Transformative learning as a professional development goal. New Directions in
Adult and Continuing Education, 98, 31-37.
Day, C. (1999). Developing teachers: The challenges of lifelong learning. London/Philadelphia: Falmer Press.
Dewey, J. (1944). Democracy and education. New York: Free Press. (Original work published 1916)
Eggen, P., & Kauchak, D. (2001). Educational psychology: Windows on classrooms (5th ed.). Upper Saddle River,
NJ: Merrill Prentice Hall.
Hargreaves, A. (1998). The emotional practice of teaching. Teaching and Teacher Education, 14(8), 835-854.
INTASC (1991). Model standards for beginning teacher licensing and development, Interstate New Teacher
Assessment and Support Consortium. Washington, DC: Council of Chief State School Officers.
Labaree, D.F. (2003). The peculiar problems of preparing educational researchers. Educational Researcher, 32(4),
13-22.
Mewborn, D. (1999). Reflective thinking among pre-service elementary mathematics teachers. Journal for Research
in Mathematics Education, 30(3), 316-341.
Mezirow, J. (1991). Transformative dimensions of adult learning. San Francisco, CA: Jossey-Bass.
Miles, M. & Huberman, A. M. (1994). Qualitative data analysis. Thousand Oaks, CA: Sage Publications.
Rodgers, C. R. (2002). Seeing student learning: Teacher change and the role of reflection. Harvard Educational
Review, 72(2), 230-253.
Smith, D., & Hatton, N. (1993). Reflection in teacher education: A study in progress. Education Research and
Perspectives, 20(1), 13-23.
Sileo, T. W., Prater, M. A., Luckner, J. L., & Rhine, B. (1998). Strategies to facilitate pre-service teachers’ active
involvement in learning. Teacher Education and Special Education, 21(3), 187-204.
Styler, G. M., & Philleo, T. (2003). Blogging and blogspots: An alternative format for encouraging reflective
practice among pre-service teachers. Education, 123(4), 789-797.
Whipp, J.L. (2003). Scaffolding critical reflection in online discussions: Helping prospective teachers think deeply
about field experiences in urban schools. Journal of Teacher Education 54(4), 321-333.
Readiness and Willingness of the Learners in Traditional Distance Higher
Education Programs of Turkey for Online Learning
Abstract
This presentation reveals the results of a study in which the readiness and willingness of learners
in the traditional distance higher education programs of Turkey for online learning were investigated. It might
be especially beneficial for those who would like to learn about ways of integrating online technologies into traditional
distance learning courses and about the kinds of online learning opportunities Anadolu University provides to its nearly
1,200,000 distance learners.
Introduction
Although controversies over the effectiveness of online distance education continue, the number of online
programs and courses has increased dramatically over the last decade. Almost all higher education institutions have
launched or started to consider launching online distance education programs, mainly in order to reach
more students at lower cost (Duffy & Kirkley, 2004).
On the other hand, the effectiveness of online distance education programs relies on a number of variables.
Learners’ readiness and willingness are usually listed among these variables. Experts agree on the importance of
learners’ readiness for online learning. For instance, Guglielmino and Guglielmino (2003) state that learners’
readiness determines whether an online learning initiative becomes an efficient, effective, and economical approach or a
frustrating, time- and resource-wasting attempt. Since learners’ readiness is a crucial factor, many online learning
providers try to determine the likelihood of prospective students’ success in order not to fail in their initiative.
The majority of these efforts involve asking students to take a pretest—usually in an online form—to see if they are likely
to fit in an online program. These tests commonly consist of items that aim to measure learners’ access to
computers and the Internet and their computer/Internet competences.
However, access and technology experience are not the only factors that can be used as predictors of success
in online learning. Hiltz and Shea (2005) note that what the learners bring to the learning situation heavily influences
success in any mode of learning, and there is evidence that learners with high motivation, greater self-regulating
behavior, and confidence in their ability to use computers and to learn online do better than those who lack these
characteristics. Self-efficacy is one of the characteristics that has been considered as a construct to predict success in
online learning.
Albert Bandura introduced self-efficacy theory to the scientific community. Bandura (1978) considers self-
efficacy to be “a judgment of one’s ability to execute a particular behavior pattern” (p. 240). According to this
definition, self-efficacy beliefs play a central role in the regulatory process through which an individual’s motivation
and performance attainments are governed (Wood & Bandura, 1989). In addition, self-efficacy judgments or
beliefs determine how much effort people will spend on a task and how long they will persist with it. Studies have
revealed that self-efficacy is a strong predictor of academic performance in traditional face-to-face classrooms.
Multon, Brown, and Lent (1991) reviewed a list of studies that examined self-efficacy in achievement situations.
They found that self-efficacy beliefs were positively related to academic performance. Similarly, Ames (1984) and
Nicholls and Miller (1994) found that students' self-perceptions of ability were positively related to achievement and
student motivation. An increasing number of researchers (e.g., Joo, Bong & Choi, 2000; Lee, 2002; Pajeres, 1996;
Schunk, 1994) have been interested in the roles and effects of self-efficacy in online learning. The results of these
studies are mixed. Some studies, such as Lim (2001), suggest a positive relationship between computer self-efficacy and
success and/or satisfaction in online learning, while others, such as DeTure (2004), conclude that online
technologies self-efficacy is a poor predictor of learner success in online distance education courses.
Purpose and Research Questions
This study intended to examine the readiness and willingness for online learning of learners who study in traditional
distance programs in Turkey. The research questions were formulated as follows:
1. What is the extent of the learners’ readiness for online learning according to their self-efficacy levels for
using online tools?
2. What is the extent of the learners’ willingness for taking their courses completely online?
3. How often do the learners use online support services provided by Anadolu University?
4. Is there a difference between female and male learners’ readiness and willingness for online learning?
Methodology
This study was conducted at Anadolu University, which is well known for its distance programs. A three-part
questionnaire was used to collect data from the learners who voluntarily participate in the face-to-face pedagogical
support service provided by the university during after-work hours. The following section includes details about the
participants, the setting, and the instrument.
Participants
Anadolu University provides several support services to its distance students. The face-to-face, after-work-
hours classes are considered the essential pedagogical support service of the University. According to recent
figures, approximately 20,000 learners use this service throughout the country. In Eskisehir, where the University is
located, around 1,500 learners join these classes every year (Cekerol, 2005). This study was intended to collect data
from those learners attending the Anadolu University Open Education Faculty evening classes in Eskisehir. A total
of 1,120 questionnaires were distributed to the learners, but only 269 agreed to take part in the study (24 percent).
Of these participating learners, 161 (60 percent) were female and 104 (40 percent) were male; gender data were
missing for 4 participants.
Setting
This study was conducted at Anadolu University. According to the World Bank, Anadolu University is the
world’s largest university in terms of its student body (Potashnick & Capper, 1998). The University is not actually an
open university; it has a dual-mode education system. On-campus education is offered through its 9 colleges (or
faculties; “faculty” is the term used in Turkey instead of “college” or “school”), 10 vocational schools, 18 research
centers and the state conservatory (school of music and theatrical acting). The distance education programs are
organized under three faculties: Open Education, Business Administration, and Economics.
Anadolu University was established in 1981 from an older institution, the Eskisehir Academy of
Economics and Commercial Sciences (EAECS). In accordance with the Higher Education Act of 1981, it was also
authorized to provide distance education in Turkey on a national scale. As a result in 1982 the former Faculty of
Communication Sciences of the EAECS was transformed to become the Faculty of Open Education, or, as it is
called commonly, the Open Education Faculty (OEF). This faculty was an outgrowth of the newly established
Anadolu University because at that time, it was the only institution that had experience in the technical and
theoretical aspects of distance education. The first educational television pilot project of Turkey was undertaken
here during the 1970's under the auspices of the Educational Television department of the EAECS (McIsaac,
Murphy & Demiray, 1988).
In the 1982-1983 Academic Year, the OEF started to offer two, four year undergraduate distance education
degree programs in Business Administration and Economics. That year 29,478 students enrolled in the programs. By
2004-2005, the number of enrolled distance students at Anadolu University reached approximately 1 million. Today,
the OEF, along with the other two distance education faculties, offers 8 different BA degree and 22 associate
degree programs to students in Turkey, the Northern Cyprus Turkish Republic and some European countries
such as Germany, the Netherlands, and France. The programs vary from Business Administration to Pre-school Teacher
Education. Recent figures show that the majority of the distance learners of the University have jobs (78 percent).
Among these students 30 percent live in villages and small towns, 62 percent are over 24 years old, and 45 percent
are married. Moreover, 40 percent of them are female.
The distance programs of Anadolu University are primarily textbook-based and require self-study. In other
words, students are expected to study their textbooks at their own pace, alone, and to take scheduled centralized
exams administered at remote locations. Textbook-based instruction is also supported with several services
including broadcast television programs aired by a state channel throughout the country, video and radio programs
distributed on cassettes, CDs or DVDs, remote evening classes, and computer-supported learning environments. The
rationale behind this sort of instructional approach is common to all open and distance learning initiatives in
emerging countries. These are based on (1) reaching as many learners as possible in cost-effective ways, and (2)
providing alternatives for learners’ limited access to other technologies, including VCRs, computers, and even
television broadcasts. Figure 1 reveals that distance learning is a necessity for Turkey rather than a convenience
owing to the shortage of higher education institutions and the increasing demand for education. Since printing and
mailing do not cost as much as advanced technologies, Anadolu University is able to accept thousands of learners
every year into its programs. In addition, recent figures show that the majority of distance students cannot access
computers and other technologies, despite the improvements in technology distribution. For instance quite a number
(30 percent) of the current distance learners of the University live in rural areas where they have difficulties
receiving television broadcasts, especially the channel that airs the University’s programs. Also, the percentage of
students who own a computer and have Internet connection at home is even lower. This situation is related to the
home computing ratio in Turkey. In general Figures show that only around 12.5 percent of the population has
computers at home and only 7 percent have an internet connection (TUBITAK, 2000). Although the number of
students who are able to access the Internet at work or in Cyber Cafés is growing, students are having difficulties
(such as heavy work conditions and high costs) using the Internet for learning. Thus, the majority of Anadolu
University’s distance programs are still textbook-based. The number of learners in online (only 2 percent of all
learners) and hybrid (10 percent) programs is quite limited despite the improvements.
Currently there are about 400 textbooks used in the programs. All are designed and produced in-house. The
University has modern printing facilities where these textbooks and other print materials are printed. The total
number of textbooks printed at the beginning of 2004-2005 academic year was more than 3 million copies.
Anadolu University also has its own television program production and broadcasting facilities. Around
4,300 television programs have been designed and produced in the Educational Television Centre (ETV) since it
was first established in 1982. The centre, supported by editing and post-production units, has two production
studios and a mobile production vehicle. Currently 165 technical and administrative staff are employed in the centre.
The University recently launched its own television channel, entitled TVA. However, it is a local channel and is
not eligible to air the programs according to the legislation. So, the University broadcasts nationwide six hours of
programming every day (around 900 hours of broadcasting every academic year) on Channel 4 of the Turkish
Radio and Television Corporation (TRT4). Learners in Turkey may also acquire videocassette or VCD/DVD
formats of these programs for a minimal shipping charge, while those in Europe get this service free
because they cannot watch TRT4 where they live. In addition, Anadolu University, in collaboration with
TRT4, offers live broadcasts three times a year just before the centralized exams. During those live
broadcasts, learners may reach the instructors in the studios via phone and ask their questions.
For certain courses, academic support is also provided via face-to-face lecture sessions. The University has
agreements with local universities (currently 38 universities) in 59 provinces of the country to hire their personnel
and facilities to offer these lectures to its learners. The lectures are given during the evenings (after work hours) and
weekends. Every year approximately 20,000 learners regularly attend those lectures.
Student success is determined by multiple choice tests. Each academic year, a mid-term, a final and a
make-up exam are centrally administered to the students to evaluate their performance in the courses. The weights
of these tests for the final grade are 30 percent and 70 percent. An average score of 50% is required in order to
"pass" a course. The students who fail are given an opportunity to recover their final test score at a make up exam.
Exam papers are graded by computer and the results are delivered either by mail or through the Internet. The Centre
for Research in Testing of the University is responsible for the preparation and maintenance of a question data bank
for the exams. Tests are prepared at this centre by a joint committee of authors/editors, field experts, technical
consultants and scientific assessment specialists. Those scheduled exams are administered in 88 provinces in Turkey
and 11 centers in Europe. The University usually uses 55,000 classrooms in 4,000 buildings and hires 50,000
personnel (local teachers, school staff and administrators, transporters, etc.) to administer the exams.
Furthermore, Anadolu University provides administrative support to its distance learners through its 84
offices in 77 provinces of the country. Those offices are run by the University’s own staff (a total of 335 staff) and
almost all the properties of the offices are owned by the University. In addition, learners may reach the University
via email and phones to receive help for their administrative and technical problems. In terms of social support, the
University encourages the learners to attend graduation ceremonies and local events organized by the administrative
offices. Moreover the University has an online weekly newspaper that gives news and recent developments in the
University.
On the other hand, critics of Anadolu University’s distance programs essentially focus on its centralized
structure and the number of the learners (e.g. Cagiltay, 2001). The assessment system, fixed programs, lack of
interaction, and widely felt sense of isolation are the most frequently criticized aspects of the programs. Also, the
credit given to the distance learning programs is still low in Turkey. Askar (2005) reports the results of a study in
which she interviewed learners and faculty members about distance learning. She found that both learners and faculty
regard distance learning as a chance for those who have no other options, and that it is not an alternative but a
supplement to conventional universities. Moreover, high dropout rates, a common characteristic of distance
education, also prevail in Anadolu University's distance programs. The overall dropout rate of the system is
40 percent, and most dropouts occur in the first year of study.
Anadolu University takes these and similar criticisms into consideration and so has always been searching for
ways to bring new technology into its programs. Since the early 1990s, the university has been trying to integrate
computers and computer-based technologies into its distance programs. As a result of this continued search,
authentic ways of using technology for distance learning have been generated over time. For instance, the
university has been providing its distance learners the opportunity to email their questions to the content experts.
Additionally, since the early 1990s, the learners have been able to use multimedia programs produced by the Computer-
Based Instruction Centre of the University. During the first years, these programs were delivered via CDs, but
currently they are available on the Internet to all learners who have valid student IDs. Furthermore, since 2000 the
learners have been able to take online self-tests in order to learn how ready they are for the exams.
Instrumentation
A three-part questionnaire was developed to gather data about the research questions. The first part
included questions about the demographic characteristics of the learners. The second part addressed both the frequency
of the learners’ use of online services and their willingness to take their courses online. The last part contained the
Self-efficacy for Online Technologies instrument originally created by Miltiadu. The Self-efficacy for Online
Technologies instrument items were divided into four categories, each of which represented one subscale: (a) web
surfing, (b) synchronous interaction, (c) asynchronous one-to-one interaction, and (d) asynchronous one-to-many
interaction. The instrument was first adapted and translated into Turkish. It was then checked and corrected by one
language expert and two educational technology experts; the latter two had graduate degrees earned in the United
States and thus had sufficient English competency along with knowledge of the field of educational technology. No
data were available concerning the reliability of the Turkish version of the instrument prior to the study; the reliability
coefficient was, however, calculated after the administration of the instrument. The Likert-type instrument consisted
of 30 items. Each statement was preceded by the phrase “I feel confident…” For each item, students were asked to
indicate their attitude from “Strongly Disagree”, “Disagree”, “Neutral”, “Agree”, to “Strongly Agree.” A mean
score of 3.41 was identified as indicating the expected level of self-efficacy for an item, with other responses allowing
learners to show higher or lower levels of self-efficacy. The 3.41 criterion was determined by identifying the critical
interval width: the 4-point range of the scale divided by 5 categories yields 0.8, which places the lower bound of the
“Agree” band at 3.41.
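As a small illustration of the criterion just described (an assumption-laden sketch rather than the study's own computation), the interval width and the resulting readiness check against the 3.41 threshold can be expressed as follows, with hypothetical subscale means.

    def agree_band_lower_bound(categories: int = 5, scale_min: float = 1.0) -> float:
        # A 1-to-5 scale spans 4 points; dividing that range into 5 equal intervals
        # gives a width of 0.8, so the "Agree" band begins 3 intervals above the minimum.
        width = (categories - 1) / categories      # 4 / 5 = 0.8
        return scale_min + 3 * width               # 1 + 2.4 = 3.4 (reported as 3.41)

    def readiness(subscale_means: dict, criterion: float = 3.41) -> dict:
        # Flag which subscales meet or exceed the expected level of self-efficacy.
        return {name: mean >= criterion for name, mean in subscale_means.items()}

    # Example with hypothetical means for the four subscales:
    # readiness({"web_surfing": 3.6, "synchronous": 3.5,
    #            "asynchronous_one_to_one": 3.5, "asynchronous_one_to_many": 3.3})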
Readiness for Online Learning
The first research question concerned the extent of the learners’ readiness for online learning.
Miltiadu’s Self-efficacy for Online Technologies instrument was used to gather data for this question.
Table 1 illustrates the overall mean scores of the participants’ responses and the mean scores of items related to each
subscale. From the table, it can be observed that the overall mean score is slightly higher than the expected level of
readiness (Mo = 3.46 > Mer = 3.41). Based on this result, one can infer that learners in the traditional distance
programs, within the limits of the learners surveyed, are barely ready for online learning and definitely need to
improve their online technology skills. They especially need to improve their asynchronous one-to-many interaction
skills because the mean score for this subscale is not only the lowest but also lower than the expected level of readiness
(MasOne2many = 3.46 < Mer = 3.41). Since collaboration among learners has been considered one of the essential
components of an effective online learning course (Carr-Chellman & Dushastel, 2000), this skill can be considered
very important for being a successful online learner. So, those learners who are planning to attend
online courses should improve their online discussion skills, and the administrators should find ways to help learners
on this issue in Turkey.
[Figure 1 chart data: response categories No, Not Sure, and Missing; values shown in the chart include 18.90 (17.8%) and 26.38 (24.9%).]
Figure 1: Learners’ willingness for online learning
Table 2. Descriptive statistics for learners' frequency of using online support services (rows: online support services; frequency columns: Never, Barely, Sometimes, Often, Always).
As can be observed in Table 2, the majority of the learners do not often use some of the online services, such as
using e-mail to interact with course coordinator(s), taking online self-evaluation tests, and studying the multimedia
learning materials on the net. However, quite a number of them frequently use the Internet to learn their exam
results.
orientation of these learners. Unfortunately, earning a diploma is valued more highly than learning new knowledge, skills, and attitudes by the majority of the distance learners, so these learners focus on passing the exams rather than on learning. This result also suggests that building the services is not enough: administrators and designers should also think of components that encourage learners to benefit from these voluntary support services.
On the other hand, the learners' opportunities to access online technologies have increased over the last year. Additionally, more capable learners are entering distance learning, so the number of distance learners who use the online services may also have risen. It might therefore be beneficial to repeat this sort of study periodically to provide feedback on the University's implementations. Conducting the study with a larger group of learners may also provide better insight into the issue.
References
Ames, C. (1984). Achievement attributions and self-instructions under competitive and individualistic goal
structures. Journal of Educational Psychology, 76, 478-487.
Askar, P. (2005). Distance education in Turkey. In C. Howard, J. Boettcher, L. Justice, K. Schenk, P. L. Rogers, G.
A. Berg (Series Eds.), Encyclopedia of distance learning: Vol. 2 (pp. 635-640). Hershey, PA: Idea Group
Reference.
Bandura, A. (1978). Reflections on self-efficacy. Advances in Behavioral Research and Therapy, 1(4), 237-269.
Cagiltay, K. (2001). Uzaktan eğitim: Başarıya giden yol teknolojide mi yoksa pedagojide mi? [Distance education: Is the road to success in technology or in pedagogy?] Retrieved February 2, 2005, from http://www.teknoturk.org/docking/yazilar/tt000037-yazi.htm
Carr-Chellman, A., & Duchastel, P. (2000). The ideal online course. The British Journal of Educational Technology, 31(3), 229-241.
Cekerol, K. (2005). Anadolu Üniversitesi Açıköğretim Fakültesi Akademik Danışmanlık Dersleri [Anadolu
University Open Education Faculty Academic Support Courses]. A paper presented in the 5th International
Educational Technologies Conference, Sakarya, October 21, 2005.
DeTure, M. (2004). Cognitive style and self-efficacy: Predicting student success in online distance education. The American Journal of Distance Education, 18(1), 21-38.
Duffy, T. & Kirkley, J. (2004). Introduction. In Duffy, T & Kirkley, J (Eds.) Learner-centered theory and practice
in distance education: Cases from higher education (pp: 3-13). Mahwah, NJ: Lawrence Erlbaum.
Guglielmino, P.J., & Guglielmino, L.M. (2003). Are your learners ready for e-learning? In G.M. Piskurich (ed). The
AMA handbook of e-learning: Effective design, implementation, and technology solutions (87-98). New
York: AMACOM.
Hiltz, S. R., & Shea, P. (2005). The student in online classroom. In Hiltz, S. R., & Goldman, R. (Eds.) Learning
together online (145-168). Mahwah, NJ: Lawrence Erlbaum Associates.
Joo, Y., Bong, M. & Choi, H. (2000). Self efficacy for self-regulated learning, academic self-efficacy and Internet
self-efficacy in web-based instruction. Educational Technology Research and Development, 48(2), 5-17.
Lee, I. S. (2002). Gender differences in self-regulated on-line learning strategies with Korea’s university context.
Educational Technology Research & Development, 50, 101-111.
Lim, C. K. (2001). Computer self-efficacy, academic self-concept, and other predictors of satisfaction and future
participation of adult distance learners. American Journal of Distance Education, 15 (2), 41-51.
McIsaac, M. S., Murphy, K., & Demiray, U. (1988). Examining distance education in Turkey. Distance Education, 9(1), 106-114.
Miltiadou, M., & Yu, C. H. (2000). Validation of the online technologies self-efficacy scale (OTSES). International Journal of Educational Telecommunications.
Multon, K. D., Brown, S. D., & Lent, R. W. (1991). Relation of self-efficacy beliefs to academic outcomes: A meta-
analytic investigation. Journal of Counseling Psychology, 38, 30-38.
Nicholls, J., & Miller, R. (1994). Cooperative learning and student motivation. Contemporary Educational
Psychology, 19, 167-178.
Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational Research, 66, 543-578.
Potashnik, M. & Capper, J. (1998) Distance education: growth and diversity. Finance & Development 35(1).
Retrieved November 5, 2003, from http://www.imf.org/external-/pubs/ft/fandd/1998/03/index.htm
Salomon, G. (1984). Television is "easy" and print is "tough": The differential investment of mental effort in learning as a function of perceptions and attributions. Journal of Educational Psychology, 76(4), 647-658.
Schunk, D.H. (1994). Self-regulation of self-efficacy and attributions in academic settings. In D.H. Schunk & B.J.
Zimmerman (Eds.), Self-regulation of learning and performance: Issues and educational applications (pp.
75-99). Hillsdale, NJ: Erlbaum.
TUBITAK-BILTEN (2000). Bilgi teknolojileri yayginlik ve kullanimi arastırma raporu [Report on diffusion and
use of information technologies]. Ankara, Turkey.
Willis, B. (1993). Distance education: A practical guide. Englewood Cliffs, NJ: Educational Technology
Publications
Wood, R., & Bandura, A. (1989). Social cognitive theory of organizational management. Academy of Management Review, 14(3), 361-384.
The Usage of Internet Communication Technologies by Low-Income High
School Students in the Completion of Educational Tasks
Inside and Outside of the School Setting
Eun-Ok Baek
Seth Freehling
California State University San Bernardino
Abstract
This qualitative research study examined the use of Information and Communication Technologies (ICT) to complete assigned and unassigned homework outside of the traditional high school setting by high school students from economically disadvantaged households. Student participants in the interview phase attributed their choice to use ICT to its ease of use and to the perception that utilizing ICT as a resource resulted in tasks being accomplished more quickly. The multi-tasking capabilities of computer technologies were found to be the key facilitating as well as hindering factor in student use of ICT outside of school. The use of ICT as a supplemental resource in the completion of traditional homework assignments was a notable pattern in the findings.
Introduction
With the increasing prevalence of Internet access found within the modern classroom and the more recent
trend in households of students from low-income backgrounds having access to online resources, the ability to
exploit this powerful educational tool is within the reach of many students inside and outside of the classroom. The
trend of declining computer prices by an average of 16.5 percent annually (Nomura & Samuels, 2004) has allowed
many students from economically disadvantaged backgrounds to bridge this divide and take advantage of online
access from within their households.
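As a rough, back-of-the-envelope illustration (not taken from Nomura and Samuels), a steady 16.5 percent annual decline implies that computer prices halve in a little under four years:

```python
# Rough illustration (not from the cited source): how long a steady 16.5% annual
# price decline takes to halve computer prices.
import math

annual_decline = 0.165
years_to_halve = math.log(0.5) / math.log(1 - annual_decline)
print(round(years_to_halve, 1))  # about 3.8 years
```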
Gaining an understanding of how high school students use the Internet as a resource to complete homework assignments will generate data that classroom teachers can use to design independent out-of-class assignments that support well-designed, curriculum-based lessons. Moreover, an understanding of the support students require when using online resources at home will help maximize a student's instructional time outside of class. It is therefore alarming that students who have this resource available at school and at home are performing worse than peers who do not have computers at home (Woessmann & Fuchs, 2005).
As with its instructional media predecessors, a contrast exists between the anticipated benefits of Internet technology and the benefits this technology is actually delivering in educational practice (Reiser, 2002). With this in mind, the purpose of this project is to develop an in-depth portrait of high school students' home Internet usage for the educational task of completing school assignments. In an effort to inform the development of strategies and resources, and to justify the proposal of offering school-based computer help, this study addresses three questions:
What is the frequency with which high school students choose the Internet as an instructional resource to complete class assignments at home, and what are the contributing factors behind this choice?
What are the usage patterns of students using the Internet to complete assigned and non-assigned schoolwork
outside of the traditional school classroom setting?
What are the factors that influence students’ home Internet usage in the completion of assigned tasks?
Specifically, what are the factors that facilitate or hinder high school students from the completion of educational
tasks when utilizing the Internet at home?
Theoretical Background
The Narrowing Digital Divide
The term Digital Divide was coined in the early to mid-1990s in response to a large portion of the
population not having computers and subsequent Internet access. This divide was attributed to the large cost
associated with computer ownership and the infancy of Internet access that resulted in much content aimed primarily
at scientists and researchers (Wilhelm, Carmen, & Reynolds, 2002). Presently, with the low-cost of computer
ownership and the growing build-up of computer hardware in public schools, current government literature argues
that this Digital Divide between the economic classes has become increasingly insignificant. This can be judged by
comparing the titles of the U.S. Department of Commerce, Economics and Statistics Administration, and National
Telecommunications and Information Administration’s October 2000 report entitled: Falling through the Net:
Toward Digital Inclusion with the February 2002 report: A Nation Online: How Americans are Expanding Their
Use of the Internet. The United States is not the only culture bridging the digital divide. Woessmann and Fuchs’
(2004) research involving responses from 32 developed and emerging countries found that 43 percent of students
polled had access to the Internet at home.
Computer ownership, specifically with Internet access, at home helps the student academically in a
multitude of ways. Levin and Arafeh (2002) found that students who have access to the Internet at home come to
rely on the technology to help them do their schoolwork and complete tasks more quickly. The researchers also
found that students get less stymied by material they do not understand, and students write papers and reports that
draw upon up-to-date sources. Students in Levin and Arafeh’s study were also observed as greatly utilizing the
Internet as a form of communication with other classmates. The students in the study were found to correspond with
other classmates about homework, quizzes, and share websites that were helpful in their studies. Further benefits of
home computer use can be seen by students getting better grades, watching less television and spending more time
online (Landgraf, n.d.).
The assumption that teenagers are more Internet savvy than adults is being questioned. A recent Nielsen study (Baig, 2005) found that
teenagers completed specific tasks at lower percentages than adults. In the study, teenagers were only able to
complete tasks such as making a DMV appointment and finding concert dates 55 percent of the time, compared with
a 66 percent rate of completion by adults given the same task.
Methodology
Participants
This qualitative study was conducted at a secondary school located in a mid-sized city in Southern
California. The secondary school is a fairly large, urban high school located in a low-income, older neighborhood
with households reflecting a mix of immigrant, transient, and working class residents. The participants of this study
were polled from The California Partnership Academy (CPA) programs. This program is a college preparatory
course of study, differing only in the elective class chosen by the program administrator. It was established at the
high school in 1991. The CPA programs are a state, grant-funded, school-within-a-school concept fashioned after
the successful Philadelphia career academies. The primary purpose of this program is to offer students career-
oriented training through a three-year sequence of elective courses in the student's career field of interest.
The pool of students for this study consisted of over 248 students enrolled in one of the two smaller
learning communities. Students in the Public Safety small learning community consisted of students interested in
police work, firefighting, or paramedic careers. These students were scheduled together throughout the school day in
the same core academic content subjects. The students in this small learning community received no specific ICT
training from the Public Safety teacher and were not given any ICT specific assignments by the teacher. The second
smaller learning community consisted of students from the Business Academy. Students enrolled in this program
received extensive ICT training and were required to daily utilize these skills in their business elective class.
Students in this second small learning community were interested in pursuing careers in business administration and
worked toward receiving certifications in Microsoft Office Software products. Participants from both smaller
learning communities consisted of sophomores (grade 10), juniors (grade 11), and seniors (grade 12). Freshmen
were excluded from this study as students enter the CPA program at the beginning of their sophomore year and
remain in the program until graduation as a senior.
The student population in this study had varying degrees of access to ICT during the traditional school day.
The high school campus has a robust, reliable computer network. The school’s Internet access is serviced by two T-
1 lines, and the campus has arranged the network in a VLAN in order to decrease network traffic and boost speeds.
Every classroom and administrative office at the school has access to the Internet. Actual student access to ICT
within the classroom varies by classroom and teacher. Whilst all classrooms have a teacher computer, it is the
norm rather than the exception for this to be the only computer present in the majority of the classrooms; the
exception to this norm is the elective, career-specific classes that provide computer specific instruction. However,
students not enrolled in these specific computer-related elective courses are not permitted to use computers within
these classrooms.
Data Collection
Due to the previously discussed limited ICT access students at this high school encounter, and also to
remain within the scope and focus of this qualitative study dealing with how students use ICT outside of the
traditional school setting, it was necessary to give a qualifying survey to the potential pool of students participating
in the study. For the purpose of this study, participants were required to have the pre-determined qualifying factor of
having access to a computer equipped with Internet access at their household. However, if participants had
immediate Internet access by means of a friend or relative’s household, this would also qualify a participant to take
part in the study. Participants whose only access was through the school library, career center, or public library did not qualify as participants of this study due to the access limitations of those settings.
Qualifying surveys were administered to 240 students currently enrolled in the two smaller learning communities discussed in the previous section. Although used as a means to qualify participants to take part in the qualitative study, the qualifying surveys provided valuable data. All 240 surveyed students completed the qualifying survey, a 100 percent participation rate.
Participant responses were recorded and scored according to the response to the second question: "Have you ever used the Internet at home to help you complete a school assignment?" A positive response to the first option (yes) or the third option (no, but I have used a friend's or relative's computer to complete an assignment when I needed to use the Internet) was counted as a qualifying response to participate in the interview phase of the study. Respondents who had immediate access to the Internet at their household represented 140 students (58%), and those who had access at a friend's or relative's household represented 28 students (12%), of the total 240 students participating
in the qualifying survey. From the data records of the 168 (70%) qualifying participants, eleven students were
randomly selected to further participate in the next phase of the qualitative study. Of these randomly selected eleven
students, ten qualified as participants by responding that they had ICT access at their household. The one randomly selected student who had reported immediate access at a friend's household subsequently gained household access between the administration of the survey and the commencement of the interviews.
Interviews were arranged during the students’ sixth period class in agreement and coordination with each
individual student’s instructor for that class subject. Due to this arrangement, interviews were conducted over a six
week period to ensure that students would not miss important instructional material covered while they were pulled
out of their class for the interview. In order to eliminate distractions, the interviews were conducted in a classroom
not being used for instructional purposes during this class period. Interviews with the participants lasted
approximately twenty to thirty minutes. The participant interviews were tape recorded to ensure the development of
accurate transcripts of student responses and for validation purposes. Upon the completion of the final interview,
and a preliminary analysis of the typed transcripts, follow-up interview questions were developed for the eleven
participants so as to deal with clarifications as well as investigation into areas not considered in the development of
the original interview questions. Follow-up interviews were briefer, averaging approximately five to ten minutes in
length.
A preliminary analysis of original participant interviews and subsequent follow-up interviews led to the
necessity of developing an open-ended questionnaire to be administered to teachers at the high school. This
questionnaire was administered to four teachers identified in student interviews as teachers who encouraged or engaged students in ICT use in ways that led them to use ICT outside of the teachers' classrooms. Due to time constraints and
scheduling difficulties, the questionnaires were sent as attachments via email. All four identified teachers
participated in responding to the questionnaires, returning completed questionnaires via email.
Student interview transcripts were validated by the interviewees to ensure accuracy of responses and to
eliminate erroneous inferences by the researcher. The same procedure was followed for student follow-up
interviews. This validation process was unnecessary for the written responses generated by the email attachments for
the teacher participants.
Data Analysis
A content analysis was conducted on the original interview responses by the eleven participants.
Subsequent follow-up interviews were also analyzed in the same manner. Teacher questionnaires were originally
analyzed independent of student interview responses in original and follow-up interviews. A subsequent analysis of
teacher responses was conducted collectively with student responses in regards to the research questions.
Data were first analyzed in conjunction with the original three research questions. Participant responses
were codified and evaluated using conceptual analysis within the structure of the three original research questions.
This conceptual analysis involved all three sets of data: original student responses, follow-up questions, and teacher
responses. Upon the completion of the conceptual analysis, a relational analysis was conducted upon the three sets
of data, outside the scope of the three original research questions. This approach was undertaken in an effort to identify
themes, trends, and patterns not originally considered by the researcher in the framing of the original questions in
regards to ICT use by students outside of the classroom. Upon completion of the conceptual and relational analyses,
concepts and subsequent inferences derived from the two analyses were peer reviewed to ensure the validity of the
findings. This peer review was conducted by a teaching colleague working at the same school site in a separate
academic department. The peer reviewer was knowledgeable about ICT use and was considered to be a technology
advocate for students at the school site.
[I use the Internet] a few times a week. Two, three, or four times (S3).
A few students commented that they daily use ICT in the completion of schoolwork. One student, an English
language learner, commented:
Almost everyday I use the Internet to help me with schoolwork (S6).
This high frequency of ICT use was also seen in the one student interviewed that did not have Internet access at
home, but would go to a friend’s house or the public library:
I say at least 3 to 5 school days because I am always on the Internet working on school stuff (S4).
Usage Patterns of Students Using ICT to Complete Assigned and Unassigned Work
The usage patterns of economically-disadvantaged students in this study yielded findings that students are
using personal, web-based email accounts as an educational resource. Student use of email was found to occur as a
means to email teachers assignments, collaborate on group assignments, and use this technology as a file storage
system. Student participants in this study do not have a school-based email account, leading to the use of web-based
email systems such as Yahoo, Gmail, or Hotmail.
Student use of email to correspond with teachers does not appear to be a common pattern in their usage. Only one
respondent claimed to have emailed their teacher:
We had to use the internet to email our teacher, Mrs. W. so that we could get our homework. I checked the
email at home and at school (S8).
This finding is surprising considering that most students in the study responded that email usage was their most frequent use of ICT at their household:
I find myself checking my email mostly. It’s the first thing I do when I’m at home or whenever I use a
computer at school (S3).
The use of search engines in order to complete schoolwork was a recurrent theme in student ICT patterns of usage.
Students mentioned using a variety of search engines as a major part of completing schoolwork at home. Search
engines that were commonly mentioned included: Google, Yahoo, and Ask Jeeves. As a repeated activity, students
positively responded that searching was a major ICT activity undertaken to complete schoolwork at home:
I have been spending all my time working on the term paper for Mr. F. I looked up the years, the timeline. I
went to Google to find the timeline (S2).
The use of search engines was identified as a major area of ICT use outside of the classroom. Student participants
were confident about their abilities to conduct searches and generate the desired results in order to complete home
work. This finding supports the finding by Fallows (2005) who found a large number of respondents, 92 percent,
reporting high levels of confidence using search engines. Teacher involvement in instructing students on best
practices when conducting searches was evident in the respondents. There is evidence that this ICT skill is being
taught within the classroom curriculum in earlier grades such as middle school and perhaps even at the elementary
level.
Along with the pattern of the frequency of using search engines, student participants were generally
confident about their ability to search and their effectiveness in finding what they were tasked to find. Along with
confidence, students were quick to describe their searching strategies. Specifically, strategies were discussed when
students did not get the desired search results that were needed to complete an assignment. Rewording searches was
not the only pattern observed. Students also cited switching search engines in an effort to achieve the desired results:
If one [search engine] doesn’t work, I will just use another one. Like if Yahoo doesn't work, I'll use Google.
I'm not really into getting into something deep (S4).
Unlike email skills, teachers were attributed with being the driving force behind students’ acquisition of Internet
search strategies:
…back then, in middle school, teachers taught us how to do these things (S2).
I learned this [search strategies] from all my teachers. But mostly from Mrs. W, my English teacher, and
Mr. F [business teacher] (S8).
Like email and the utilization of search engines, students are utilizing a variety of ICT educational resources as
supplements to complete homework. Moreover, these supplemental resources are being utilized without teacher
direction or encouragement. One such supplemental resource is the use of online dictionaries. Many students
interviewed discussed their use:
I have a list of links saved in my favorite’s folder like dictionary.com a thesaurus site (S6).
The use of chat rooms and message boards does not presently appear to be exploited by high school students; however, it is worth noting that students are aware of this resource and discussed how it is being used by other students:
I post on a message board, and I see that a lot of kids are going there for [school] help. I remember just the
other day, a girl was doing an assignment on Macbeth, and she wanted us to help her write an obituary for
one of the characters. Or they ask for help to find research material that they couldn't find themselves, and
they ask if we could help them find it. This is something that I have noticed on this board recently. I have
seen this done on other boards, but you would have to search for them. I will probably use this resource in
the future because it is convenient and right there in your home (S5).
Students generally responded that the Internet was a tool that aided them in completing work in a quicker and easier manner than traditional educational
resources. This positive view is explained by a student:
It [ICT] makes it so easy to complete homework. I wish all of my teachers would give more assignments
that I do at home on my computer (S4).
This high level of confidence expressed by the student was a recurrent theme in the majority of student responses about their feelings toward ICT use. Likewise, students' high degree of confidence in using this tool can be seen as a major facilitating factor driving them to utilize this resource:
As soon as teachers give an assignment, I'm glad that I have the Internet. The Internet is reliable. I see it as
it's there for you to learn (S4).
Teacher instruction and direction in best practices regarding Internet use is also seen as a contributing factor in these high levels of confidence:
Yes I do [feel confident] because my teachers show me how to use the computers very well (S8).
In spite of past experiences and teacher instruction, the majority of participant interviewees responded that
they desired more support and help from classroom instructors. Commenting on the history term paper, one student
noted a desire for more specific support:
Mr. F did not help us on how to use it. He only showed us how to do the works cited page. I would like
more help because it would be easier and quicker, and that is how you get by these days (S4).
However, participants expressed doubt about teacher expertise and experience to provide them with the
ICT support they needed:
If they know how to use it then yes, I would like support. But most of the teachers don't even know how to
use it themselves. This is because they are older and have been using what they have been teaching with
themselves. This is just new to them (S5).
Teacher respondents also supported this finding that instructional time was not devoted to support or the
teaching of best practices in the use of ICT. The math teacher surveyed suggested a desire for students to learn
independently:
I teach them some key words to search for to find the things that are applicable to their business. I
do not tell them too much since I want them to be creative and to have to find the terms that they need (T2).
The findings of prior research (e.g., Nachmias et al., 2000) regarding the overwhelming use of ICT for recreational activities by high school students were also found to be a recurrent theme in student interview responses.
However, the ability of high school students to utilize these recreational attributes of ICT whilst concurrently
working on school assignments is another facilitating factor in students’ choice to use ICT resources to complete
homework. The multi-tasking capability of listening to music whilst working on school-related assignments was a
recurrent theme:
It’s like you can open up Media Player and listen to your music at the same time as you do the work. That’s
what I like about it (S11).
However, listening to music, a desire to play games, and going to unrelated websites were other areas of
distraction identified by students as hindrances to completing homework. The ability to play an interactive
game whilst doing homework was also explained as a multi-tasking function:
Games also distract me, I like Pogo. I am tempted [to play], and I have two windows open so that I am
doing two things at the same time (S9).
first be studied and analyzed in an effort to identify strengths and weaknesses of each particular resource. For
example, best practices in the use of email as a collaborative tool is one such resource that has potential to be
exploited by educators in the completion of a multitude of assignments.
The findings related to the consensus of teacher attitudes regarding the explicit assigning of ICT-related
homework need further investigation. The four teachers surveyed at this high school were identified by student
interviews as being teachers who assigned ICT-related homework and projects that could not be accomplished
during class time. Therefore, the confidence by these teachers that students can overcome obstacles and that self-
motivation will lead students to find ICT access outside of the classroom needs further study. Likewise, studies
involving multiple high schools from large, urban and economically-disadvantaged neighborhoods are needed to
study these teacher perceptions and beliefs. Studying teacher beliefs regarding student ICT use would also ascertain whether preconceived notions hinder or facilitate support for student use of ICT as a supplemental resource at their households.
The ability of students to manage home computers and maintain the operability of software and operating
systems can be attributed to skills learned in graduation-required computer literacy courses. The skills learned in
these courses, and other teacher instructed lessons have contributed to the elevation of the high school student in the
household as the responsible party for the upkeep of the family computer. As trends call for the elimination of such
courses and place the teaching of ICT skills within the core content curriculum, valuable computer maintenance
skills and Internet security strategies may not be gained by these responsible students. The continued offering of
these courses and this important component within the curriculum is needed.
References
Baig, E. (2005, January 31). Study shows some teens not as web-savvy as parents. USA Today, Money, 01b.
Cheung, W. & Huang, W. (2005). Proposing a framework to assess Internet usage in university education: An
empirical investigation from a student’s perspective. British Journal of Educational Technology, 36(2),
237-253.
Cuban, L., Kirkpatrick, H., & Peck, C. (2001). High access and low use of technologies in high school classrooms:
Explaining an apparent paradox. American Educational Research Journal, 38(4), 813-834.
Fallows, D. (2005, January). Search engine users: Internet searchers are confident, satisfied, and trusting - but
they are also unaware and naive. Retrieved July 14, 2005 from Pew Internet and American Life Project, Web
site: http://www.pewinternet.org/pdfs/PIP_Searchengine_users.pdf
Hsu, S., Cheng, Y., & Chiou, G. (2003). Internet use in a senior high school: A case study. Innovations in
Education & Teaching International, 40(4), 356-368.
Landgraf, K. M. (n.d.). The fourth basic literacy: Technology. Educational Testing Service. Retrieved July 11,
2003 from http://www.ets.org/aboutets/issues17.html
Levin, D., & Arafeh, S. (2002, August). The digital disconnect: The widening gap between Internet-savvy students
and their schools. Washington, DC: Pew Internet & American Life Project.
Madden, A., Ford, N., Miller, D., & Levy, P. (2005). Using the Internet in teaching: The views of practitioners (A
survey of the views of secondary school teachers in Sheffield, UK). British Journal of Educational
Technology, 36(2), 255-280.
Murray, L., Hourigan, T., Jeanneau, C., & Chappell, D. (2005). Netskills and the current state of beliefs and
practices in student learning: An assessment and recommendations. British Journal of Educational
Technology, 36(3), 425-438.
Nachmias, R., Mioduser, D., & Shemla, A. (2000). Information and communication technologies usage by students
in an Israeli high school. Journal of Education Computing Research, 22(1), 55-73.
Nomura, K. & Samuels, J. D. (2004, August 27). Can we go back to data? Reconsideration of U.S.-harmonized
computer prices in Japan. Retrieved July 1, 2005 from Harvard University, John F. Kennedy School of
Government Web site: http://www.ksg.harvard.edu/cbg/ptep/IT_price.pdf
Papastergiou, M., & Solomonidou, C. (2005). Gender issues in Internet access and favourite Internet activities among
Greek high school pupils inside and outside school. Computers & Education, 44(4), 377-393.
Reiser, R. (2002). A history of instructional design and technology. In R. Reiser & J. Dempsey (Eds.), Trends and
issues in instructional design and technology (pp. 36-37). Upper Saddle River, NJ: Merrill Prentice Hall.
Wilhelm, T., Carmen, D., & Reynolds, M. (2002, June). Connecting kids to technology: Challenges and
opportunities. Baltimore, MD: Annie E. Casey Foundation.
Woessmann, L. & Fuchs, T., (2004, November). Computers and student learning: Bivariate and multivariate
evidence on the availability and use of computers at home and at school. Paper prepared for CESIFO
Category 4: Labour markets, Munich, Germany.
Turkish Teachers' Experiences In Online Professional Development
Environment
Bahar Baran
Kursat Cagiltay
Middle East Technical University
Ankara, Turkey
Abstract
This paper aims to explore teachers' opinions on traditional professional development (PD) courses and their experiences with an online course. For this aim, a qualitative study was designed, and 10 teachers from a private school evaluated the learning process after the course. A focus group discussion and individual interviews were conducted to collect data. According to the results of our research, the teachers generally emphasized the lack of practice in both traditional and online courses. The results of this study can be used as lessons learned by designers and decision makers wanting to develop online professional development environments.
Introduction
Since our future depends especially on teachers, their professional development (PD) is significant. For years, many PD projects have been arranged around the world with different aims. For instance, the Department of In-service Training under the Turkish Ministry of National Education (MNE) offered 428 courses for only 22,398 personnel. That is, only a small group of the more than 600 thousand teachers in Turkey could benefit from those courses in 2004 (MNE, 2004). While whether these projects are sufficient for teachers' lifelong learning is a debatable topic among educators, the projects have also been evaluated from different perspectives by researchers. As a result of these studies, it can be clearly seen that professional development improves teachers' instructional skills (Borko & Boulder, 2004). However, there is some dissatisfaction with professional development programs: although teachers are generally willing to participate in PD programs, they also report problems (Ozer, 2004). Gabriel (2004) describes the shortcomings of professional development as 1) top-down decision making, 2) the idea that teachers need to be "fixed", 3) lack of ownership of the professional development process and its results, 4) the technocratic nature of professional development content, 5) universal application of classroom practices regardless of subject, students' age, or level of cognitive development, 6) lack of variety in the delivery modes of professional development, 7) inaccessibility of professional development opportunities, 8) little or no support in transferring professional development ideas to the classroom, 9) standardized approaches to professional development that disregard the varied needs and experiences of teachers, 10) lack of systematic evaluation of professional development, and 11) little or no acknowledgement of the learning characteristics of teachers among professional development planners (pp. 2-4).
Each of these problems has guided researchers to search for new approaches to teacher training. Schlager and Fusco (2003) stated, "Teachers' professional development is more than a series of training workshops, institutes, meetings, and in-service days. It is a process of learning how to put knowledge into practice through engagement in practice within a community of practitioners" (p. 205). In recent decades, Wenger (1998) has proposed communities of practice as a social learning theory, which has also attracted considerable attention from educators. A community of practice is defined as "groups of people who share a concern, a set of problems, or a passion, about a topic, and who deepen their knowledge and expertise in this area by interacting on an ongoing basis" (Wenger, 2002, p. 4). The application of this theory to teachers' professional development has recently received considerable attention from educators.
Today, teachers’ life long learning needs and dissatisfactions to traditional courses are taken into
consideration, a training serving lots of teachers and providing time-place independency as social learning
environment can be seen as future of the teacher training. Indeed, this type of training comes to educators the
concept of “online professional development”.
Although most of the teachers use computers daily, they prefer to use technology to make their daily life
easy instead of using instructional purposes. They do not adapt technology sufficiently owing to their students’
demands (Molenda & Sullivan, 2002). Teachers are generally considered as late adaptors because they met the
technology late. Therefore, also, participating in online professional development enhanced with technology may
change their attitudes, frequency of use and form of use technology.
The Purpose and Research Questions
In the light of teachers’ professional development needs, a research project based on online teacher training
has been designed. The research project in underway has two phases. The aim of the first phase of the project is to
develop an online learning module for teachers’ professional development, while the aim of the second phase is to
provide practice sharing among teachers by interactive communication tools in an online environment. This study
focuses on experiences on the first phase. Designed online learning module fits Clark and Mayer’s first type
“Receptive: information acqusion” (p.28). That is, it provides information to teachers. We hope that the results of
the first phase will light the way of other online training projects for teachers’ PD because the lessons learned from
the first part will present informative findings.
The purpose of the study is to explore teachers’ opinions on traditional PD courses and teachers’
experiences in an online course. To achieve this aim, we discussed the teachers’ prior PD experiences, comparison
with traditional and online learning experiences, their evaluation of the online learning module and finally their
expectations from online courses. So, the research has 3 main research questions. These are
How are in-service teachers’ past experiences on professional development?
How are in-service teachers’ experiences in an online professional development environment?
What do the teachers think about online learning?
Method
This methodology section describes the research method employed in the study. In particular, the type of qualitative tradition, the participants, the context, the data collection, and the process of data analysis are presented.
Participants
The school and the teachers were selected purposively by the researchers. First, the school to be studied was selected. The researchers preferred to study a private school because of the technology limitations in public schools. It was assumed that if teachers had easy access to computers in their school, there would not be any problem accessing the module. The school had computers in both its laboratories and the lounges of each teaching unit. Then, 10 teachers from different teaching disciplines in the school were selected. The main characteristics of the teachers are described in Table 1.
Table 1. Participants

Teacher | Teaching duration (years) | Teaching in private school (years) | Teaching field | Teaching level | Working hours
Nazlan | 13 | 11 | English language | K-6,7 | 23
Pınar | 5 | 5 | Science | K-6-8 | 23
Meral | 17 | 15 | Psychological adviser | K-1-8 | 30
Zulal | 38 | 13 | Turkish language | K-6 | 10
Cihan | 3 | 2 | Mathematics | K-6-8 | 20
Seçil | 4 months | 4 months | Biology | K-9,10 | 17
Meltem | 12 | 10 | Computer | K-4-8 | 10
Nurgul | 12 | 12 | Social science | K-7,8 | 15
Onur | 2 | 2 | Computer | K-1-8 | 21
Ilker | 1 | 1 | Computer | K-1-8 | 20
Context
The teachers participated in the online course for one month. Before the course, the teachers attended a seminar about what online learning is and how they could use the learning module.
In the study, an online learning module developed by an online learning software company was used. The topic of the learning module was learning theories. The module content was prepared by a full-time academician with a degree in educational sciences, working in the Department of Educational Sciences at the Middle East Technical University. The learning module includes audio, pictures, and animations to support the text. The user interface of the module can be seen in Figure 1.
Figure 1. User interface of the learning module (labeled elements: the lesson screen and buttons for note-pad, play and stop, help, and settings).
Data Collection
Teachers' experiences were evaluated through focus group interviews and individual interviews. Yıldırım and Şimşek (2004) describe the advantages of interviewing as 1) flexibility, 2) response rate, 3) behaviors that cannot otherwise be observed, 4) control of the environment, 5) control over the order of the questions, 6) comprehensiveness, and 7) in-depth information (p. 110).
A focus group interview was conducted to create a discussion environment about the learning module. The group meeting was arranged after the teachers had finished their daily teaching duties. Further, a semi-structured individual interview was conducted with each teacher. Both types of interviews were held through face-to-face meetings. Before collecting data, the researchers prepared an interview schedule. This schedule served as a way to obtain information about how the teachers evaluated their past professional development experiences and how they evaluated their virtual experiences. The questions were open-ended and specific. The researcher tried to avoid leading the respondents to confirm the researcher's assumptions; however, some probes were presented to obtain detailed answers.
Data Analysis
The data analysis process followed a bottom-up procedure. After the interviews were transcribed, the data were coded. During this process, general themes were identified, and finally the data were organized according to these general themes.
Results
Teachers
The study involved 10 teachers working in a private school: three of the 10 teachers are male and seven are female. The most experienced teacher is Zulal (38 years) and the newest teacher is Seçil (4 months). The statistical description of the participants' teaching experience is M = 10.33 years, median = 8.5 years, and mode = 12 years. Further, when the teachers are examined according to their experience in private schools, the most experienced teacher is Meral and the newest teacher is Seçil (Figure 2).
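These descriptive statistics can be checked directly against the teaching durations in Table 1; the short sketch below does so, approximating Seçil's 4 months as 0.33 years (an assumption of the sketch, not a figure stated by the authors).

```python
# Minimal check of the descriptive statistics reported above, using the Table 1
# teaching durations in years; Seçil's 4 months is approximated as 0.33 years.
from statistics import mean, median, mode

durations = [13, 5, 17, 38, 3, 0.33, 12, 12, 2, 1]
print(round(mean(durations), 2))  # 10.33
print(median(durations))          # 8.5
print(mode(durations))            # 12
```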
Figure 2. Each teacher's total teaching duration and teaching duration in the private school, in years (y-axis: Duration (Year); x-axis: Teachers' names).
Moreover, the teachers have different teaching areas: English language, science, psychology, Turkish language, mathematics, biology, social science, and computer. Additionally, the teachers teach grades from K1 to K8.
Professional Development
This part of the article presents the teachers' professional development backgrounds, their expectations from professional development programs, and the problems they face in teacher training courses.
experiences." As a new teacher, Pınar explained her ideas as follows: "I couldn't get any PD courses, but I am a PhD student. I have learned the necessary information from my PhD program. Therefore, I don't need any in-service course." She equated professional development courses with PhD courses, as she thought the content of the training was the same.
Online Learning
This part of the article includes information related to the teachers’ evaluation of their own experiences and their
suggestions for other online learning environments.
online learning course after this application. Although the learning module may be well designed and well organized, since the topic of the module was not selected appropriately, online learning seems boring and incomprehensible to me." Pınar explained that she already knew the learning theories; she had obtained this knowledge from her university education. That is, the learning module did not make any contribution to her. Further, Nurgul also explained her dissatisfaction: "I did want to learn the content of the module. However, I could not understand its jargon. There is too much jargon." More specifically, another critique concerned the concepts in the module. The teachers wanted to see the more widely used forms of the concepts. For example, the concept "collaborative" has two different translations in Turkish ("isbirlikci" and "kubasik"). The teachers could not understand the meaning of "isbirlikci"; they knew "collaborative" as "kubasik".
There were also teachers who thought positively about the online learning module. According to Nazlan, the learning module provided her with time and setting flexibility. Further, she believed that she learned, from the learning module, the name of the teaching approach she already had: "I learned from the learning module that I already have a teaching approach and that there is a specific name in the literature for my teaching approach. That is, certainly I am a humanistic teacher. All of them are appropriate for me. I found out about myself." Cihan said, "I liked it very much. I went to my school to access it on Saturday. I spent extra time and effort to learn. I got a cup of coffee and sat in front of the computer. It was fun for me."
The teachers also made some negative criticisms of the online learning module. Most of the teachers could not remember any animation or picture placed in the learning environment. One of the teachers, Meral, explained the reason for the problem: "My field covers learning theories. Therefore, I understand my peers' problems in learning that topic. The biggest problem related to learning theories is that teachers cannot make them concrete. When you teach the definition of learning or the definition of classical conditioning, they cannot make them concrete. You should give them more specific examples. For instance, you can give a picture depicting a bell with running children to their class instead of a lemon picture for classical conditioning" (Figure 3). That is, the teachers especially emphasized the importance of giving specific examples for specific cases.
The teachers also had some technical problems. The Internet Explorer pop-up blocker warning was new to them, and they could not resolve it themselves; therefore, they preferred to call the computer teachers. Most of them overcame the problem this way.
environment has this attribute, I may prefer online learning." Seçil said, "…online learning can contribute to my teaching. I have a lot of classroom management course books. Suppose that I have a problem in my school: I open a book. However, if I had support on the Internet, I would prefer it." Some of the teachers proposed courses presenting practical knowledge. Pınar stated definitely that she did not want to take a course with a theoretical base; she needs the application of these theories. She explained her ideas as follows: "I believe that teachers are not interested in reading about constructivism. They have no time for this. When I am connected to the Internet, I am looking for figures or questions classified according to topics. Indeed, I want to get information on whether or not a figure is appropriate for my lesson. For example, I enter 'small intestine' and I can find all pictures related to this topic. The most important thing is whether or not these figures develop students' thinking skills."
The teachers also proposed that virtual courses should be classified according to teachers' fields and that the courses should include new approaches, novelties, projects, presentations, films, and pictures. Finally, they suggested that virtual courses should be prepared through the collaboration of academicians and expert teachers. Nazlan proposed that virtual courses should contain both theoretical and practical knowledge; theory is also important for her, and teachers need both: "I want to learn about interesting topics including practice. For example, courses on differences in teaching methods between Turkey and other countries, the philosophy behind new methods, and their advantages and benefits." Further, she believes that, as an English language teacher, she has an advantage over other teachers since she has used learning theories for a long time. According to her, English teachers have sufficient experience to apply these theories in their classrooms. However, teachers in other fields have not yet applied multiple intelligence theory, and they first need to learn this learning theory in more detail.
Another topic discussed with the teachers was whether other teachers would want to use online learning. Zulal underlined the reason for teachers' hesitation with these words: "Innovations are not accepted suddenly. The Internet is also a new thing for teachers. Therefore, older teachers do not accept it voluntarily. However, I see the young teachers of my school. They are very enthusiastic about online learning. I can see it." Other teachers agreed on the necessity of online learning courses for teachers when they discussed the geographic wideness of Turkey. Pınar said, "We can access all materials or books when they are necessary. However, most Turkish teachers cannot obtain these sources when they need them. I believe that online learning will be effective for them." However, the teachers stressed that there may be some limitations preventing Turkish teachers from preferring online learning courses. These limitations are unfamiliarity with computers, lack of effective computer use, computer anxiety, lack of computers, and lack of Internet access in the area. Nazlan explained her ideas as "…they didn't grow up with computers, so these people who cannot use computers efficiently have computer anxiety. That is, they can be afraid of using or even turning on the computers. They will have a lot of questions, such as whether or not they have crashed or broken the computer themselves." Pınar added, "I don't know whether or not they have the Internet." Onur stated that teachers should be educated on computer use before the virtual professional development courses are provided.
In addition to negative opinions, the teachers also mentioned some positive contributions of online learning for Turkish teachers. First, the virtual courses will help teachers who have location problems; these environments make it easier to access materials. Further, since teachers can use these environments without searching through a library, they can save time. Finally, they said that these environments would be more comfortable and convenient for teachers.
Conclusion
In this study we mainly tried to shed light on teachers' experiences in an online environment. So, it was first important to learn about their past experiences in traditional PD courses. The teachers listed the problems in face-to-face courses as unattractive topics, overly familiar topics, being forced to participate in courses, academician tutors without any school experience, and the absence of practical knowledge.
The most emphasized problem was the absence of practical knowledge. This topic was also the most repeated one when the teachers evaluated the online module. Therefore, we conclude that the type of learning environment presented to teachers, whether online or face-to-face, is not the important issue; the most valuable thing is that in-service teachers should be satisfied with practical knowledge. This conclusion can also be seen in their course topic preferences: they preferred material development and classroom management courses to theory-based courses.
Further, the teachers wanted to receive professional development (PD) in homogeneous groups; that is, a mathematics teacher should participate in a course with only other mathematics teachers. In this way, teachers could produce knowledge more related to their own field, because they wanted to learn from other mathematics teachers' experiences. They need this homogeneous group to be able to exchange their practical knowledge with other teachers.
Figure 4. Comparison of theory and practice.
References
Bogdan, R. C., & Biklen, S. K. (1998). Qualitative research for education: An introduction to theory and methods. Boston: Allyn and Bacon.
Borko & Boulder (2004).
Gabriel, D. M. (2004). Teacher-centered professional development. VA, USA: Association for Supervision & Curriculum Development.
MEB (2001). Educational statistics. Retrieved September 2, 2005, from http://www.meb.gov.tr/Stats/apk2002ing/apage118-129.htm#1
MEB (2004). Department of in-service education. Retrieved June 11, 2005, from http://hedb.meb.gov.tr/
Molenda, M., & Sullivan, M. (2003). Issues and trends in instructional technology. In R. Branch (Ed.), Educational media and technology yearbook 2003. Englewood, CO: Libraries Unlimited.
Özer, A. (2004). In-service training of teachers in Turkey at the beginning of the 2000s. Journal of In-service Education, 30(1), 89-100.
Schlager, M. S., & Fusco, J. (2003). Teacher professional development, technology, and communities of practice: Are we putting the cart before the horse? The Information Society, 19, 203-220.
Wenger, E. (1998). Communities of practice: learning, meaning, and identity. NY: Cambridge University Press.
Wenger, E., McDermott, R., & Snyder, W. M. (2002). Cultivating communities of practice. Boston: Harvard Business School Press.
Yıldırım, A., & Şimşek, H. (2000). Sosyal bilimlerde nitel araştırma yöntemleri [Qualitative research methods in the social sciences] (2nd ed.). Ankara: Seçkin Yayınevi.
Turkish University Students:
How Do They Use Technology and What Do They Think about Internet Based
Distance Education?
Bahar Baran
Eylem Kilic
Aysegul Bakar
Kursat Cagiltay
Middle East Technical University
Ankara, Turkey
Abstract
In this paper, the researchers present results on Turkish students' profiles related to Internet-based distance
education. For this purpose, a descriptive research study was conducted, and a total of 6504 surveys were returned
from four universities in Turkey. The results are reported under five main headings: description of the students,
Internet and computer use opportunities, computer skills, studying style, and expectations from Internet-based
distance education.
Introduction
Today, the global knowledge economy requires qualified labor. In Turkey, the need for qualified employees
has been emphasized in many areas. The Vision 2023 report prepared by the Turkish Scientific and Technical
Research Council (TUBITAK) also emphasized the importance of a qualified workforce for the country. The report
proposes the reconstruction of Turkish higher education to keep pace with the information era. According to the
report, the aim of the new educational system should be "to develop individuals' creativeness, to provide learning
opportunities for individuals, to improve their skills at the top level by taking into consideration their individual
differences, to provide time and place independence, to be flexible to improve capabilities, and to focus on an
educational approach emphasizing on learning and human values" (Tubitak, 2004, p.11). Since there is a high demand
for university education, educators will clearly be searching for alternative ways to educate potential university
students. One of the most promising ways to create new opportunities for the young population is to integrate
technological innovations effectively into conventional education.
Distance education can be defined as a type of education in which learners and the instructor are separated in
time and place (Gunawardena & McIsaac, 2001). Looking at the development of distance education, it is clear that
great change has occurred as technology has improved. Early distance education programs included correspondence
courses, in which interaction between learners and the instructor took a long time. Recent developments in technology
have brought the use of the Internet into higher education for conventional learners as well (Molenda & Sullivan,
2002), and the Internet can now provide quick and easy interaction for distance education. Before changing the
educational system completely, the present conditions should be analyzed in order to prevent failure. Since our era
necessitates learner-centered educational approaches, identifying learners' characteristics, expectations and
opportunities is valuable regardless of the technology used in distance education. In other words, it is important to
analyze whether using the latest technologies for education meets learners' expectations and needs before starting to
use Internet-based education on university campuses.
Being at the core of learning, learners are the most important component of an educational environment. The
fundamental goal should therefore be to support good quality teaching and to provide a variety of learning
opportunities for learners. Accordingly, before an educational program is designed, learners' expectations and
profiles should be investigated. In this research, the target group was Turkish university students at four universities
across the country. The researchers' aim was to find out Turkish students' Internet and computer use opportunities,
their level of computer skill, their studying styles and their expectations from e-learning. In sum, the purpose of the
study was to determine Turkish students' profiles and expectations related to Internet-aided education.
Methodology
Knupfer and McLellan (1996) emphasized the importance of descriptive research methodology in
educational research. This type of study is essential for understanding the nature of an environment without
introducing any extraneous variables. Descriptive research methodology was therefore used in this study, as the
researchers wanted to explore Turkish university students' expectations from Internet-based education.
The Context
The data for this research were collected during the needs analysis stage of the e-campus project managed by
Middle East Technical University. This project aims to create new student capacity for higher education by using
information and communication technologies, and to provide both lifelong learning opportunities and undergraduate
education. Some parts of the education will be delivered via e-learning, and standardized learning materials will be
made available to students from different universities. A consortium of universities was therefore organized to
implement the project (Yalabık, 2004).
Sampling
The data were collected from four universities: Middle East Technical University, Kocaeli University,
Mersin University and Zonguldak Karaelmas University. The student populations of these universities were,
respectively, 3115, 11681, 5802, and 6053 (Council of Higher Education, 2003). In total, 6504 responses were
returned from 26,631 students. The numbers of returned responses by university were 4609 (70.9%) from Kocaeli
University, 1025 (15.8%) from Middle East Technical University, 529 (8.1%) from Mersin University and 341
(5.2%) from Zonguldak Karaelmas University (Figure 1).
Figure 1. Distribution of returned responses by university
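As an illustration only (not part of the original analysis), the overall return rate and the per-university shares shown in
Figure 1 can be recomputed from the counts reported above. The short Python sketch below assumes those reported
counts; the variable names are ours.

    # Reproduces the response shares in Figure 1 from the reported counts.
    returned = {
        "Kocaeli University": 4609,
        "Middle East Technical University": 1025,
        "Mersin University": 529,
        "Zonguldak Karaelmas University": 341,
    }
    population = 26631  # total students in the four universities, as reported

    total_returned = sum(returned.values())  # 6504 responses
    for university, n in returned.items():
        print(f"{university}: {n} ({100 * n / total_returned:.1f}%)")
    print(f"Overall return rate: {100 * total_returned / population:.1f}%")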
Results
The results of the study were examined under five main headings: 1) description of the students, 2) students'
Internet and computer use opportunities, 3) level of computer use, 4) studying style, and 5) expectations from
e-learning.
Description of the Students
The faculties in which participants were registered were the Faculty of Engineering and Architecture, the Faculty
of Arts and Sciences, the Faculty of Education, the Faculty of Medicine, and the Faculty of Economics and
Administrative Sciences.
The question about gender was answered by 6454 (99%) of the 6504 students. According to the responses,
2214 students (34%) were female and 4240 (66%) were male; thus, there were more male students than female
students.
The question about age was answered by 6419 (99%) of the 6504 students. The responses indicate that
4579 students (70%) were 21 years old or older, 1040 (16%) were 20 years old, 622 (10%) were 19 years old, and
178 (3%) were 18 years old or younger. The dominant age group was therefore 21 and above, followed by the ages
of 20, 19, and 18.
Computer Skills
The students' computer skills were evaluated under three main topics comprising 11 subtopics: 1) computer
basics (hardware and operating systems), 2) office applications (word processing, spreadsheets, presentations and
databases), and 3) the Internet (web page development, Internet browsers, search engines, e-mail and chat).
The most familiar computer basics subtopic was operating systems; Turkish students reported knowing more
about operating systems than about hardware. For the operating system item, 6404 (98%) of the 6504 students
responded: 1232 students (19%) rated themselves very good, 2518 (39%) good, 1676 (26%) moderate, 734 (12%)
poor, and 244 (4%) very poor. In sum, most students rated themselves "good" on operating systems. For the hardware
item, 6414 (98%) of the 6504 students responded: 784 students (12%) rated themselves very good, 1872 (29%) good,
2177 (34%) moderate, 1261 (20%) poor, and 320 (5%) very poor. In sum, most students rated themselves "moderate"
on hardware.
Among the office applications, the most familiar was word processing, followed by spreadsheet, presentation
and database applications. For word processing, 1377 students (22%) rated themselves very good, 2424 (36%) good,
1662 (27%) moderate, 681 (11%) poor, and 264 (4%) very poor; in sum, most students rated themselves "good" at
word processing. For spreadsheets, 6409 (98%) of the 6504 students responded: 822 students (13%) rated themselves
very good, 1907 (30%) good, 1970 (31%) moderate, 1281 (20%) poor, and 429 (6%) very poor; in sum, most students
rated themselves "moderate" with spreadsheets. For presentation applications, 6410 (98%) of the 6504 students
responded: 940 students (15%) rated themselves very good, 1754 (27%) good, 1772 (28%) moderate, 1327 (21%)
poor, and 617 (10%) very poor; in sum, most students rated themselves "moderate" at preparing presentations.
Finally, for databases, 6328 (98%) of the 6504 students responded: 279 students (4%) rated themselves very good,
672 (11%) good, 1209 (19%) moderate, 2293 (36%) poor, and 1875 (30%) very poor; in sum, most students rated
themselves "poor" with databases.
For the Internet, the most familiar topics were, in order, e-mail, search engines, Internet browsers, chat and
web page development. For e-mail, 6345 (98%) of the 6504 students responded: 2650 students (42%) rated
themselves very good, 2204 (35%) good, 872 (14%) moderate, 395 (6%) poor, and 224 (3%) very poor; in sum, most
students rated themselves "very good" at e-mail. For search engines, 6323 (98%) of the 6504 students responded:
2258 students (36%) rated themselves very good, 2118 (33%) good, 1034 (16%) moderate, 565 (9%) poor, and 348
(6%) very poor; in sum, most students rated themselves "very good" with search engines. For Internet browsers, 6322
(98%) of the 6504 students responded: 2000 students (32%) rated themselves very good, 2068 (33%) good, 1095
(17%) moderate, 670 (11%) poor, and 489 (8%) very poor; in sum, most students rated themselves "good" with
Internet browsers. For chat, 6310 (97%) of the 6504 students responded: 1282 students (20%) rated themselves very
good, 1500 (24%) good, 1328 (21%) moderate, 1147 (18%) poor, and 1053 (17%) very poor; in sum, most students
rated themselves "good" at chat. For web page development, 6321 (98%) of the 6504 students responded: 354
students (6%) rated themselves very good, 648 (10%) good, 1069 (17%) moderate, 2033 (32%) poor, and 2217 (35%)
very poor; in sum, most students rated themselves "poor" or "very poor" at web page development.
Figure 2. Computer skill levels in percentages (values below 8% were not reported)
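The summary statements above ("most students rated themselves ...") correspond to the modal category of each
self-rating distribution. The Python sketch below simply re-tabulates the percentages reported in the text (the
dictionary structure and labels are ours) and reads off those modal ratings.

    # Self-rating distributions (percentages as reported above), listed in the
    # order: very good, good, moderate, poor, very poor.
    ratings = ["very good", "good", "moderate", "poor", "very poor"]
    skills = {
        "operating systems":    [19, 39, 26, 12, 4],
        "hardware":             [12, 29, 34, 20, 5],
        "word processing":      [22, 36, 27, 11, 4],
        "spreadsheets":         [13, 30, 31, 20, 6],
        "presentations":        [15, 27, 28, 21, 10],
        "databases":            [4, 11, 19, 36, 30],
        "e-mail":               [42, 35, 14, 6, 3],
        "search engines":       [36, 33, 16, 9, 6],
        "Internet browsers":    [32, 33, 17, 11, 8],
        "chat":                 [20, 24, 21, 18, 17],
        "web page development": [6, 10, 17, 32, 35],
    }

    for skill, percents in skills.items():
        modal = ratings[percents.index(max(percents))]
        print(f"{skill}: modal self-rating is '{modal}' ({max(percents)}%)")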
Studying Style
The students were asked to describe how they fulfill their responsibilities. Since e-learning requires
independent work and self-discipline, students' studying habits were investigated within the scope of the study. This
question was answered by 6265 students (96%), with 236 responses missing. 2886 students (46%) said that they
fulfill their responsibilities before the due date, 2422 (39%) said that they usually fulfill their responsibilities on time,
and 957 (15%) said that they need to be reminded in order to fulfill their responsibilities.
Students were also asked whether they often need to be reminded of homework due dates by their lecturers.
6303 (97%) of the 6504 students answered this question: 3431 students (55%) answered that they rarely need to be
reminded, 2224 (35%) that they sometimes need to be reminded, and 648 (10%) that they need to be reminded of
homework due dates by their lecturers. The students were then asked to compare the study time needed for a
traditional face-to-face course and a web-based distance education course. 6243 (96%) of the 6504 students answered
this question: 2690 students (43%) answered that the study time for a course given on the Internet is less than for a
traditional course, 1795 (28%) that the two are the same, and 1758 (28%) that the study time for an Internet course is
greater. Finally, the students were asked to describe themselves as readers. This question was answered by 6336
students (97%), with 167 responses missing. 4688 students (73%) answered that they are good readers who do not
need help from someone else to understand what they read, 1480 (23%) that they are moderate readers who
sometimes need help to understand, and 169 (4%) that they are poor readers who often need help to understand.
Expectations from E-learning
The students were asked whether they would like to obtain a second diploma, a master's degree or a
certificate after graduation. Most respondents (66%) said "yes", 1040 students (21%) were undecided, and 640 (13%)
said "no". The students who wanted a second diploma, master's degree or certificate after graduation were then asked
which program they would prefer. 3852 students (59%) answered this question: 1068 students (28%) said that they
want to attend an "Economics, Administrative Sciences, Finance" program, 1057 (27%) "Information and
Communication Technologies", 799 (21%) "Other", 647 (17%) "Education", and 282 (7%) a "Mechatronics" program.
The students were asked whether they could come to campus for exams and laboratory work if they entered
a program after graduation. 4293 students (66%) answered this question: 2102 students (49%) said that they could
come to campus at any time, 1110 (26%) said that they could participate in exams and laboratory work if the
laboratory were open during weekends or at night, and 1081 (25%) said that they would have difficulty coming to
campus even on weekends or at night.
The students were also asked what type of learning environment they would prefer if they entered such a
program. 4282 students (66%) answered this question: 2376 students (56%) said that they would prefer a mixture of
traditional and online instruction, 1387 (32%) preferred traditional programs, and 519 (12%) online programs.
Conclusion
The results of the study show that most Turkish students want to attend a second certificate or diploma
program, not only during their undergraduate education but also after graduation. It can be concluded that most
students want to continue improving themselves after graduation. In another study, both traditional higher education
graduates and distance education institution (DEI) graduates stated that they preferred to continue their education
with a DEI (Rüzgar, 2004). It is therefore clear that Turkish students need distance programs, and it is important to
find out students' preferred educational methods, the programs they want to attend, and their characteristics. Most of
the students indicated that they do not want such a program delivered purely online or purely by conventional
methods; their responses showed that they prefer to receive education after graduation through a mixture of
conventional and online methods (blended learning). An earlier study also showed that Turkish university students
did not want to attend purely online education programs (Koçak & Kalender, 2002). Blended learning seems a more
appropriate method for Turkish students, because most of the students indicated that if the laboratory is not open on
weekends or at night, they cannot attend exams and laboratory work and would have difficulty coming to campus
during a second certificate or diploma program.
Generally, Turkish students prefer to attend "Information and Communication Technologies" and
"Economics, Administrative Sciences, Finance" programs during their undergraduate education, and they stated that
they also want to attend graduate-level programs in the same fields after graduation. Student readiness is an important
factor influencing achievement in such programs (Vonderwell & Savery, 2004). With regard to readiness,
characteristics such as self-regulation and self-efficacy are important issues that need to be discussed. An earlier
study found that self-efficacy is positively correlated with students' achievement in distance education (Ergul, 2004).
Although there is a large body of literature on the relationship between self-regulation and achievement, that study
found no relationship between self-regulation and achievement among Turkish university students. The present study
suggests that students' readiness in terms of self-efficacy does exist: most students stated that they do not need help
from someone else while reading, that they can fulfill their responsibilities before deadlines, and that they rarely need
to be reminded of homework due dates by their instructors. These are valuable results indicating that Turkish
students are self-regulated.
In the past, studies of Turkish university students and their online education preferences provided limited
information because of small sample sizes. In this study, undergraduate students from four major Turkish
universities participated as subjects. We now have a more complete picture of this population, and based on these
characteristics, policy makers can initiate new large-scale Internet-based distance education programs in Turkey.
References
Council of Higher Education (2003). Student statistics. Retrieved January 29, 2005, from
http://www.yok.gov.tr/hakkinda/2002-2003os.pdf
Ergul, H. (2004). Relationship between student characteristics and academic achievement in distance education and
application on students of Anadolu University. Turkish Online Journal of Distance Education, 5(2).
Knupfer, N., & McLellan, H. (1996). Descriptive research methodologies. In D. Jonassen (Ed.), Handbook of
research for educational communications and technology. New York: Macmillan.
Koçak, A., & Kalender, A. (2002). Distance education within a campus: Case of Selcuk University. Turkish Online
Journal of Distance Education, 3(3).
Ruzgar, S. N. (2004). Distance education in Turkey. Turkish Online Journal of Distance Education, 5(2).
Tubitak (2004). National science and technology policies: 2003-2023 strategy document. Retrieved January 29,
2005, from http://vizyon2023.tubitak.gov.tr/strateji_belgesi-v211.Pdf
Vonderwell, S., & Savery, J. (2004). Online learning: Student role and readiness. Turkish Online Journal of Distance
Education, 3(3).
Yalabık, N. (2004). E-campus project. Distance Education Workshop, Mersin University.
Developing Pedagogical Technology Integration Content Knowledge
in Preservice Teachers: A Case-Study Approach
Laurie Brantley-Dias
Mary B. Shoffner
Wanjira Kinuthia
Georgia State University
Abstract
This paper reports selected findings of a larger study examining the effects of case-based instructional
strategies on the development of Pedagogical Technology Integration Content Knowledge (PTICK) in alternative
teacher preparation students. This study is part of the Crossroads Project funded by the Preparing Tomorrow’s
Teachers for Using Technology (PT3) grant from the United States Department of Education. Sixty students
completed a 6-week course in technology integration in teaching methods at a large southeastern university.
Content analysis was used to examine student data: case study analyses and reflections. Student technology skills
and demographics were also considered. This paper will discuss initial results and implications for using case
studies with alternative teacher preparation students in order to develop PTICK prior to field experiences.
Introduction
According to Shulman and Shulman (2004), accomplished teachers are those who belong to a professional
community, possess a vision, have motivation to act, know what to teach and how to teach it, reflect and learn from
experience. Designing instruction is at the heart of teaching. This is a complex, intellectual process involving the
application of learning theories, design principles, communication channels and decision-making processes to solve
ill-structured problems. By nature, ill-structured problems contain ill-defined elements, vague goals, multiple
solution paths and evaluation criteria, and unique attributes that require teachers to make judgments about the
problem, pose solutions and, when necessary, defend their decisions (Jonassen, 1997). Designing reform-based,
technology-integrated lesson plans is particularly challenging for pre-service teachers, who lack the content knowledge,
pedagogical content knowledge and pedagogical expertise of their more experienced colleagues. Thus, teacher
educators face a challenge when it comes to preparing the best possible teachers.
The journey from novice to expert is not one that results directly from instruction but rather from
professional maturation and experiences (Lave & Wenger, 1991). According to Dreyfus and Dreyfus (1986), a
person travels through five stages on the way to expertise: novice, advanced beginner, competence, proficiency and
expertise. It is generally accepted that expertise is acquired through much longer exposure to content than one
course could provide. For alternative teacher preparation students, the learning time is particularly condensed. With
this comes an increased sense of urgency to move from the novice stage to the advanced growth stage. Therefore,
how can an introductory technology integration course provide opportunities for alternative teacher preparation
students, whose classroom placement is immediately pending, to develop problem-solving skills more in keeping
with those of an expert?
Theoretical Framework
Shulman (1987) proposed that there are seven categories of knowledge that underscore teachers’
knowledge base for effective teaching: content, pedagogical, curriculum and pedagogical content knowledge (PCK),
in addition to knowledge of learners, educational contexts and educational purposes. Of these, PCK is perhaps the
most influential in redesigning teacher education courses and programs (see NCATE Unit Standards). According to
Shulman (1986), pedagogical content knowledge (PCK) is a specific category of knowledge “which goes beyond
knowledge of subject matter per se to the dimension of subject matter knowledge for teaching” (p.9). It is the
teachers' ability to identify learning difficulties and students' misconceptions, combined with the fluidity to transform
subject matter using “the most powerful analogies, illustrations, examples, explanations, and demonstrations—in a
word, the ways of representing and formulating the subject that makes it comprehensible for others” (Shulman,
1986, p. 9).
The research community has blurred the boundaries of PCK and reconceptualized it in a variety of ways
(Loughran, Mulhall & Berry, 2004; VanDriel, Verloop & DeVos, 1998). As a means to better identify “true
technology integration,” Pierson (2001) used the concept of PCK along with technology knowledge, which she
defined as “basic technology competency…[and] an understanding of the unique characteristics of particular types
of technologies that would lend themselves to particular aspects of the teaching and learning process” (p.427). She
characterized technological-pedagogical-content knowledge as the intersection of knowledge in the areas of content,
pedagogy, technology, and pedagogical content.
The authors of this paper feel that pre-service teachers need not only procedural, conceptual and
pedagogical content knowledge, but also reflectivity and community development, specifically as these relate to
technology integration: pedagogical technology integration content knowledge (PTICK). PTICK contains five
dimensions: technical procedural knowledge (knowing and being able to operate the technology), technology
integration conceptual knowledge (theories behind effective uses of technology for teaching and learning),
pedagogical content knowledge (knowledge and ability to transform subject matter content for learners' needs),
reflective knowledge (metacognitive abilities to reflect, problem-solve and learn from experience), and community
knowledge (knowledge and ability to develop a community of learners in the classroom as well as to participate in a
professional learning community).
As part of this knowledge base, pre-service teachers should have cases or scenarios of exemplary
instructional products and solutions upon which to draw. Field experiences during the pre-service teachers’
education program often provide a context in which to apply these skills and to develop such scenarios. However,
these opportunities may not be available to students until later in their course work and opportunities to integrate
technology into their teaching may be limited by a variety of factors in their field placements.
One way to mediate the “theory to practice gap” and promote the development of PTICK is to infuse
teacher and technology courses with a problem-centered approach via cases. Problem-centered instruction
encompasses many forms: problem-based, case-based, action, project-based, question- or issue-based learning and
goal-based scenarios (Duffy & Cunningham, 1996; Jonassen, 2000). Almost 20 years ago, Shulman (1986)
advocated the use of cases in teacher education in order to develop pedagogical content knowledge. Merseth (1996)
has documented the trends of case-based pedagogy in teacher education programs. In pre-service teacher
preparation, cases have been used to teach pre-service teachers a variety of skills from adapting instruction for
limited English proficient students with disabilities (Andrews, 2002); to reflecting on instructional practices
through multi-media cases (Hewitt, Pedretti, Bencze, Vaillancourt & Yoon, 2003); to exploring biases and beliefs
related to race, gender and culture (Shulman, 1992); to developing formal and practical knowledge (Lundeberg &
Scheurman, 1997).
Guiding Question
How does analyzing cases affect pedagogical technology integration content knowledge?
The alternative teacher preparation program admits students one time per year (in the summer), and students move
through the program in a cohort. Enrollment is limited to 50
students or less each year in each of the content fields. Currently, students may complete the alternative preparation
program in Language & Literacy, Mathematics, Science, Social Studies, or Middle Childhood Education. Additional
alternative preparation programs in Reading Education and Teaching English as a Second Language are currently
under review at the state level.
While in the past, the TEEMS program of study integrated technology throughout the students’ program of
study, pressure from within the College of Education, and from state and national professional organizations and
accrediting agencies urged the program faculty to consider a more standards-driven, assessment-oriented path. All
five TEEMS programs of study include a revised version of the IT 7360, Technology for Educators, course. The
course addresses the National Educational Technology Performance Profiles for Teachers, supports all six of the
National Educational Technology Standards for Teachers (NETS-T), and contributes to student understanding of the
INTASC Standards. In addition, the course makes use of assessments that have been directly mapped to the NETS-T
standards. The effectiveness of the IT 7360 courses has been empirically documented in a number of studies (see
Dias & Shoffner, 2003; Shoffner & Dias, 2003; Shoffner, Dias, & Thomas, 2001; Shoffner & Dias, 2001; Shoffner,
2000). Because TEEMS students are initial-preparation graduate students, it was necessary to adapt the in-service
course, IT 7360, Technology for Educators, to meet their pre-service and content needs. The course and its related
resource-laden web site incorporate a problem-centered, activity-based approach in which the technology is anchored
in authentic and familiar contexts where teaching and learning occur (Cognition and Technology Group at
Vanderbilt, 1991; Vygotsky, 1978). The online learning environment can be accessed at
http://msit.gsu.edu/IT/Teachers
While introducing and reinforcing technology integration skills, the focus of the course is on teaching and
planning methods for the K-12 technology-enhanced learning environment. In the Technology for Educators course,
the technology is immersed in learning about what being an educator entails: planning, learning theory, instructional
strategies, classroom management, and assessment. Throughout the course, pre-service teachers demonstrate their
technology integration skills in a variety of activities which focus simultaneously on both what they can do with the
technology, personally, and their ability to plan for their students to meet curriculum requirements while making use
of a variety of technologies. Case studies were added to the course in fall 2004 in an effort to develop pedagogical
content knowledge and PTICK for the alternative teacher preparation students.
Methods
This research used mixed-methods within the context of an exploratory multi-case study. As suggested by
Yin (2003), the case study design is an appropriate way to investigate the causal links and the context relating to an
intervention. It is also useful when there is little or no control over the behavioral events. The units of analysis are
each of the three sections of IT 7360, Technology for Educators.
Data Collection
Data collection occurred throughout the six-week summer course. A variety of data sources were gathered
for the larger study including reflection papers, problem-based or case-based analyses, pre and post skill surveys,
three Technology Integration Planning and Skill samples (TIPS) (concept maps, webpage creation, and databases)
and course-end electronic portfolios. For this portion of the study, the researchers examined the case-based
analyses and the reflection papers of two of the three class sections.
One section of English Education students and the section of Science Education students analyzed cases
from Educational Technology in Action: Problem-based Exercises for Technology Integration (Roblyer, 2004).
Expectations for case discussions were provided and modeled prior to commencement of data collection as follows:
first, participants reviewed assigned cases and individually responded to specific questions at the end of each case
set. Next, they met online in teams of four to five to discuss the assigned cases. Each team then submitted a group
report based on their discussions. Finally, each student submitted an individual reflection on each case based on
initial responses and group output. In addition to one practice case, participants analyzed three more case sets during
the course. Except for the practice case (only formative feedback was provided), all others including the reflections
were scored and returned to the students.
As a control, the remaining English Education section analyzed the problem-based exercises from the same
text. Like the other sections, students were guided through a practice set of problem-based exercises. These were not
scored but feedback was provided. They were required to complete selected exercises from each chapter; however,
they did not discuss these in groups nor reflect on the experience. On a few occasions, the instructor led short (15-20
minute) in-class discussions about the assignments after they were graded.
Course reflection papers were collected and scored at the beginning, mid-point, and end of the course for
all sections. These reflection papers differed from the case analysis reflections. They contained five or six guiding
questions about course expectations, preparedness to use and integrate technology, beliefs about technology
integration and perceived learning gains.
Data Analysis
Data analysis for this paper included case-based analysis and reflection papers from students in one English
Education section and the Science Education section. Additional data analysis on remaining data sets is in process.
Researchers met bi-weekly and weekly to discuss analysis and to develop a common codebook. Survey data was
collected and analyzed in order to provide a snapshot of participants’ characteristics. The researchers used content
analysis (Merriam, 1988) to categorize concepts and ideas which students presented in their case analyses as well as
their reflection papers. Within-case analysis is currently underway. Once completed, the researchers will employ
cross-case analysis. Inter-rater reliability is being established and a common codebook has been developed thus far
for the case-based analysis and reflection papers. Additional codes will be added as the remaining data sets are
analyzed. Multiple data analysis strategies have been considered. Initially, the researchers considered analyzing data
chronologically in order of submissions throughout the course to determine participant trends. This strategy was put
aside when the researchers, in a weekly meeting, determined that some assignments sought different affects and/or
different cognitive tasks. The second analysis strategy was to focus on the case-based sections only at first, and to
analyze by artifact; namely all of the cases, then all of the reflection plans, then all of the lesson plans, then all of the
portfolios, still within case, by researcher, constantly updating the codebook via a database. TIPS samples
(technology-related products and technology integrated lesson plans) will be evaluated for technical skills aligned
with the National Educational Technology Standards for Teachers (NETS-T). Researchers scored course-end e-
portfolios, which contain unit and lesson plans, using a rubric based on NETS-T and will analyze them for evidence
of PTICK via content analysis (Merriam, 1988).
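The paper does not state which agreement statistic is being used to establish inter-rater reliability. As one common
possibility only, the Python sketch below computes Cohen's kappa for two coders applying a shared codebook to the
same excerpts; the code labels and example data are hypothetical.

    from collections import Counter

    def cohens_kappa(codes_a, codes_b):
        """Cohen's kappa for two equal-length lists of categorical codes."""
        n = len(codes_a)
        observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
        freq_a, freq_b = Counter(codes_a), Counter(codes_b)
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / (n * n)
        return (observed - expected) / (1 - expected)

    # Hypothetical codes assigned by two researchers to the same ten excerpts.
    coder_1 = ["PCK", "reflective", "community", "PCK", "technical",
               "PCK", "reflective", "community", "PCK", "technical"]
    coder_2 = ["PCK", "reflective", "community", "PCK", "reflective",
               "PCK", "reflective", "community", "PCK", "technical"]
    print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")  # about 0.86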
During each analysis phase, the researchers are examining the data for discrepant evidence and rival themes in order
to assure the rigor of the analysis. Member checking (Lincoln & Guba, 1985) is being used to verify the data and
validate the findings. Triangulation within and between data sources provides a holistic picture of the phenomenon
and corroborating evidence (Creswell, 1998) as findings emerge.
Preliminary Results
Several trends have emerged from the data analysis thus far that address the question of how analyzing cases
affects pedagogical technology integration content knowledge. We have organized the preliminary results into three
sections: the first two, TEEMS Science and TEEMS English, are derived from the case analysis data; the third,
Course Reflections, is derived from the three reflection papers (initial, mid-term, and final). Predominant themes in
each section are discussed in the following analysis.
TEEMS Science
Technology Integration in Future Practice
Initial data analysis reveals that the preservice science teachers were beginning to think about and make
connections to current and future applications of technology integrated into instruction based on the case readings,
as shown by statements they made in their case reflections. JG's comment is typical:
I think I would like to incorporate this type of inquiry-based learning into my future classroom. I
think it is important for students to apply what they learn and connect it to real world problems
(PR).
The preservice teachers' statements were also embedded with personal opinions, views and varied reactions to the
cases, as students began to see themselves in the position of the case-study characters, as illustrated by one preservice
teacher’s reaction:
I think Leroy is right to adopt the new [project-based science simulation] software. I did not think
it made much sense for Leroy to take the software to his district coordinator for her opinions.
Were I in Leroy’s position, I would only present my new software to an administrator if I expected
the school to pay for it (JG).
Initially, some participants were not able to make connections with the cases. For instance, DM indicated the
following:
The process of this case analysis did not help me develop my knowledge about teaching and
technology integration. It did not because I found the case very boring and terms were used that I
was not familiar with.
Merits and Limitations of Using Technology
There was general agreement among the preservice science teachers that technology is beneficial for
student learning, as noted in the following reflection:
To help integrate technology into my classroom I will have online quizzes. Furthermore, I would
like for my kids to participate in Global Classroom [projects]. Global Classroom is [a website
where] classes form all around the world are brought together on the Internet. This will allow my
students to see how other students are learning the same information (DM).
However, they also recognized the limitations that technology and technology integration may present:
Potential problems that could arise might include that the software will be too difficult for my
students to use; I couldn’t get the resources that I need to use; [I wouldn’t have access to] the
software as frequently as I wanted to, or that the software wasn’t reliable. Also, the students could
get caught up in having fun with the program and loose sight of the academic purpose (SM).
In addition to the situations presented in the cases, the preservice teachers had personal experiences with technology
during their group discussions in chat rooms, in which they expressed frustration with the technology and noted its
limitations for their own learning:
The chat function was difficult with 4 people. It was hard for people to respond to the question and then
respond to each other. If there was a pause, people thought that the discussion was over and tried to move
on while someone was writing something. Despite that, we did have a good discussion (LE).
Impact of Group Discussions
Ironically, while some preservice science teachers did not think that the individual case analysis process
was helpful and commented in class that the assignment was "busy work", most participants indicated in their case
reflections that analyzing and discussing the cases in groups was beneficial. For example, PR
wrote:
Overall, I feel that our group chat went very well. We each came to the session with different ideas
and backgrounds, and listening to other people's points of view was very informative. I found that
everyone in my group thought about at least one aspect of the scenarios in a way that had not
occurred to me. I think that bouncing ideas off one another is a great way to increase our own
learning.
The discussions fostered a learning community that enabled the preservice teachers to confirm and challenge their
ideas and beliefs about teaching and technology.
PTICK Development
In the case responses, preservice science teachers displayed some aspects of PTICK, especially with regard
to content knowledge, pedagogical knowledge, and pedagogical content knowledge. However, integration of all
three concepts was not seamlessly evident in their responses. For example PD writes:
In model building, the students may gain a good idea of the physical descriptions of the plants
they’re studying, but this interactive software would help them to learn not only what the planets
look like, but what substances they are made of and what kinds of climates they have.
In this response, the participant directly addresses science related content with some understanding of model
building, which falls under pedagogical knowledge. LE also demonstrates the same when she writes:
Leroy sees that this [problem-based simulation] software will help his students learn scientific inquiry by
engaging them in an exciting, imaginary story. They will learn to solve problems and apply the skills
rather than learning a step by step method to approaching science problems…. Based on this scenario, I
would try to integrate technology particularly into areas that are difficult to teach, or in areas where
students don’t seem to be getting much out of the traditional methods.
The case analyses suggest that the preservice science teachers' knowledge of using the software and of pedagogical
approaches, as well as their pedagogical content knowledge, is developing into PTICK.
TEEMS English
Technology Integration in Future Practice
The case analyses and case reflections enabled the preservice English teachers to imagine how they might
integrate technology and various instructional strategies into their lessons for future implementation as suggested in
the following reflection:
This scenario… was a perfect example of collaborative teaching across the curriculum. I would
love to tie in a unit plan with a history or social studies class; I think it would enhance the
students’ learning. Utilizing the Internet to help students to experience the country and the
language is the most logical way to layer the learning for students (MS).
Their commentary also revealed concern about keeping students and the lesson focused on subject matter content
and not allowing the technology to overshadow the curriculum objectives. While reflecting on case 3, LG
discussed plans to have her future students use the Internet for communication and research while creating writing
artifacts with publishing software. She stated:
“If I [were to] see that the daily work that students are doing continue to support my objectives,
then my technology integration is probably working. If I see that they are spending too much time
trying to lay out a brochure or build a website, then they are most likely not thinking too much
about what I am wanting them to really learn—the content of the lesson” (LG).
Merits and Limitations of Using Technology
As MS’s comments above about using the Internet and its resources “to layer the learning for students”
reveal, the participants generally thought that technology could be beneficial for student learning. This was also a
topic in their group discussions:
The group discussion regarding Mort and Chloe’s problem with publishing a literary magazine
brought up [the following question]…will [the publishing software] cause the students to like
poetry more? …One point we agreed on was that if the actual task of publishing the magazine is
fun, more students will want to be involved. As part of the assignment, students had to create their
own page using PageMaker. Again if this is fun and exciting for students, they will want to read
more poetry to find the “perfect poem.” As a result, the students may read more poetry overall
than they did in the past. We decided that although PageMaker directly can’t make students enjoy
poetry, it indirectly may have an impact (MS).
Although the preservice English teachers noted the merits of the technologies presented in their case studies, some
included cautionary comments such as those by EJ:
The solution [to use PageMaker to create the eighth-grade literacy magazine] certainly solves the
problem of ease and man-hours. It does not, however, guarantee that the students will like and
appreciate poetry. Technology is not going to do that. A good teacher will.
Reflecting further on the case analysis, she added,
I think the teachers have to be careful not to get trapped into thinking that their uninterested
students are going to suddenly become literary geniuses because of a little technology (EJ).
Other issues such as access and students’ technology skills also surfaced:
My problems that I foresee [when integrating technology] come from the students’ fluency with
the technology and the readily availability of the technology so that students can get the most out
of the technology (CB).
These preservice teachers made plans for future technology integrated lessons; nevertheless, while
weighing the pros and cons of technology integration presented in the cases, they demonstrated more
concerns about technology and its use as the semester progressed and the main case story developed.
Impact of Group Discussions
One preservice English teacher reflected that the group chat helped her explore facets of the problem she had
initially closed off:
… as I might have because I had sort of placed these characters in rather rigid roles which weren't
conducive to exploring all the facets of the problem. Certainly in the area of solution building,
[the chat discussion] was extremely beneficial.
The chat discussions not only served to foster professional interactions, but also provided an avenue for reflection
and refinement of ideas as suggested in CB’s thoughts:
I think it has been this [Case 3] discussion and only this discussion that has broadened my mind to
the problems and advantages to technology integration. Just discussing these problems allowed
me to see flaws in my own ideas. I think one of the best ways to learn about technology
integration is talking with current and preservice teachers about their ideas. Not one of us should
go at this alone.
PTICK Development
Perhaps the most important aspect of instructional technology at the preservice level is the connections that
students are able to make between the use of technology tools, instructional strategies, content and community
building for the purpose of enriching the learning environment. Evidence typical of this early PTICK development
is seen in the following preservice English teacher’s case analysis reflection:
I really like the idea of integrating a pen pal into my classroom. I think the idea of teaching a
foreign experience along with the literature is a great idea. For example, in my [course-end
electronic portfolio], I am teaching The House on Mango Street and this pen pal creative activity
[described in the case study] could enhance my students’ learning experience because they can
learn about the Mexican community through students living in Mexico. Even though I can do a
webpage lesson on Mexican culture, I believe that students will be able to better understand their
heritage, culture, and ideals through a live interaction with students from that country (SG).
In this response, the preservice teacher demonstrates her PTICK development as documented in her growing
pedagogical content knowledge to teach literature, reflective knowledge as seen in her application of ideas from the
cases to her own curriculum development, and community knowledge through developing information exchanges
between her students and those from a Mexican community.
Course Reflections
The reflection paper questions were intended to reveal preservice teachers’ beliefs about technology
integration and their ability to integrate technology into their content areas as well as issues related to the course
itself. Consistent with the questions that the preservice teachers answered, their responses reflected the following
themes: an increase in self-efficacy and technology access concerns.
Increase in Self-Efficacy
As the course progressed, the reflection papers indicated that most of the science and English preservice
teachers became more confident in their abilities to use and integrate technology as well as problem solve. AR, a
science education TEEMS student, demonstrated this transition between the initial and final reflection papers in the
following comments:
I came into this class with a low level of technical skills, having only used word, excel and
publisher…. As it is, I do not feel that I have mastered any of these technologies, and will still
need to spend a lot of time to develop my skills and increase my knowledge.
By the end of the semester he wrote:
I definitely feel more prepared to enter the classroom and use these technologies than I did
previously… (AR).
Some of the English preservice teachers also indicated that they were becoming more self-directed:
Because of this course, I am finding myself experimenting with software on my own to learn
more. This course has taught me to be self-sufficient and adapt my lessons around what is
available, and that is an essential tool (SG).
Technology Access Concerns
Several preservice teachers worried about the level of technology access they would have in their future schools, as
AM anticipated:
… access is going to be limited….I have to go with my past experiences and I'd say that it is unlikely
that I will be able to implement even half of what I’ve learned (AM).
Thus, some participants tended towards an external locus of control as it related to having access to technology. On
the other hand, a couple of preservice teachers suggested that they would seek out positions where technology was
more plentiful:
It is difficult to predict what technology a school will have available. I think that this will be a
factor in my job search. In other words, all things being equal, I will choose a school with greater
technological resources available over one with less (LE).
Discussion
Early results suggest that, because the participants are alternative teacher preparation students who already
possess a four-year degree in their content field, their content knowledge was high, as would be expected. At first,
this led to inflated self-efficacy in other areas, including technology knowledge; however, the initial reflection papers
indicated that they did not feel capable of integrating technology. Evidence of reflective knowledge was seen early
on: the use of cases was initially well received, but it was soon viewed as busy work as students' sense of self-efficacy
diminished when they realized at mid-course how much they did not know about PCK and PTICK. In particular,
students resented having to analyze the cases individually prior to meeting for group analysis, either because of the
lack of time imposed by a heavy course load or because they lacked confidence in their responses.
However, post-group reflection indicates students felt they gained greatly from the group analysis of cases
(community knowledge). Their final reflection paper comments also documented growth in self-efficacy for
integrating technology and problem-solving. Preservice teachers' perceptions of barriers to technology integration,
such as access to technology and students' technical skills, were an unexpected finding in the course reflection papers
and case reflection commentaries. These concerns did not surface until later in the semester, after the students had
engaged in the second of four case exercises; we attributed this to the implementation of cases, since anecdotal
evidence from students' reflection papers in previous courses did not indicate this trend. The following elements of
PTICK were evident in the preservice teachers' case analyses and reflections and thus demonstrate growth in these
areas: pedagogical content knowledge, reflective knowledge, and community knowledge. The remaining PTICK elements,
technical procedural knowledge and technology integration conceptual knowledge, should be more evident in the
course-end portfolio, TIPS samples and their corresponding lesson planning documentation.
Given the student reactions generated from the case analyses, PTICK development is a necessary component of teacher
preparation and a research focus worthy of increased attention.
References
Andrews, L. (2002). Preparing general education pre-service teachers for inclusion: Web-enhanced case-based
instruction. Journal of Special Education Technology, 17(3): 27-35.
Allinder, R.M. (1994). The relationship between efficacy and the instructional practices of special education
teachers and consultants. Teachers Education and Special Education, 17, 86-95.
Cognition and Technology Group at Vanderbilt. (1991). Technology and the design of generative learning
environments. Educational Technology, 31(5), 34-40.
Coladarci, T. (1992). Teachers’ sense of efficacy and commitment to teaching. Journal of Experimental Education,
60, 323-337.
Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing among five traditions. Thousand Oaks,
CA: Sage.
Darling-Hammond, L., Chung, R., & Fellow, F. (2002). Variation in teacher preparation: How well do different
pathways prepare teachers to teach? Journal of Teacher Education, 53(4), 286-302.
Dias, L.B., & Shoffner, M.B. (2003). Technology integration practices and beliefs among first-year middle grades
teachers. In D.A. Willis and J. Price (Eds.) Society for Information Technology and Teacher Education
Annual, Vol. 1 (pp. 2191-2198). Norfolk, VA: Association for the Advancement of Computing in
Education.
Duffy, T. M., & Cunningham, D. J. (1996). Constructivism: Implications for the design and delivery of instruction.
In D. Jonassen (Ed.), Handbook of Research for Educational Communications and Technology (pp. 170-
198). New York: Macmillan.
Dreyfus, H. L., & Dreyfus, S. E. (1986). Mind over machine. New York: Free Press.
Evans, E.D. & Tribble, M.N. (1986). Perceived teaching problems, self-efficacy and commitment to teaching among
preservice teachers. Journal of Educational Research, 80(2), 81-85.
Ferstritzer, C.E. (1998). Alternative teacher certification – An overview. Washington D.C.: National Center for
Education Information.
Glickman, C. & Tamashiro, R. (1982). A comparison of first-year, fifth-year, and former teachers on efficacy, ego
development, and problem solving. Psychology in Schools, 19, 558-562.
Guskey, T.R. (1984). The influence of change in instructional effectiveness upon the affective characteristics of
teachers. The American Educational Research Journal, 21, 245-259.
Hewitt, J., Pedretti, E., Bencze, L., Vaillancourt, B.D., & Yoon, S. (2003). New applications for multimedia cases:
Promoting reflective practice in preservice teacher education. Journal of Technology and Teacher
Education, 11(4): 483-500.
Jonassen, D. H. (1997). Instructional design models for well-structured and ill-structured problem-solving learning
outcomes. Educational Technology Research and Development, 45(1): 65-95.
Jonassen, D. H. (2000). Revisiting activity theory as a framework for designing student-centered learning
environments. In D. H. Jonassen & S. M. Land (Eds.), Theoretical foundations of learning environments
(pp. 89-121). Mahwah, NJ: Lawrence Erlbaum Associates.
Lave, J., & Wenger, E. (1991). Situated learning: legitimate peripheral participation. Cambridge, MA: Cambridge
University Press.
Lincoln, Y. & Guba, E. (1985). Naturalistic inquiry. London: Sage Publishing.
Loughran, J., Mulhall, P. & Berry, A. (2004). In search of pedagogical content knowledge in science: Developing
ways of articulating and documenting professional practice. Journal of Research in Science Teaching,
41(4): 370-391.
Lundeberg, M.A. & Scheurman, G. (1997). Looking twice means seeing more: Developing pedagogical knowledge
through case analysis. Teaching and Teacher Education, 13(8): 783-797.
Merriam, S.B. (1998). Qualitative research and case study applications in education. San Francisco: Jossey-Bass.
Merseth, K. (1996). Cases and case method in teacher education. In J. Sikula, Handbook of Research on Teacher
Education (pp. 102-119). New York: Simon & Schuster, Macmillan.
National Council for Accreditation of Teacher Education (2002). Professional standards for the accreditation of
schools, colleges and departments of education. Washington, D.C.: NCATE. Retrieved July 31, 2005 from
http://www.ncate.org/documents/unit_stnds_2002.pdf
Parkay, F.W., Greenwood, G., Olejnik, S. & Proller, N. (1988). A study of the relationship among teacher efficacy,
locus of control, and stress. Journal of Research and Development in Education, 21(4), 13-22.
Pierson, M. (2001). Technology integration practice as a function of pedagogical expertise. Journal of Research on
Computing in Education 33(4): 413-430.
Raudenbush, S., Rowan, B., & Cheong, Y. (1992). Contextual effects on the self-perceived efficacy of high school
teachers. Sociology of Education, 65, 150-167.
Roblyer, M.D. (2004). Educational technology in action: Problem-based exercises for technology integration.
Upper Saddle River, NJ: Pearson Prentice Hall.
Shoffner, M.B., & Dias, L.B. (2003). Assessing inservice educator performance of state and national technology
standards. In D.A. Willis and J. Price (Eds.) Society for Information Technology and Teacher Education
Annual, Vol. 1 (pp. 902-909). Norfolk, VA: Association for the Advancement of Computing in Education.
Shoffner, M.B., & Dias, L.B. (2001). On-line support and portfolio assessment for NETS-T standards in preservice
programs at a large southeastern university. In M.J. Simonson (Ed.), 23rd Annual Proceedings: Selected
Research and Development Paper Presentations at the 2001 Convention of the Association for Educational
Communications and Technology, Volume 1 (pp. 372-380). Ames, IA: Iowa State University.
Shoffner, M.B., Dias, L.B. & Thomas, C.D. (2001). A model for collaborative relationships between instructional
technology and teacher education programs. Contemporary Issues in Technology and Teacher Education
[online serial], 1(3). Available: http://www.citejournal.org/vol1/iss3/currentissues/general/article1.htm
Shoffner, M.B. (2000). Using a resource-based learning environment to foster self-directedness in preservice
teachers. In M.J. Simonson (Ed.) 22nd Annual Proceedings: Selected Research and Development Paper
Presentations at the 2000 Convention of the Association for Educational Communications and Technology.
Ames, IA: Iowa State University.
Shulman, J.H. (1992). Tender feelings, hidden thoughts: Confronting bias, innocence, and racism through case
discussions. San Francisco, CA: Far West Laboratory for Educational Research and Development. (ERIC
Document Reproduction Services No. ED349306)
Shulman, L.S. & Shulman, J.H. (2004). How and what teachers learn: a shifting perspective. Journal of
Curriculum Studies, 36(2):257-271.
Shulman, L.S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2): 4-14.
Shulman, L.S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review,
57(1):1-22.
VanDriel, J.H., Verloop, N. & DeVos, W. (1998). Developing science teachers’ pedagogical content knowledge.
Journal of Research in Science Teaching, 35(6): 673-695.
Vygotsky, L. (1978). Mind in society. Cambridge, MA: Harvard University Press.
Yin, R. (2003). Case study research: Design and methods (3rd Ed.). Thousand Oaks, CA: Sage Publishing.
Effect of Training Method on Performance and Computer Self-Efficacy
Lynette Brown
Tristan Johnson
Debra O’Connor
Mohammed Khalil
Xiaoxia Huang
Youngmin Lee
Miyoung Lee
Florida State University
Computer-based technology is rapidly becoming a part of the change that is taking place in the
organizational setting. According to Simon (2000), many companies have reported difficulty upgrading their
company's technology because of employee computer literacy problems. An important question for employers,
trainers, and instructional designers is which interventions will effectively support learners as they adopt these
innovations in the performance of their jobs.
The success of end-user computing has been facilitated by factors such as commitment and regular use of
the computer. In this context, end user computing refers to the use of personal computers to perform tasks such as
word-processing in contrast to computer programming or design. According to Kay (1990), cognitive attitude,
awareness, and knowledge of application software were found to be the best predictors of commitment to the use of
computers. Regular use of computers could be influenced by the availability of hardware, software, and training;
however, personal willingness was a priority factor that related to a person using the computer effectively (Kay,
1990).
Explanations for the effective use of computers have also been drawn from Social Learning Theory
(Bandura, 1986). Social Learning Theory posits that self-efficacy, a belief in one's capability to perform certain
actions, is a major determinant of choice of activities, degree of effort, period of persistence, and level of
performance. Bandura (1986) emphasizes that individuals’ self-efficacy should be examined in light of specific
targets of accomplishment, such as driving a car on a freeway (driving self-efficacy) or, in this instance, using the
computer to perform tasks (computer self-efficacy) (Compeau & Higgins, 1995).
Recent studies examining the integration of computer-based technology in the workplace (Cheney, Mann,
& Amoroso, 1986; Cronan & Douglas, 1990; Grover & Teng, 1994) identified training as a critical factor in the
success of end-user computing. Preparing the workforce to use computer technologies is a high-priority training
objective within organizations; yet, little evidence is available on the effectiveness of various approaches to
computer training (Gist, Schwoerer & Rosen, 1989). Simon (2000) indicates that the availability of a wide range of
training options and the lack of reliable indicators that predict trainee success compound the computer literacy
problems. Training methods that provide good conceptual models have been identified as variables that could affect
the success or failure of end-user computing (Cheney, Mann & Amoroso, 1986; Santhanam & Sein, 1994).
A limited number of researchers have examined the effects of alternative training methods (Chou, 2001;
Simon, 2000; Mitchell, Hopper, Daniels, George-Falvy, & James, 1994; Gist, Rosen & Schwoerer, 1989) on end-
user computing. In a field experiment involving 108 university managers, Gist et al. (1989) compared a modeling
method with a tutorial method and examined their effects on self-efficacy and mastery of a computer software
program. The researchers used video modeling as the principal means of instruction in one group; the tutorial group
used a one-on-one interactive tutorial diskette (visual instruction on the computer monitor) that presented the same
concepts with examples very similar to those presented in the modeling group. The dependent variables were
self-efficacy and mastery of a computer software program. Although the participants in the tutorial condition were
told what to do, there was no modeling. At the end of the training, both groups were stopped and given an identical
timed, objective performance test. As hypothesized, the behavior-modeling participants performed better than the
tutorial participants; however, contrary to expectations, the effect of training condition on computer self-efficacy
was not significant across the groups.
In another study examining the relationship of learning style and training method to end-user computer
satisfaction and computer use, Simon (2000) examined three training methods: 1) instruction, 2) exploration, and 3)
behavior modeling. Simon (2000) focused on determining the optimum method of training novice computer users.
The study involved four hundred members of the U. S. Navy. The results indicated that trainees whose learning style
matched training methodology were more successful in training outcomes, had higher computer satisfaction and had
higher levels of computer use. Participants in the behavior-modeling training method had the highest levels of
satisfaction and computer use. In yet another study, Chou (2001) compared the effects of training method and
computer anxiety on learner's computer self-efficacy and learning performance. Participants were 101 high school
students in 10th grade. Behavior modeling and an instruction-based method were used. The instructor and content
were the same for both methods, providing continuity throughout all training sessions, and the training was held in
the same room. Students were free to take notes and were encouraged to ask questions at any time during the
presentation. In the instruction-based condition, a lecture format was employed. In the behavior-modeling group,
students watched on their computer monitors as the instructor demonstrated examples and executed the
corresponding step-by-step procedures; the computer-driven demonstration was the principal instructional medium.
Several tasks were used to measure performance, and a five-point Likert-type measure of computer self-efficacy was
administered before and after the experiment. The results were consistent with earlier research on behavior
modeling, which found it superior to instruction-based approaches for computer learning performance and
self-efficacy (Compeau & Higgins, 1995; Simon & Werner, 1996).
Compeau and Higgins (1995) also used the modeling technique in a study aimed at understanding the
impact of a motivational construct, self-efficacy, on individual reactions to computing technology. Self-efficacy was
found to play an important role in shaping individuals' feelings and behaviors. Bandura (1986) indicates that
self-efficacy is derived from four information sources: guided mastery, behavior modeling, social persuasion, and
physiological states. The strongest source of information is guided mastery--actual experiences of success in dealing
with the behavior. The more successful interactions individuals have with computers, the more likely they are
to develop high self-efficacy. Hands-on practice is a key component of training, so that people can build their
confidence along with their skill.
Behavior-modeling is a task-focused method that involves visual observation of the behaviors of a model
performing tasks. The learners then imitate and extend the model’s behavior in guided practice and exploration. The
behavior-modeling method employs a hands-on demonstration approach to introduce new information or techniques
followed by a complementary lecture (Chou, 2001; Simon, 2000; Simon & Werner, 1996; Gist, Schwoerer, &
1989). Although behavior-modeling has been lauded as a successful training technique in several domains, e.g.,
supervisory skills training, interpersonal skill development and recently, computer related skills, researchers have
raised several concerns. Baldwin (1992) notes the paucity of recent research to improve or enhance the behavior
modeling components. Further, inconclusive results in attempts to assess outcomes (McGehee & Tullar, 1978;
Russell, Wexley, & Hunter, 1984) with respect to trainees' ability to generalize modeled skills to settings outside the
training context, warrant further empirical research (Baldwin, 1992). Other instructional methods have also been
found to contribute to computer related learning.
The instruction-based method is widely accepted and understood and is commonly used in training and
educational settings. The method has also been referred to in the literature as traditional-instruction (Chou, 2001).
The method has been found to be effective for all types of learning outcomes and is distinguished by its lecture
format (Simon and Werner, 1996). Another instructional method that might be effective for computer-related skill
development is the direct instruction method. Direct instruction has its theoretical origins in the behavioral family,
particularly in the thinking of training and behavioral psychologists. Direct instruction is highly structured, teacher
directed and controlled, and places the highest priority on the assignment and completion of academic tasks. The
teacher explains a new concept or skill and then has learners test their understanding by practicing under teacher
guidance, referred to as guided practice (Joyce & Weil, 1996; Rosenshine, 1995).
As noted earlier, several researchers have examined training methods with factors such as learning style,
self-efficacy, computer self-efficacy, computer anxiety, group-member modeling behavior and performance in
computer software training and suggest that further research is needed (Chou, 2001; Simon, 2000; Compeau &
Higgins, 1995; Gist, Schwoerer & Rosen, 1989). Computer self-efficacy, a motivational construct, has been found to
be associated with attitudes and computer performance. Computer self-efficacy is a judgment of one's capability to
use computers in the accomplishment of a task (i.e., using a software package for data analysis) rather than
component skills such as formatting a diskette or using a software feature to format a paragraph (Compeau &
Higgins, 1995). Positive results have been found in an examination of attitudes toward computer technology
(Delcourt & Kinzie, 1993), the early adoption of computer technology (Burkhardt & Brass, 1990), and enrollment in
a computer course (Hill, Smith & Mann, 1987). Positive results were also found in a study that examined training
performance in a computer course (Webster and Martocchio, 1992).
Several researchers have examined the predictor role of computer self-efficacy and found positive results.
Gist, Schwoerer & Rosen (1989) note that trainees with low self-efficacy might be expected to perform better in a
modeling training setting than in settings that constrain external attributions for training failures. For example, in
behavior- modeling learners may attribute their difficulties to the model’s rapid pace or the model’s failure to
provide sufficient explanation for each behavior. The varied approaches and results of the aforementioned research
highlight the need for further research examining factors affecting end-user computing in order to
overcome computer literacy problems.
The purpose of the current study was to conduct an examination that focused on training methods,
performance and computer self-efficacy. Rather than focusing on implementation of a computer system as in the
Simon (2000) study, the current research examined the effect of training method on performance and computer self-
efficacy, utilizing software designed for the performance of specific job-related tasks for law enforcement officers.
The results of prior studies using the behavior-modeling method and the widespread use of the instruction-based
method formed the basis for selecting these two methods for examination in the current study. As in the Chou (2001)
study, the materials were the same for both groups, with appropriate adjustments to allow for the modeling aspect in
the behavior-modeling group.
The instruction-based method consisted primarily of lectures. Using PowerPoint slides, the instructor
described key features of the software and its functionality. For example, the instructor described how the drop-
down screen feature of the software could be used to select a motor vehicle type (i.e., truck vs. minivan). The
behavior-modeling method concentrated on the idea of observing and doing. It consisted of modeling, instruction
(lecture format for key learning points augmented with computer-visuals and handouts), exploring and feedback.
Exploring allowed the learner greater control during practice, supplementing the trainer's step-by-step modeled
behavior, which the learner then imitated step by step. The trainer provided learner feedback
during learner practice and exploration (Simon, 2000; Chou, 2001).
Performance was measured by scoring the results of a completed law enforcement report. At the end of
training, each trainee was given a written scenario describing a vehicle crash. The scenario was the same for all
trainees. Using the computer software, participants entered the details of the vehicle crash. The output data referred
to as a crash report was used to measure performance. Measures of computer self-efficacy were obtained at the
beginning and at the end of the training. To measure computer self-efficacy, trainees completed a questionnaire, the
Computer Self-Efficacy Scale (Murphy, Coover, & Owen, 1989).
Two primary outcomes were expected in this study. Similar to findings from Chou (2001), it was expected
that trainees in the behavior-modeling group would perform better than trainees in the instruction-based group.
Trainees in the behavior-modeling group were also expected to have a
greater change in their computer self-efficacy than the instruction-based group.
Method
Participants
Participants in the study were 20 law enforcement officers from several police agencies. All participants
were males, average age 38. The participants’ work schedule dictated which workshop they participated in. Some
were instructed by their supervisor to attend one of the two sessions, which resulted in some attending a session on
their scheduled day off; others attended voluntarily. Participants were asked to agree to use the software subsequent
to the training to complete their law enforcement reports and provide feedback to the software development team
regarding any problems experienced using the software. Additionally, the participants were asked to give
suggestions that would improve the utility of the software for law enforcement officers. Lastly, participants agreed
to complete a confidential pre and post-test questionnaire to measure computer self-efficacy and a scenario
performance test using the software at the end of the training.
Independent Variable
The independent variable in this study was the training method. Two training methods were employed:
behavior-modeling and instruction-based. In the behavior-modeling session, participants watched the trainer model
how to complete specific sections of the law enforcement report using the software on a laptop computer. The
trainer used a scenario of an incident that would require completion of each of the seven sections of the report and
require the use of the key features of the software. The trainer emphasized key learning points during the
demonstration of the steps necessary to complete each section of the report. After each section was demonstrated,
participants were allowed time to execute the steps. Each participant was provided an identical scenario of an
incident requiring the completion of the report section covered. Upon request, the trainer provided feedback to
individual participants at their laptop station. The trainer provided group feedback by projecting on a screen the
correct response for that section. The trainer then proceeded to the next section using the same format until all seven
sections were covered. After all sections were completed, participants were given time to explore (unguided
practice) the software features.
In the instruction-based method, the trainer used PowerPoint slides to display each section of the report and
to discuss the features of the software applicable to completing each section. Unlike the behavior-modeling method
where the instructor entered data in each section of the form to demonstrate how to use the features of the software
to complete that section, the PowerPoint slides contained pre-entered data in each section of the form. After the
lecture on each section, each participant was provided a scenario of an incident requiring the completion of the
report section covered. The participants were allowed time to execute the steps necessary to complete the given
section using their laptops. Upon request, the trainer provided feedback to individual participants at their laptop
station. The trainer provided group feedback by projecting a PowerPoint slide of the correct response for that
section. As in the behavior-modeling sessions, the participants were allowed time to practice using the software to
complete any section of the report presented during the lecture.
Dependent Variables
Two dependent variables were examined in this study, performance and computer self-efficacy. To
measure performance, the participants were given a written scenario of a vehicle crash that required completion of
an entire law enforcement report. Both groups were given the same scenario. The results of the completed report
were used to measure performance. To complete the report, participants were required to enter data such as vehicle
identification number, vehicle type, drivers' license number, and cause of crash. Various features of the software
were used to complete the information. For example, the software allowed the participant to select the vehicle type
from a drop down menu. Participants in both groups completed the report using their laptops. The report consisted
of 177 items. Each item was assigned equal weight with a value of one. A perfect score was 177. The performance
test was given during the last hour of the session, thereby allowing the participants one hour to complete the report.
The participants were instructed to enter their ID numbers on the report and save the completed report on their
thumb drive. Each participant's report was saved from the thumb drive to the trainer's computer.
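To make the scoring scheme concrete, the following short sketch (hypothetical, not the authors' scoring program; the field names and values are invented for illustration) awards one point for each report item that matches an answer key, mirroring the 177-item, equal-weight scheme described above.

# Sketch of equal-weight scoring: one point per item that matches the answer key.
# Field names and values are hypothetical; the actual crash report contained 177 items.
def score_report(submitted, answer_key):
    return sum(1 for item, correct in answer_key.items() if submitted.get(item) == correct)

answer_key = {"vehicle_type": "truck", "drivers_license": "D1234567", "crash_cause": "failure to yield"}
submitted = {"vehicle_type": "truck", "drivers_license": "D1234567", "crash_cause": "speeding"}
print(score_report(submitted, answer_key))  # prints 2 (of 3) in this toy example; a perfect real score was 177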
Computer self-efficacy was measured prior to and upon completion of the training in both the instruction-
based and behavior-modeling training classes. To measure computer self-efficacy, a 32-item, 5-point Likert scale
computer self-efficacy questionnaire (Murphy, Coover, & Owen, 1989) which has been used by other researchers
(Chou, 2001) was used without modification for both the pre- and post-training computer self-efficacy measures. Participants
were not differentiated on the dimension of their pre-training computer self-efficacy.
At the beginning of training in both sessions, participants were allowed approximately 15 minutes to
complete the pre-training computer self-efficacy questionnaire. At the end of the session, participants were not timed
when completing the post-training computer self-efficacy measure. The questionnaire consisted of statements, each
preceded by “I feel confident,” such as: adding and deleting information from a data file; explaining why a program
(software) will or will not run on a given computer; troubleshooting computer problems; moving the cursor. The
highest attainable score for computer self-efficacy was 160 (32 items, with a maximum of 5 points per item on the
Likert scale). A higher
score indicated a higher computer self-efficacy rating. Efficacy levels were statistically analyzed for each group as
well as for each individual.
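As an illustration of the scoring arithmetic described above, the sketch below (hypothetical data, not the authors' analysis code) sums 32 Likert-scale item responses into a computer self-efficacy score, whose maximum is 32 x 5 = 160, and computes the pre-to-post change for one participant.

# Sketch of computer self-efficacy scoring: sum of 32 items rated 1-5 (maximum 160).
def cse_score(item_responses):
    assert len(item_responses) == 32 and all(1 <= r <= 5 for r in item_responses)
    return sum(item_responses)

pre_responses = [3] * 32   # hypothetical pre-training ratings -> score 96
post_responses = [4] * 32  # hypothetical post-training ratings -> score 128
change = cse_score(post_responses) - cse_score(pre_responses)
print(cse_score(pre_responses), cse_score(post_responses), change)  # 96 128 32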
Procedure
Training sessions were offered on two separate days. One session employed the behavior-modeling method
and one session employed the instruction-based method. The trainer was the same for both sessions. The software
was installed on each participant's laptop computer prior to the start of each session. To maintain confidentiality,
participants were assigned an ID number to be used instead of their name.
At the beginning of both the behavior-modeling and the instruction-based sessions, participants completed
a demographic survey and a computer self-efficacy pretest. At the end of each session, participants completed the
computer self-efficacy posttest questionnaire and a performance test. Participants were assured that their
performance would only be used for research purposes and not reported to their employer.
The performance test was given during the last hour of the class, thereby allowing the participants one hour
to complete the report. The participants were instructed to enter their ID numbers on the report and save the
completed report on their thumb drive. Each participant's report was saved from the thumb drive to the trainer's
computer. The reports were used later to measure their performance. Both groups were given training that was
identical in terms of the content but with variation in the training method used. Each of the sessions averaged 8
hours in duration.
Results
The data were analyzed using a one-way analysis of variance (ANOVA). Alpha was set at 0.05. The group
size was 10 participants per group; the statistical power for detecting a small difference between means was 0.1756.
Performance
Table 1 presents the means and standard deviations for each group on the performance test. Although the
behavior-modeling group performed slightly better than the instruction-based group, the results of the one-way
ANOVA revealed there was no significant difference in performance between the groups.
Table 1. Means and Standard Deviations of Performance Test Scores Across Groups
Computer Self-efficacy
The change in computer self-efficacy for the behavior-modeling and the instruction-based groups was
examined. Paired scores for the pretest and posttest were used to compute the difference between each individual's
pre- and post-training computer self-efficacy. The descriptive statistics presented in Table 2 indicate a
pretest-to-posttest change of 5.10 for the instruction-based group and 12.30 for the behavior-modeling group. An
analysis of variance was performed to test the hypothesis that the change in computer self-efficacy would be greater
for participants in the behavior-modeling group than in the instruction-based group. The ANOVA yielded an F-value
of 1.18, with a corresponding p-value of 0.293. The results indicate no significant difference at α = 0.05.
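To make the analysis concrete, the sketch below shows how a one-way ANOVA of this kind could be computed in Python with SciPy; the individual change scores are invented for illustration (only the group means of 5.10 and 12.30 follow the text), so the printed F and p values approximate rather than reproduce the reported ones.

# Illustrative one-way ANOVA on pre-to-post change scores (hypothetical data).
from scipy import stats

instruction_change = [-20, 25, -10, 18, 3, 30, -15, 12, 0, 8]   # mean = 5.1
modeling_change    = [-12, 35, 2, 28, -5, 30, 8, 20, -3, 20]    # mean = 12.3

f_value, p_value = stats.f_oneway(instruction_change, modeling_change)
print(f"F = {f_value:.2f}, p = {p_value:.3f}")
# With these invented scores the difference is not significant at alpha = 0.05,
# mirroring the pattern (though not the exact values) reported in the study.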
Discussion
The main purpose of the current study was to examine the effect of training method on performance and
computer self-efficacy. The most important finding of the present study was the absence of a significant difference.
Participants in the behavior-modeling method did not perform better than the instruction-based group. Therefore,
hypothesis 1 that participants in the behavior-modeling group would perform better than the instruction-based group
was not supported. This finding is not consistent with previous studies (Gist, Schwoerer & Rosen, 1989; Simon,
2000; Chou, 2001) that have indicated the behavior-modeling method results in better performance in computer
related training.
One important feature of the behavior-modeling method is that the learner is involved extensively during
the training with practical applications that are modeled. Also, extensive hands-on practice and feedback are
provided throughout with an emphasis on key learning points. The finding of no significant difference may be due to
homogeneity of the participants, participants' lack of interest in the training due to high prior computer experience,
and the length of the training workshop. A review of the demographic data of the 20 participants revealed that 13 of
the participants, when asked to rate their level of computer experience as beginner, moderate, or advanced, rated
themselves as advanced computer users. Both groups performed at less than an 80% accuracy level
on the performance test. The low performance may be a function of the "ceiling effect," referring to participants not
feeling challenged to perform a task with which they were already proficient. Further, the measurement of
performance was limited to one task in contrast to measuring a range of performance over time as in the study
conducted by Chou (2001). Further research with experienced computer users might include the performance of a
variety of tasks with varied levels of complexity.
The second hypothesis in the current study was that participants in the behavior-modeling group would
have a greater change in their computer self-efficacy. Several researchers have examined the predictor role of
computer self-efficacy and found positive results. Gist, Schwoerer & Rosen (1989) noted that trainees with low self-
efficacy might be expected to perform better in a modeling training setting than in settings that constrain external
attributions for training failures. For example, in behavior modeling learners may attribute their difficulties to the
model’s rapid pace or the model’s failure to provide sufficient explanation for each behavior. Computer self-efficacy
has been examined as an independent variable in several studies and found to be a precursor to computer use (Chou,
2001; Gist et al., 1989; Compeau & Higgins, 1995). The current study, however, examined computer self-efficacy
as a dependent variable. Measures were taken at the beginning and end of training. Further research should examine
computer self-efficacy subsequent to completion of training at various intervals in the job performance setting.
The length of the training workshop, too short for some participants or too long for others, might not have
been sufficient to effect a significant change in computer self-efficacy. For example, participants might not have had
sufficient successful interactions or hands-on practice, a key component of training that allows people to build their
confidence along with their skill. Further, judged against Bandura's (1986) assertion that self-efficacy is derived
from four information sources (guided mastery, behavior modeling, social persuasion, and physiological states), the
current examination omitted substantive considerations that should be included in future research in this area.
One specific weakness of the present study is the small sample size (n = 20). Further, multiple measures of
performance were not obtained, and participants were not differentiated on the dimension of their pre-training
computer self-efficacy or prior computer experience. Additionally, only males participated in the current study; both
genders should be included in future research.
Several practical implications arise from the findings. Prior research has indicated that performance and
computer self-efficacy vary with training method (Chou, 2001; Simon, 2000). Further research is needed to
determine the most effective training method. For example, would the performance of two groups of trainees in
two behavior-modeling sessions, one using videotaped models and the other using a live model, differ? To what
extent do variations within a method affect performance? For example, Gist, Schwoerer and Rosen (1989) used
video models and Chou (2001) used live models. Positive performance results were achieved in both studies;
however, a deeper examination of the method and its component parts would help researchers determine whether
the method is robust under different conditions. Additionally,
examining other methods such as direct instruction would offer further insights into the effectiveness of
alternative instructional methods to address computer literacy problems.
Lastly, in light of the growing use of teams in education, business and government settings, another fruitful
consideration is to examine the effects of learning as a team (Simon, 2000) and team performance. Teams are
using computer-related technology in multiple aspects of team dynamics.
References
Baldwin, T. T. (1992). Effects of alternative modeling strategies on outcomes of interpersonal-skills training.
Journal of Applied Psychology, 77(2), 147-154.
Bandura, A. (1986). Social foundations of thought and action. Englewood Cliffs, NJ: Prentice-Hall.
Burkhardt, M., & Brass, D. (1990). Changing patterns or patterns of change: The effects of a change in technology
on social network structure and power. Administrative Science Quarterly, 35, 104-127.
Cheney, P. H., Mann, R. I., & Amoroso, D. L. (1986). Organizational factors affecting the success of end-user
computing. Journal of Management Information Systems, 3(1), 66-80.
Chou, H. W. (2001). Effects of training method and computer anxiety on learning performance and self-efficacy.
Computers in Human Behavior, 17(1), 51-69.
Compeau, D. R., & Higgins, C. A. (1995). Application of social cognitive theory to training for computer skills.
Information Systems Research, 6(2), 118-143.
Cronan, T. & Douglas, D. E. (1990). End-user training and computer effectiveness in public agencies: An empirical
study. Journal of Management Information Systems, 6(1), 21-39.
Delcourt, M. A. & Kinzie, M. B. (1993). Computer technologies in teacher education: the measurement of attitudes
and self-efficacy. Journal of Research and Development in Education, 27(1), 35-41.
Gist, M. E., Schwoerer, C., & Rosen, B. (1989). Effects of alternative training methods on self-efficacy and
performance in computer software training. Journal of Applied Psychology, 74(8), 884-891.
Grover, V. & Teng, T. C. (1994). Facilitating the implementation of customer-based inter-organizational systems:
An empirical analysis of innovation and support factors. Information Systems Journal 4(1), 61-89.
Hill, T., Smith, N. D., & Mann, M. F. (1987). Role of efficacy expectations in predicting the decision to use
advanced technologies: The case of computers. Journal of Applied Psychology, 72(2), 307-313.
Joyce, B., & Weil, M. (1996). Models of teaching (5th ed.). Englewood Cliffs, NJ: Prentice-Hall.
Kay, R. (1990). Predicting student commitment to the use of computers. Journal of Educational Computing
Research, 6(3), 299-309.
McGehee, W., & Tullar, W. L. (1978). A note on evaluating behavior modification and behavior modeling as
industrial training techniques. Personnel Psychology, 31(3), 477-484.
Mitchell, T. R., Hopper, H., Daniels, D., George-Falvy, J., & James, L. R. (1994). Predicting self-efficacy and
performance during skill acquisition. Journal of Applied Psychology, 79(4), 506-517.
Murphy, C. A., Coover, D., & Owen, S. V. (1989). Development and validation of the computer self-efficacy scale.
Educational and Psychological Measurement, 49, 893-899.
Rosenshine, B. (1995). Advances in research on instruction. The Journal of Educational Research, 88 (5),
262-268.
Russell, J. S., Wexley, K. N., & Hunter, J. E. (1984). Questioning the effectiveness of behavior modeling training in
an industrial setting. Personnel Psychology, 37, 465-481.
Santhanam, R., & Sein, M. K. (1994). Improving end-user proficiency: Effects of conceptual training and nature of
interaction. Information Systems Research, 5, 378-399.
Simon, S. J. (2000). The relationship of learning style and training method to end-user computer satisfaction and
computer use: A structural equation model. Information Technology, Learning, and Performance Journal,
18(1), 41-59.
Simon, S. J., & Werner, J. M. (1996). Computer training through behavior modeling, self-paced, and instructional
approaches: a field experiment. Journal of Applied Psychology, 81(6), 648-659.
Webster, J., & Martocchio, J. J. (1992). Microcomputer playfulness: Development of a measure with workplace
implications. MIS Quarterly, 16(2), 201-226.
Sense of Community in the Multicultural Online Learning Environments
Saniye Tugba Bulu
Sanser Bulu
Texas A&M University
Abstract
Building a sense of community among learners is a necessary condition for both face-to-face and online
learning. With the increasing number of online courses in higher education, international students are sometimes
excluded from face-to-face interaction and need to participate online. The purpose of this study is to explore how
international students develop a sense of community in multicultural online learning environments.
Introduction
With developments in the Internet and communication technologies, the trend toward distance learning in higher
education has increased dramatically. Distance learning is evolving from being a special form of education using
nontraditional delivery systems to providing an important conceptual framework for mainstream education (McIsaac
& Gunawardena, 1996). Recently, much of the attention has been paid to the use of computer-mediated
communication (CMC) to facilitate teaching and learning in the online courses. As well as supporting individualized
learning, CMC, especially asynchronous communication technologies such as discussion forums, email, and bulletin
boards, have potential to support teamwork among distance learners (Benbunan-Fich & Hiltz, 1999; Harasim, Hiltz,
Teles, & Turoff, 1995). Because of their flexibility and independence, they are an important medium for creating
collaborative and cooperative online learning environments (McIsaac & Gunawardena, 1996).
Interaction between a social environment and an individual has always been emphasized as a critical factor
to facilitate meaningful learning (Dewey, 1916; Vygotsky, 1978). Building a sense of community among learners is
a necessary condition for both face-to-face and online learning. Recently, a sense of community in online learning and the
dynamics of online communities have become important topics. McMillan and Chavis (1986) defined a sense of
community as “a feeling that members have of belonging, a feeling that members matter to one another and to the
group, and a shared faith that members’ needs will be met through their commitment to be together” (p. 9).
Purpose of the study
This study is designed to explore the sense of community in multicultural online learning environments.
It aims to examine how international students develop a sense of community in online learning environments by
addressing the following question:
How do international students develop a sense of community in multicultural online learning environments?
Face-to-face settings
International students face both academic and social challenges in face-to-face settings, which consequently
make it hard for them to express themselves and affect the development of their sense of community.
Adjusting to different values is one of the main challenges that affect international students. Hofstede
(1997) outlined five dimensions on which national cultures vary: individualism vs. collectivism, femininity vs.
masculinity, long-term vs. short-term orientation in life, power distance, and uncertainty avoidance. Robinson
(1992) pointed out some values generally attributed to American academic culture, including individualism and
competition, equality and informality, pragmatism and reasoning style, and philosophy of knowledge and knowledge
ownership. Similar values also emerged in the findings of Parker's (1999) study of Taiwanese students. He found
that most of the students were not accustomed to exercising critical thinking skills and were more task oriented.
Additionally, he found that many of the students, coming from cultures where it is not appropriate to develop
relationships with faculty, were uncomfortable asking repeated questions even when they did not understand. He
concluded that many international students experienced culture shock because of their different academic and social
expectations.
Another challenge that affects international students is language. Many international students have been
found to have difficulty listening to and following extended lectures (McKnight, 1994; Parker, 1999). Moreover,
even when they understand spoken English, they may still have difficulty asking questions because of their
language limitations.
Online settings
There is a body of research showing advantages of cross-cultural group process in online settings.
Gunawardena et al. (2001) examined differences in online group process between Mexican and American
participants. Language, forms of language used, power distance in communication between teachers and students,
and collectivist and individualistic tendencies were found to be factors that affect online group process. Moreover,
she found that whereas Mexican participants felt that CMC equalized status differences, American participants were
more concerned about the lack of verbal cues. Ziegahn (2001) also examined how the online nature of a course
influenced students' reflection in multicultural settings. Because of the asynchronous communication, both students
and instructors had access to written intellectual and emotional connections, and students responded to others,
shared their experiences, and expressed new ideas. Warschauer (1996) studied differences between international
students' face-to-face and online discussions. He found that there was more equal participation among foreign
language students in online settings than in face-to-face discussions.
There are also studies that show disadvantages of online cross-cultural group process. Wilson (2001) found
that worldview, culturally specific vocabulary and concepts, linguistic characteristics of learner, and learner
motivation are the main obstacles that students face in multicultural online learning. Ku and Lohr (2003) also found
that international students have challenges in online settings because of their cultural differences in values, language
barriers, and learning format preferences. It is suggested that courses be designed in such a way that they provide
support from the instructor, offer appropriate strategies to assist international students, give resources and support
for their language problems, and assist their participation.
Methodology
The aim of the researchers was to gain a clear understanding of how individuals build a sense of community
in multicultural online learning environments. Therefore, in this study, qualitative research methods were employed
to collect and analyze the data, as qualitative studies hold that the meanings of participants' actions can only be
grasped in relation to the context in which they are involved (Merriam, 1998). The researchers conducted semi-
structured interviews with participants from different cultures about their experiences in the online courses.
Participants
The study looks at the sense of community from the perceptions of three international doctoral students, one
Chinese, one Mexican, and one Indian, at a large southern American university. When selecting participants, a
purposive sampling method was employed. All participants were experienced in online learning: the Mexican
participant had taken 3 online courses, the Chinese participant 6, and the Indian participant 7.
Instrumentation
The main data collection instrument was the semi-structured interview protocol. As stated in the literature
review section, Rovai (2002) identified four essential components of a sense of community in classroom settings:
spirit, trust, interaction, and learning. The interview questions were organized according to these components. The
interview protocol began with the background questions including participant's demographics and previous online
learning experiences. Other sections of the protocol included the questions about spirit, trust, interaction, and
learning.
Research Findings
Three findings emerged from this study (Figure 1). First, cultural differences resulted in cultural challenges
that made it difficult to build a sense of community in both multicultural face-to-face and online learning settings.
Second, international students faced more cultural challenges in the face-to-face than in the online settings, which
made it more difficult to build a sense of community. International students were able to solve their problems more
easily in the online settings through attributes of CMC including flexibility of time and written communication. Third, in addition
to the attributes of CMC, some other factors such as peer behaviors, instructor behaviors, and contextual
characteristics in online learning also affected their sense of community level.
Telling styles
All participants expressed that there are differences between their cultures and American culture in terms of
their telling styles. The Indian participant stated that her different telling style affected her interaction with others in
the community. She expressed that she was not comfortable and confident when communicating with Americans
because she is culturally different. She said, “It is very easy for them [Americans] to say no or yes, but for me,
coming from different culture, I hesitate to say yes or no.” And she added,
In my country whenever you say anything no to any person, you think you are offending him and you are
being rude if you say no. But I learned that Americans, they really don’t care about those touchy things. Most of
them are very clear about what they want to do.
The Chinese participant also talked about her telling style and said,
Usually, I don’t want to have complaint with my peers, and most of time, I am pretty easy going, maybe that would
be affecting my online course. I don’t attack any people’s idea, really I won’t criticize people’s posting. If I really
want, I would use very polite way.
Contrary to the other participants, the Mexican participant saw her telling style as more direct than that of Americans. She
stated, “as Mexican, that is not rude. I think Americans start something like ‘ok, it is a great piece of paper, but I
think you should do…’ Even if they don’t think it is a good piece of paper.”
Different values
The Indian and Mexican participants stated that having different values in their cultures affected their
interaction in the face-to-face and the online settings. The Indian participant stated,
For example, in a face-to-face course, we were talking about homosexuality and all that thing which we are
not really discussed much in India. So it was my culture, that never talking about these things really inhibits me
talking about this thing. They are kind of taboo in my country, so nobody really talk about gays and lesbians all like
that, it was all new to me. In United States, people like open to different things. That was kind of experience that I
feel uncomfortable to contribute.
Mexican participant said,
Sometimes you don’t agree, and because my culture is different, I don’t agree with what American things. I don’t
tell. I hold myself, I don’t want to talk. Or sometimes, if I know the people, I can express an opinion, but if only I
know whom I talking to.
Language Problems
Language was one of the main factors that affected participants’ interaction and participation in the face-to-
face settings. Indian participant stated, “Coming from different culture and all that, I had my own inhibition to talk
in front of many Americans [in face-to-face courses], I was so conscious about my language, make some mistakes,
so I am not very confident.”
Participants also found it difficult to express their ideas in the face-to-face settings because they could not
think about them in advance. They also stated that Americans are more active and dominant in the face-to-face
classrooms; therefore they did not feel confident in such situations. The Mexican participant said, “Sometimes in a face-to-face class, you cannot
speak. There is a whole session that maybe you don’t speak, because even if you have an idea, it is very hard to
express that idea.” However, in the online environment, because of the written and asynchronous communication,
they could participate more and felt included. The Chinese participant expressed,
I participate more in online course than face-to-face. Because I have more chance to express my own idea.
But face-to-face course, American students are more active, and they can think about their idea and express it faster
than me, so sometimes when I get ready to express my idea, the instructor has already finished the discussion. There
is no way for me to go back to the discussion. But in online setting, I can go back to discussion whenever alone, and
read my friends classmates posting again and again, and express my idea.
Moreover, Mexican participant stated, “Online courses, all the content is written, so it is easier. I can read
it, look at the dictionary, or engage more. If you are not English speaker, it is easier.” Indian participant said, “When
you speak, you don’t really have much time to think but when you write [in online learning] you really think since
you have time and reword it and all that. I think that really helps a lot.”
Moreover, the requirement to express ideas in online learning also made them more active in the online
settings. Because everybody had to state their opinions and post their ideas, they reported that no single person was
dominant. The Chinese participant said, “If we did face-to-face setting, sometimes the American students would be
dominant the whole group study. But in online environment everybody has to say their opinion, and everybody has
to post their ideas.”
Moreover, Mexican participant stated, “When you discuss, you have to post some numbers of message. In
discussion your names appear there. And sometimes, you need to answer some people, answer following the people,
pick one and answer, and you answer whoever you want.”
Language problems also made communication difficult and created misunderstandings among international
students in the face-to-face settings. Mexican participant said,
Sometimes I was not able to understand Korean girl, because my English was not very good, and neither
was her. So when we were talking, sometimes it was harder to understand each other. But when you write, it was
easier. So, there is no accent.
Language problems also affected their friendships and interaction. The Indian participant expressed that
making friends with Americans was difficult at first because of the terms and jargon that they use. This kept her
from interacting with others and made her feel left out of the community. In addition, the Chinese participant stated
that Americans, including both classmates and the instructor, make more jokes in the face-to-face settings than in
the online settings. This made her feel isolated and not part of the community. However, she did not feel this isolation in the online
settings. She said, “my classmates come from different culture; sometimes I cannot understand one hundred percent.
But, online, Americans try to make their opinion easier to understand. They understand, if they make jokes,
international students don’t understand. So they don’t make very jokes.”
Participants also stated that written communication in online learning is much easier than listening in the
face-to-face classes. It affected their involvement in the community and their quality of learning. Mexican
participant said, “In face-to-face courses, you have to listen, and that is harder. Online courses, content is written, so
it is easier. I can read, look at the dictionary, and engage more. If you are not English speaker, it is easier.” Indian
participant talked about her quality of learning, “[in online learning] it is not just to learn, we just have to apply
them, and try to respond other people. It is like much more quality than what we should do in face-to-face.”
Moreover, Mexican participant said, “I always like to read other’s posting, that gives you a different perspective,
another understanding. More people can think more meaningful in online courses. In online discussions, you read
others opinion; maybe you missed the main part of the article.” In addition, Chinese participant said, “I got chance
to read other peoples postings, I think that is a very good way for me to understand the subject, the reading the
posting of my peers.”
Time constraints
Limited time was another factor in the face-to-face settings that affected international students'
interaction and participation. Participants found their interaction limited in the face-to-face settings because of the
time constraints. The Indian participant said,
Sometimes we have 20 students in the class, and if every student responds to it, we might all have to defense it, so it
is not everybody gets chance to speak, people are willing to share information, but there are so many people who
want to share the information.
However, in the online settings, they felt more freedom than limitation to express their ideas. Moreover,
they participated more because of the asynchronous interaction and written communication. Therefore, they did not
feel their interaction limited in the online settings.
Participants stated that limited time in the face-to-face courses also affected the detail of their discussions
and their socialization. The Indian participant found it easy to communicate in detail, exchange information, and
share experiences of different cultures in the online courses, which made her feel connected to the community.
Moreover, she felt that dealing with course content in a limited time in the face-to-face courses hindered making
friendships. However, in the online settings, she felt freer to make friends and be involved in the community. She said,
Usually in our college, we have class in the night, so everybody is in a hurry to leave as soon as the class is over.
You don’t have time to socialize or spend sometime, or then just go around or anything. But here it is up to us, in
online courses, there is not something like a time constrain.
In contrast to the Indian participant, the Chinese participant expressed that they did not talk much about
culture in either the face-to-face or the online settings; therefore, she found socialization limited.
The Mexican and Indian participants found it hard to get to know people in the face-to-face settings, which
affected their socialization. However, in the online settings, they were able to understand individuals' points of
view better because of the written communication and descriptions of people. This built stronger social
relationships among them, made them feel safe, and did not hinder them from communicating with others. The Mexican
participant said, “I think it [social relationship] is easier in [online] some way because you get to know the people
because of that they are writing. Sometimes for example, if I need to answer someone, I know who I choose to
answer. Because I know more or less how they think.” Moreover, Indian participant stated “With the kind of and the
way of the responses that given on, you’ll get to know what kind of a personality that someone has.”
Peer behaviors
Support from other peers. All participants agreed that support from peers made them more confident and feel cared
for in the online settings. When they had difficulty understanding something or had problems, peer support
encouraged them and made them feel they belonged to the community. The Chinese participant said, “If you really want
to learn, you can have a lot of interaction with classmates and classmates can help you to understand the contents of
the course.” Moreover, Indian participant expressed,
Sometimes when I was little lost, I used to go online, and then suddenly I see somebody else also is looking
there. They say I am confused and I don’t know what I am doing so. We shared a lot of information in a chat or
something like that.
When explaining her best experience, the Mexican participant stated that having a leader in the project made her more
confident in online learning. As she did not have much experience, peer support made her feel more comfortable.
The Indian participant stated a similar opinion in terms of experienced peers helping novice peers. She stated, “In
online environment, I have seen that, when somebody is have taken 6 courses in online environment, is willing the
help the person who is new to it. Experts are always willing to help the novice people.”
Interest of Americans in their culture. When Americans showed interest in other cultures, the international
participants felt more comfortable in the online settings. Americans' interest made them feel closer to each other.
The Indian participant said, “I have team in almost all my online courses, students were American, they showed some
kind of interest to different culture, they were really open to learning new things and different perspectives. They
have really encouraged me in several ways.”
Feedback. Feedback from peers also affected their sense of community level in the online settings, and
participants were encouraged more when they got positive feedback. It made them feel more self-confident and
recognized by others. However, lack of feedback from others made them feel uncared for. The Chinese participant said,
In some courses, they always really care about other people. All of my classmates were very active. I got a lot of
feedback. It makes me feel like I have something, I can do better. But, in some courses, I don’t get feedback, and ok
I answered the questions, and I have finished the homework.
Instructor behaviors
Instructor behaviors also affected participants’ sense of community in online learning.
Feedback. The first instructor behavior was giving feedback. Participants stated that instructor feedback made them
feel cared for. However, lack of instructor feedback made them feel lost, confused, and overwhelmed. The Mexican participant
said, “I can communicate with instructor by mail, and I get immediate response, but in some courses, no body pays
attention.” Moreover, Indian participant said, “in one course, there was no feedback from the professor, we were
lost, don’t know where we were leading, it was really confusing. That was my worst experience; I understand how
much professor feedback would make change in your course.” The Chinese and Mexican participants also stated the
importance of instructor support. The Mexican participant said, “I think we have a lot of support from the program, as an
international, I feel a lot of support even from the teacher….I have a lot of support from the teacher, and I feel
confident.” Moreover, Chinese participant told,
The instructors understand the culture, diversity pretty well. I think they are very careful to contact to foreign
students. They are more patient and they wanted to be easier to be understood. I think, teacher they will use easier
language to communicate with me to compare American students.
Contextual characteristics. Participants stated that there were differences among the online courses in which they participated. It was therefore evident that the sense of community in online learning depends highly on the structure of the course, the type of activities, the course content, and the course requirements.
Structure of the course in online courses. The structure of the course and the type of activities the instructor provided strongly affected the sense of community in the online settings. If participants felt free to bring things to the discussion beyond the textbook, they engaged more in social interaction and talked about their culture, and their sense of community was high.
Focusing only on the instructor’s expectations and on task-oriented interaction made friendship harder to build in the online courses. If they could not bring cultural and social topics into their discussions, their sense of community was low. The Chinese participant gave an example of a course that focused only on tasks and said,
You have to be online for to spend sometime during the course. So everybody just talk about the books we have
already read and the question we need to think. But you did not know personal questions. If you only talk about
academic things, it is very hard to become very close friends.
The format of the discussion also affected participants’ sense of community in the online settings. Participants stated that unguided discussions and poor messages affected their learning and their encouragement in the community. The Indian participant said, “Sometimes, in some courses there are some participants like who just do the job just sake of doing. They just post the posting at the end of the discussion. They don’t really participate in the real sense.” Moreover, the Mexican participant stated, “In online discussion, if I find same message in five places that really upset me. Discussions are important tools in distance education, they have to be guided. Someone should make comment, teacher or someone.”
Type of activities in online courses. When the activities in the online courses were more dynamic, participants felt more included. The Chinese participant found some online courses more interactive than face-to-face courses because of the types of activities. However, a lack of interactive activities in the online settings made them feel isolated and less confident. The Mexican participant said, “In the face-to-face, you go to class, and you write a paper. That is an individual work. But in online, [when you work individually], you are isolated also from the others.”
Summary and Discussion
The purpose of the study was to examine international students’ sense of community in multicultural online learning environments. The results revealed that learning perspectives, learning styles, telling styles, individual or group working preferences, commitments, goals, and values were the major cultural differences among participants. The results clearly showed that cultural differences affected participants’ sense of community in both multicultural face-to-face and online learning environments. This result corresponds to patterns reported in the literature. As stated by Parker (1999), international students have difficulties because of their individualistic versus collectivist orientations, lack of critical thinking skills, and social relationships. Moreover, the result is consistent with the findings of Gunawardena et al. (2001), who found that the forms of language used, power distance in communication between teachers and students, and collectivist and individualistic tendencies are the main factors that affect online cross-cultural group process.
The findings of the study also showed that international students faced more cultural challenges in the face-to-face than in the online settings. The results showed that international students’ language problems affected their interaction, participation, expression of ideas, communication, friendships, and quality of learning in the face-to-face settings. This result is consistent with the findings of McKnight (1994) and Parker (1999) that language affects international students’ interactions; they struggle to listen through extended classes and have difficulty asking questions and expressing their opinions.
However, it was evident that they were able to solve their problems more easily and develop a better sense of community in the online settings through the attributes of CMC, including flexibility of time and written communication. The results revealed that international students interacted, participated, and exchanged information more in the online settings because of the time flexibility and written communication. As Gunawardena et al. (2001) and Warschauer (1996) found, participation between international students and Americans is more equal in the online settings than in the face-to-face settings. Therefore, international students feel more connected to the community in the online settings, and friendship and bonding developed among them. This result is also supported by the research of Salmon (2000) and Soller (2001), who found that active participation, sharing ideas, and promotive interaction are important characteristics of effective online collaboration and community.
Because of the limited time in the face-to-face settings, international students found it difficult to interact with others and participate in the community. In the online settings, however, they felt more freedom than limitation to express their ideas and participated more, which made them feel included in the community. Participants also had difficulty finding time for socialization in the face-to-face settings. In the online settings, by contrast, they were able to understand individuals’ points of view better because of the written communication and people’s self-descriptions, which helped them socialize and develop friendships with others. This result is consistent with the findings of Ziegahn (2001), who found that asynchronous communication gives students and instructors access to written intellectual and emotional connections. As McMillan (1996) proposed, emotional bonding depends on high-quality interaction and the sharedness of events.
Participants also faced problems trusting others in the face-to-face settings because of the unplanned, less structured, oral-communication character of those settings. They were more comfortable trusting others in the online settings, which were more systematic and involved written promises.
There were some contradictions in participants’ opinions about the online courses. It is evident that those differences resulted from other factors such as peer behaviors, instructor behaviors, and contextual characteristics. Support and feedback from peers and Americans’ interest in their culture made participants more comfortable in the online settings; they developed more friendships and trusted others more. As stated in the literature, giving and getting feedback, providing technical and task-related support, and sharing personal anecdotes help students develop online social presence and successful collaborative group process (Stacey, 1999; Vrasidas & McIsaac, 1999). Instructor feedback and support also affected participants’ sense of community in online learning and made them feel cared for. This result is consistent with the literature in that support from the instructor and appropriate strategies to assist international students are important (Ku & Lohr, 2003; Wegerif, 1998). Finally, it was found that the sense of community in online learning depends highly on the structure of the course, type of activities, course content, and course requirements. This result is consistent with the findings of Vrasidas and McIsaac (1999) and Wegerif (1998), who found that the structure and design of a course influence interaction and the sense of community in online courses.
Limitations
First of all, generalizations cannot be made based on the experiences of three people from three cultures. Their responses might also have been affected by their demographic characteristics.
Moreover, as the participants were international students, the researchers sometimes had difficulty understanding their accents on tape during transcription. Therefore, to ensure validity and reliability during data analysis, the researchers conducted member checks when transcribing the data.
As Peshkin (1988) noted, subjectivity is an inevitable component of research; it is the unique contribution that makes the researcher distinctive in combining personal qualities and data. Since the researchers were also international students, they were aware that they might bring their own biases to the research; they might value the behaviors and ideas of people similar to their culture and ignore the others. The researchers kept these issues in mind and systematically sought out their subjectivity throughout the whole research process.
Figure 1: Findings (diagram labels: cultural differences, cultural challenges, flexibility of time, written communication, low sense of community in the online settings)
References
Benbunan-Fich, R., & Hiltz, S. R. (1999, March). Educational applications of CMCS: Solving case studies
through asynchronous learning networks. Journal of Computer Mediated Communication, 4(3).
Retrieved April 10, 2004, from http://www.ascusc.org/jcmc/vol4/issue3/benbunan-fich.html
Crisp, C. B., & Jarvenpaa, S. L. (2000). Trust over time in global virtual teams. Paper presented at the
Academy of Management Meeting, Toronto.
Culnan, M. J.,& Markus,M. L. (1987). Information technologies. In F. Jablin, L. L. Putnam, K. Roberts, &
L. Porter (Eds.), Handbook of organizational communication (pp. 420-443). Newbury Park, CA:
Sage.
Daft, R. L. & R. H. Lengel (1984). Information richness: a new approach to managerial behavior and
organizational design. Research in Organizational Behavior, 6, 191-233.
Dewey, J. (1916). Democracy and Education. The Macmillan Company.
Gunawardena, C. N., Nolla, A. C., Wilson, P.L., López-Islas, J. R., Ramírez-Angel, N., Megchun-Alpízar,
R. M. (2001). A cross-cultural study of group process and development in online conferences, Distance
Education, 22(1), 85-121.
Harasim, L., (1990). Online education: An environment for collaboration and intellectual amplification. In
L. Harasim, ed., Online Education: Perspectives on a New Environment (39- 66). New York:
Praeger Publishers.
Harasim, L., Hiltz, S.R., Teles, L., & Turoff, M. (1995). Learning networks: A field guide to teaching and
learning online. Cambridge, MA: MIT Press.
Hofstede, G. (1997). Cultures and organizations: Software of the mind. New York: McGraw-Hill.
Ku, H. & Lohr, L. L. (2003). A Case Study of Chinese Students’ Attitudes Toward Their First Online
Learning Experience. Educational Technology Research and Development, 51(3), 95-102.
McDonald, J. (1998). Interpersonal aspects of group dynamics and development in computer conferencing.
Unpublished Dissertation, University of Wisconsin - Madison, Madison, Wisconsin.
McIsaac, M. S., & Gunawardena, C. N. (1996). Distance education. In D. H. Jonassen (Ed.), Handbook of
research for educational communications and technology: A project of the Association for
Educational Communications and Technology (pp. 403-437). New York: Simon & Schuster
Macmillan.
McKnight, A. (1994). The business of listening at university. Paper presented at the Annual Meeting of the
Teachers of English to Speakers of Other Languages, Baltimore, MD. (ERIC Document
Reproduction Services No: ED374663).
McMillan, D.W. (1996). Sense of community. Journal of Community Psychology, 24(4), 315-325.
McMillan, D.W. & Chavis, D.M. (1986). Sense of community: A definition and theory. American Journal
of Community Psychology, 14(1), 6-23.
Merriam, S. B. (1998). Qualitative research and case study applications in education. San Francisco:
Jossey-Bass.
Parker, R. (1999). Teaching, learning, and working with international students: A case study. Paper
presented at the Annual Meeting of the Mid-South Educational Research Association. (ERIC
Document Reproduction Services No: ED438756).
Peshkin, A. (1988) In search of subjectivity–one’s own. Educational Researcher, 17(7), 17-21.
Robinson, J. (1992). International students and American university culture: Adjustment issue. (ERIC
Document Reproduction Services No: ED350968)
Rovai, A. P. (2002). A preliminary look at the structural differences of higher education classroom
communities in traditional and ALN courses. Journal of Asynchronous Learning Networks, 6(1),
41-56.
Salmon, G. (2000) E-moderating: the key to teaching and learning online. London: Kogan Page.
Short, J., E. Williams, & B. Christie (1976). The social psychology of telecommunications. New York:
Wiley.
Soller, A.L. (2001) Supporting Social Interaction in an Intelligent Collaborative Learning System.
International Journal of Artificial Intelligence in Education, 12(1), 40-62.
Sproull, L. and S. Kiesler (1986). Reducing social context cues: Electronic mail in organizational
communication. Management Science, 32(11), 1492-1513.
Stacey, E. (1999). Collaborative learning in an online environment. Journal of Distance Education, 14(2),
14-33.
Vrasidas, C. & McIsaac, M.S. (1999). Factors influencing interaction in an online course. The American
Journal of Distance Education. 13 (3) 22-36.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge,
MA: Harvard University Press.
Walther, J. B. (1992). Interpersonal effects in computer-mediated interaction: A relational perspective.
Communication Research, 19 (1), 52-90.
Walther, J. B. (1993). Impression development in computer-mediated interaction. Western Journal of
Communication, 57, 381-398.
Warschauer, M. (1996). Comparing face-to-face and electronic discussion in the second language
classroom. CALICO Journal, 13(2), 7-26.
Wegerif, R. (1998). The social dimension of asynchronous learning networks. Journal of Asynchronous
Learning Networks, 2(1), 34-49.
Wilson, M. S. (2001). Cultural considerations in online instruction and learning. Distance Education, 22(1),
52-64.
Ziegahn, L. (2001). 'Talk' about culture online: The potential for transformation. Distance Education, 22(1),
144-150.
Facilitators' Perception of Interactions in an Online Learning Program
Hasan Çalışkan
Anadolu University
Eskişehir, Türkiye 26470
Schools and colleges all around the world have started making use of advanced technology to provide learners with effective, efficient, and adequate instruction. The use of the Internet and the Web for learning and teaching has led to many online courses being offered whenever teaching-learning activities are required for both students and faculty. Online education over the Internet has grown rapidly and substantially. This has created a new paradigm for teaching and learning that is different from the traditional classroom experience and also different from earlier technology-based attempts (Kearsley, 1998).
One of the most important online course components has proven to be interaction, especially learner-to-learner interaction. Alexander lists the top ten components of an optimal online environment and gives peer interaction first place. Kearsley (1998) also states that discussions among learners are among the most important components. This is not surprising, because one of the most important factors in learning appears to be interaction among learners and interaction between instructor and learners. However learning takes place, interaction has always been of great importance for effective learning to occur. Especially when instruction is delivered to learners at a distance, this interaction component is of vital importance. Lacking social interaction, learners may feel alone and helpless at times when they need to get help from someone, especially from peers taking the same course, as they would in a traditional classroom. Studies suggest that facilitators’ active interactions with students have significant effects on the quality of online distance learning (Thomas, Carswell, Price & Petre, 1998).
Background
The research literature on interaction, or interactivity, has strongly highlighted the importance of interaction between learners and instructors. According to Moore (1989), an effective online class should have three types of interaction: learner-content, learner-instructor, and learner-learner. Each type of interaction plays a role in the entire educational process. Another interaction type that some researchers identify is interface interaction. Interface refers to how the learner uses the computer interface to access and participate in instruction and to communicate with instructors and other learners. Effective learner-interface interaction allows the learner to focus on learning and communication rather than on how to access instructional content and communicate with others (Lohr, 2000).
One of the vital elements that enables interaction in any learning system is the support that programs offer to both students and instructors. Student support has been defined in a variety of ways in the distance education literature. Simpson (2002) defines support as “all activities beyond the production and delivery of course materials that assist in the process of students in their studies.” According to Carnwell and Harrington (2001), support can be defined by its components: activities that enable students to progress satisfactorily; strategies such as cognitive, affective, meta-cognitive, and motivational strategies; and skills such as informing, advising, counseling, assessing, enabling, and feeding back. Interaction is most important especially when learners and instructors/facilitators are separated by time and space.
In any distance learning setting, including online learning, extra consideration is given to interaction so that learning will be as efficient and effective as face-to-face learning. Some argue that, with the help of instructional technology, interaction in an online learning environment can even be better than in a traditional setting. This, of course, depends on whether programs have well-designed interaction mechanisms in their activities.
The Information Management (IM) Associate Degree Program of Anadolu University is the first completely online undergraduate-level program in Turkey. It started in October 2001 and graduated its first students in June 2003. The program aims to help students gain the skills needed to use required business software effectively and efficiently, acquire the concepts and experience of information management in business, gain collaborative working experience and institutional communication in the Internet environment, and obtain the experience necessary for enterprise and management in the Internet environment.
The program offers various types of interaction via different technologies. Synchronous and asynchronous online tools such as listservs, email, and chat enable students to interact with each other, with instructors, and with facilitators. This support is provided and directed by the “Academic Advisors,” or facilitators. There are 55 facilitators whose primary role is to provide instructional support to the students. Each facilitator is considered an expert in the content of one course, and each course has 5-10 facilitators. These facilitators mainly provide guidance to students working on their assignments, answer their questions about the assignments and topics, assess the assignments and inform the students and course coordinators of the results, try to solve students’ organizational and/or technical problems, direct students to the related support service and inform the service representatives about the students’ problems, and have social interaction with the students.
For better interaction to take place in the program, it is important to know how facilitators and learners think about the interaction. Learners’ emails and requests to the program have shown that students are satisfied to some degree with the support provided. It is equally important to know how facilitators perceive the supported interaction facilities so that they can be made more efficient and effective. Studies of such efforts have shown that facilitators’ active interactions with students have significant effects on the quality of online distance learning (Thomas, Carswell, Price & Petre, 1998).
This paper reports the results of a study that examined facilitators’ perceptions of interaction (learners’ interaction with other learners, facilitators, content, and interface, as well as facilitators’ interaction with learners, other facilitators, content, and interface) in the online Information Management Associate Degree Program at Anadolu University.
Methodology
A survey was selected as the data collection method to seek input from the facilitators. The survey instrument included 24 items related to learner, facilitator, content, and interface interaction. Three items for each interaction type examined the facilitators’ levels of agreement about the interaction taking place. The items are listed in Table 1; Table 2 shows the design of the study.
Table 1: Items used to assess the facilitators’ satisfaction levels for the interaction
Items
1 Learners’ communication with us (facilitators) was satisfactory.
(Learner to Facilitator)
2 Learners did not hesitate to ask questions to us.
(Learner to Facilitator)
3 Learners communicated to us almost on any subject.
(Learner to Facilitator)
4 As a facilitator I did not hesitate communicating to the learners.
(Facilitator to Learner)
5 As a facilitator, I encouraged learners to ask questions.
(Facilitator to Learner)
6 As a facilitator my communication to learners was satisfactory.
(Facilitator to Learner)
7 Interaction among learners was satisfactory.
(Learner to Learner)
8 Learners did not hesitate to interact with each other on problems they encountered.
(Learner to Learner)
9 Learners interacted to each other on many other subjects other than content and
assignments. (Learner to Learner)
Table 2: Design of the Study
Interaction Type
Groups Facilitators Learners Content Interface
Learners Items 1,2,3 Items 7,8,9 Items 13,14,15 Items 16, 17, 18
Facilitators Items 10, 11, 12 Items 4, 5, 6 Items 19, 20, 21 Items 22, 23, 24
The survey instrument was designed as a 5-point Likert scale ranging from 1 = strongly disagree to 5 = strongly agree. A mean score of 3.41 was identified as the expected level of satisfaction with an item, while other responses allowed the facilitators to show higher or lower levels of satisfaction. The 3.41 threshold was determined by computing the critical interval width of the scale: 4 intervals / 5 categories = 0.8.
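As a minimal illustration of that arithmetic (a sketch, not taken from the paper; variable names are ours), dividing the 1-5 scale into five equal-width bands of 0.8 places 3.41 at the lower edge of the fourth ("agree") band:

```python
# Sketch (assumed, not the authors' code): five equal-width bands on a 5-point scale.
LOW, HIGH, CATEGORIES = 1, 5, 5
width = (HIGH - LOW) / CATEGORIES        # 4 intervals / 5 categories = 0.8
bands = [(round(LOW + i * width, 2), round(LOW + (i + 1) * width, 2))
         for i in range(CATEGORIES)]
print(bands)  # [(1.0, 1.8), (1.8, 2.6), (2.6, 3.4), (3.4, 4.2), (4.2, 5.0)]
# A mean above 3.40 (i.e., 3.41 or higher) falls in the fourth ("agree") band,
# which the authors treat as the expected level of satisfaction.
```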
All 55 facilitators took part in the study. Almost all of them were graduate assistants at various colleges of Anadolu University. The largest group of facilitators (45.5%) were majoring in science fields such as computer engineering, physics, and mathematics; the others were in the social sciences. Only 8 (14.5%) of them were in the education field, and 1 facilitator was in the medical sciences. Of the facilitators, 11 (20%) were female (Table 3), and most of them (49%) were between 25 and 29 years old. In addition, the majority of the participating facilitators (78.2%) reported good or professional levels of computer experience, while 12 (21.8%) indicated intermediate-level experience (Table 4). Moreover, only 13 (23.6%) of the participants indicated that they had teaching experience prior to the program, while the majority (58.2%) had prior experience assisting someone else, either short term or for a whole semester (Table 5).
The study was conducted at the end of the spring 2003 semester (in June 2003). After distributing the paper-and-pencil version of the instrument to the facilitators, the researchers gave them one week to return it. All facilitators except three responded to the survey within the allotted time; those three were given extra time, and their data were collected later.
Table 3: Gender
Gender                              f      %
Male                                44     80.0
Female                              11     20.0
TOTAL                               55     100

Table 4: Computer Experience
Computer experience                 f      %
Intermediate                        12     21.8
Good                                25     45.5
Professional                        18     32.7
TOTAL                               55     100

Table 5: Teaching Experience
Teaching experience                 f      %
No experience                       6      10.9
Short term assistance to a course   9      16.4
Experienced                         13     23.6
TOTAL                               55     100
Mean scores, standard deviations, t-tests, and ANOVA were used to interpret the data gathered via the survey instrument. Cronbach’s alpha for the overall instrument was 0.8769.
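As a hedged illustration of that reliability estimate (not the authors' code; the function name and data layout are ours), Cronbach's alpha for the 24-item instrument could be computed from a respondents-by-items score matrix roughly as follows:

```python
# Sketch (assumed, not the authors' code): Cronbach's alpha for a Likert instrument.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: array of shape (n_respondents, n_items), e.g. (55, 24) Likert ratings."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_variance = scores.sum(axis=1).var(ddof=1)     # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# e.g. alpha = cronbach_alpha(responses)   # the paper reports 0.8769 overall
```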
Table 6: Mean satisfaction scores by interaction type
Interaction                 N     M      SD
Learner to Facilitator 55 3.23 1.06
Facilitator to Learner 55 4.04 .69
Learner to Learner 55 3.75 .71
Facilitator to Facilitator 55 3.46 .85
Learner to Content 55 3.76 .68
Learner to Interface 55 3.88 .58
Facilitator to Content 55 3.94 .59
Facilitator to Interface 55 4.27 .55
Do the facilitators’ characteristics (gender, computer experience, and teaching experience) have any effect on their satisfaction?
The second question of the study examined whether differences occur in the facilitators’ overall satisfaction scores for any of the interaction types due to characteristics such as gender, computer experience, and teaching experience. An independent-samples t-test was conducted to see whether gender makes any difference in the facilitators’ satisfaction. The results of the analysis are summarized in Table 7.
According to the results, the female facilitators scored higher than their male counterparts overall. However, the difference was significant only for the “learner to content” (t=2.195, df=53, p=.03) and “facilitator to content” (t=2.406, df=53, p=.02) interaction types. The female facilitators (M=4.15) found the “learner to content” interaction more satisfactory than the males did (M=3.67). They also found the “facilitator to content” interaction more satisfactory (M=4.30) than the male facilitators did (M=3.85). For the other interaction types, the differences between females and males were not significant.
Table 7: Satisfaction by gender (independent-samples t-test)
Support                  Gender    N     M      SD     df    Sig. (2-Tailed)
Learner to Facilitator   Female    11    3.70   1.07   53    .104
                         Male      44    3.11   1.04
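A minimal sketch of the gender comparison (an assumed setup, not the authors' code; the function name and arguments are hypothetical), using SciPy's independent-samples t-test:

```python
# Sketch (assumed, not the authors' code): independent-samples t-test by gender.
from scipy import stats

def compare_by_gender(female_scores, male_scores):
    """Each argument: the facilitators' mean scores for one interaction type."""
    t, p = stats.ttest_ind(female_scores, male_scores)  # df = n1 + n2 - 2 = 53 here
    return t, p

# For "learner to content" the paper reports t = 2.195, df = 53, p = .03.
```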
Is there any difference between how facilitators perceive their own interaction with learners, other facilitators, content, and interface and how they perceive learners’ interaction with other learners, content, facilitators, and interface?
The findings were also analyzed to see whether facilitators perceived their own interactions and the learners’ interactions differently for any of the types. Content, interface, and in-group interaction variables were analyzed to see whether facilitators perceived the interaction in these differently. For this, paired-samples t-tests were conducted; Table 8 shows the results. According to the results, there is a significant difference between facilitators’ perception of their own interaction with the learners and of the learners’ interaction among themselves (p=.005): facilitators believe that they interact with learners (M=4.04) better than learners interact with their peers (M=3.75). There is also a significant difference between facilitators’ and learners’ perceived interaction with the content (p=.007): facilitators’ interaction with the content is seen as more satisfactory (M=3.94) than the learners’ (M=3.76). Another significant difference concerns interaction with the interface (p=.001): facilitators are seen as interacting better with the interface (M=4.27) than the learners do (M=3.88). The last paired t-test was conducted to see whether facilitators perceived their own in-group interaction as more satisfactory than the learners’ interaction with other learners. The difference was significant (p=.040), but contrary to the other types, facilitators perceived their own interactions as less satisfactory (M=3.46) than those of the learners (M=3.75). Although this finding may seem surprising, it should be noted that the facilitators need less help and guidance among themselves and thus communicate less compared to the learners.
Table 8: Paired-samples t-test results
Pairs                    N     M      SD    df    Sig. (2-Tailed)
Learner to Learner       55    3.75   .71   54    .005*
Facilitator to Learner         4.04   .69
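Similarly, each pair comparison could be run as a paired-samples t-test over the same 55 facilitators; a hedged sketch (not the authors' code, names are ours):

```python
# Sketch (assumed, not the authors' code): paired-samples t-test over the facilitators.
from scipy import stats

def compare_pair(ratings_a, ratings_b):
    """ratings_a, ratings_b: the same facilitators' mean scores for two interaction
    types, e.g. facilitator-to-learner vs. learner-to-learner."""
    t, p = stats.ttest_rel(ratings_a, ratings_b)  # df = n - 1 = 54 here
    return t, p

# The paper reports p = .005 for facilitator-to-learner (M = 4.04)
# versus learner-to-learner (M = 3.75).
```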
Conclusions
This descriptive study reveals that the facilitators in the Information Management (IM) Program of Anadolu University are, overall, satisfied with the interaction taking place in the program. However, the findings show that facilitators do not seem to be satisfied with the learner-facilitator interaction. While they scored higher on the “learner to learner interaction” items and believed that learners interacted well with each other, they did not believe that the same applied to learners’ interaction with the facilitators themselves. This implies that the learners might be getting better support from their peers, and it might be inferred that the learner-facilitator interaction mechanisms need some revision. According to the findings, there are significant differences on some points between female and male facilitators: females perceived the interaction with content as more satisfactory than males did. This might be caused by females’ preference for content-related rather than technology-related issues. When interaction types were paired, facilitators generally felt that they did better than learners in terms of communication. Computer and teaching experience did not make any significant difference in facilitators’ perception of any interaction type.
The results of this study would be more interesting when compared with students’ perceptions of interaction on the same or similar issues. Further research on the qualitative side of the issue may reveal deeper perspectives on the interaction. Such research studies will help provide better interaction and support facilities for the IM Program of Anadolu University.
References
Alexander, C. Components of an optimal online environment. On-line. [Available at]:
http://newmedia.colorado.edu/cscl/286.pdf
Carnwell, R. & Harrington, C. (2001). Diagnosing student support needs for distance learning. Paper
presented at the annual meeting of the Association for Institutional Research, June 3-6, 2001,
Long Beach, CA.
Kearsley, G. (1998). “Online education: new paradigms for learning and teaching.” Technology Source,
August.
Lohr, L. L. (2000). Designing the instructional interface. Computers in Human Behavior 16(2), 161-182.
Moore, M. G. (1989). Three types of interaction. In M. G. Moore & G. C. Clark (Eds.), Readings in
principles of distance education (pp. 100-105). University Park, PA: Pennsylvania State
University.
Simpson, O. (2002). Supporting students in online, open and distance learning (2nd ed.). London: Kogan
Page.
Thomas, P., Carswell, L., Price, B.A. & Petre, M. (1998). A holistic approach to supporting distance
learning using the Internet: transformation, not translation. The British Journal of Educational
Technology, 29(2), 149-161.
Faculty Perspectives at Anadolu University on the Use of Internet for
Educational Purposes
Hasan Çalışkan
Nebiye Ozaydemir
Anadolu University
Eskişehir, Türkiye
Online learning has gained much importance in parallel with the development of the Internet and Web technologies and their use in education. In an online learning environment, teaching-learning activities and services are delivered to learners with the support of computer networks. Research shows that, when designed carefully and systematically, online learning environments support effective, efficient, and appealing learning.
Instructor and learner roles in online learning differ from those in traditional learning environments. Learners who were passive receivers in traditional learning are now active and take responsibility for their own learning as they construct meaning from real-life examples in online environments. This requires instructors as well as learners to act differently and take on new roles. Instructors are now facilitators who direct discussions and coach and advise learners, rather than being only content deliverers in classrooms. These new roles confront instructors as challenges they must adapt to systematically, and taking on such roles requires them to adopt new competencies and design considerations. Adapting to these new roles and competencies is important for instructors so that effective online learning can take place.
Background
In an online learning environment, the roles of both instructors and learners have changed compared to traditional educational settings. Learners, instead of being passive receivers, are much more active and take responsibility for their own learning, showing participatory behaviors in the learning setting (Estep, 2003). This places an important responsibility on instructors (Çalışkan, 2001). Instead of being the only source of information in the classroom, instructors now direct debates and discussions, provide resources to the learners, and give guidance as needed. Such roles require instructors to open up discussion subjects, promote participation, and direct discussions and activities so that they stay on task.
Online learning has often been analyzed and researched. While some researchers focus on learners in online environments, others focus on instructors. Those who study instructors report a wide range of findings. Some take a general point of view, while others reflect on specific characteristics such as instructors’ efforts in online learning, their preferences, collaboration, and interaction. Additionally, instructors’ reasons for Internet use and their current competencies in the use of specific platforms in online learning have recently been researched at Anadolu University, Turkey. Besides the general perspectives on Internet use, its educational use, people’s perspectives, and instructors’ and learners’ readiness for online courses deserve careful consideration.
This study examines faculty perspectives at Anadolu University on the instructional use of the Internet. Courses have recently begun to be offered online on campus, and naturally both instructors and learners have encountered new situations in teaching and learning. Leaving old traditions and methods aside, or trying to change them, is not easy, especially when people have grown accustomed to doing things in certain ways. Problems in the online courses offered, instructor roles, the design of teaching-learning activities, and the extension of both undergraduate and graduate online courses are the different dimensions of this study. Examining faculty perspectives will shed light on current and future problems and their solutions.
The study addressed the following research questions:
1. How do the faculty evaluate their computer proficiency and their Internet access facilities?
2. For what reasons do the faculty use computers and the Internet?
3. What experiences do the faculty have in using the Internet for educational purposes?
4. What are the faculty’s preferences among types of education, including online learning?
Methodology
The design of the study was based on the general survey method. The data were gathered through literature review and a questionnaire. The questionnaire, consisting of background information about the faculty and 17 five-point scale questions, was prepared to examine faculty perspectives on the use of the Internet for educational purposes. It was administered to 190 faculty members at 8 different schools. Descriptive statistics, analysis of variance, and t-tests were used to analyze the data.
The schools that participated in the study and the academic status of the participants from these schools are given in Table 1 and Table 2, respectively.
Table 2: Academic status of participants
Academic status    f     %
Professor          23    12.1
Lecturer           35    18.4
Results and Discussions
How do the faculty evaluate their computer proficiency and their Internet access facilities?
The first research question was about how the participants evaluate themselves on computer proficiency and Internet use. Table 3 summarizes the faculty’s evaluations of their own computer use. According to the results, 50% of the faculty evaluated themselves as advanced users. The findings show that, among the faculty, assistant professors are the most capable of using computers effectively. The majority of the faculty own a computer and have Internet access at home (60%); the proportion who have computers at home is 80%.
Table 3: Computer Proficiency
Table 5: The Reasons Faculty Use Computers and the Internet (frequencies and percentages across the five response categories; N = 190)
Communication    f:  -     3     37    72    78
                 %:  -     1.6   19.5  37.9  41.1
Instruction      f:  7     10    41    81    51
                 %:  3.7   5.3   21.6  42.6  26.8
Research         f:  1     2     25    81    81
                 %:  0.5   1.1   13.2  42.6  42.6
Entertainment    f:  55    72    45    12    5
                 %:  28.9  37.9  23.7  6.3   2.6
Faculty experience with educational uses of the Internet (f, %)
-                                            48     25.3
Supporting others' online classes
  Did in the past                            12     6.3
  Still doing                                20     10.5
  Did in the past and still doing            24     12.6
  I will do it in the future                 81     42.6
-                                            90     47.4
I did some classes completely online
  Did in the past                            10     6.3
  Still doing                                7      3.7
  Did in the past and still doing            14     7.4
  I will do it in the future                 76     40
-                                            102    53.7
Producing content for the use of the Internet for educational purposes
  Did in the past                            13     6.8
  Still doing                                17     8.9
  Did in the past and still doing            21     11.1
  I will do it in the future                 89     46.8
-                                            94     49.5
Preferences for types of education (f, %)
Types of Education    First place    Second place    Third place    Total
                      f      %       f      %        f      %       f      %
Face to face 54 28.4 41 21.6 27 14.2 122 64.2
Internet 28 14.7 27 14.2 47 24.7 102 53.7
Both 94 49.5 51 26.8 18 9.5 163 85.8
Support desk 8 4.2 46 24.2 39 20.5 93 48.9
Textbook/Cd 4 2.1 19 10 52 27.4 75 39.5
Conclusions
This descriptive study reveals that faculty use computers and the Internet and perceive themselves as advanced users, and that the majority of them have access to the Internet. On the other hand, they have low expectations of online courses, and even those who have experience with the educational use of the Internet hesitate to engage in such activities in the future. This must be taken into consideration before implementing such courses. Faculty should also be supported with the necessary infrastructure and funding. Another point is that faculty’s teaching and learning habits should be reviewed, and they should be given teamwork-related activities in their everyday teaching and learning, since online teaching requires people to work collaboratively with others. Faculty can be included in planning and design processes so that they become accustomed to the environment and its requirements. Further research should also be done on learners’ perspectives on online learning so that better teaching-learning activities can be planned and conducted.
References
Allen, I.E. & Seaman, J. (2003). Sizing the opportunity: The quality and extent of online education in the
USA, 2002 and 2003. Needham: Sloan-C.
Çalışkan, H. (2001). Online learning and cooperative learning teams. Kurgu: Anadolu University, School
of Communication Periodical, 18.
Estep, M. (2003). Preparing prospective students for success in online learning environments. Journal of
Instruction Delivery Systems, 17 (4).
Küçük, M. (2002). Faculty use of Internet for research purposes. Unpublished Master Thesis. Social
Sciences Institute, Anadolu University.
Ulukan, C., & Çekerol, K. (2003). New instructional media: Faculty attitudes toward Internet based
instruction at Anadolu University. Paper presented at the 3rd International Educational Technology
Symposium, Eastern Mediterranean University, Cyprus.
Williams, P. (2003). Role and competencies for distance education programs in higher education programs.
The American Journal of Distance Education, 17(1).
Supporting the Introduction of New Testament Interpretation Methods
in a Survey Course with a Constructivist Learning Environment - An
Exploratory Study
Dan Cernusca
David Jonassen
Gelu Ioan Ionas
University of Missouri Columbia
Steve Friesen
University of Texas at Austin
Abstract
The purpose of the study was to explore how the implementation of a hypertext learning environment affected students’ ability to extract the main points of gospel texts. A quasi-experimental two-way factorial ANOVA with repeated measures was used to analyze the impact of prior knowledge and major on students’ performance over time. The results indicate a significant negative impact of the instructional treatment over time for students with high entry-level scores and a ceiling effect of the treatment for students in the social-sciences group.
Introduction
Religions are extremely complex phenomena, and consequently the study of religion deals with different individual, group, and cultural belief systems. Traditionally, New Testament courses expose students to one method of interpretation, historical criticism, for its focus on “objectivity in discovering and reporting what really happened in the past” (Miller, 1993, p.12). Contemporary research has supplemented this methodological approach with insights offered by methods coming, among others, from the field of literature (e.g., Gunn, 1993) and the feminist theory of justice (e.g., Schussler Fiorenza, 2001). This development created challenging critical debate topics for biblical scholars and an even more challenging task of exposing undergraduate students to a significant sample of the critical debate in the field of New Testament studies.
Technological advances during the early 1990s generated several streams of research focusing on the evident gap between the structured, explicit learning that takes place in the classroom and learning as part of doing real-world activities, and on how educational technology can address this gap. Brown, Collins, and Duguid (1989) analyzed the main characteristics of activity-driven learning and knowing and proposed a socio-cultural theoretical framework known as situated cognition or situated learning. While engaged in the context of the activity, students learn to use culture-specific tools that mediate their immersion in the culture of that community. Following the situated learning approach, a hypertext-based learning environment called Cinema Hermeneutica was developed and implemented in a large introductory New Testament survey class. Four jobs associated with four biblical criticism methods were modeled in the environment. Each student had to choose one method/job and engage in a series of tasks that started with applications of specific criticism steps to current-day materials (movies, music, newspapers) and ended with application of the method to gospel texts.
Rationale for the Study
The field of religious studies is continuously searching for methods and tools to address the complexity of religion education. Cognitive science is one of the fields that has lately informed religious education (Brelsford, 2005), through theoretical support such as a social developmental perspective (Riley Estep, 2002), cognitive complexity (Box Price, 2004), and the post-modern epistemic paradigm (Martin & Martinez de Pison, 2005). This research takes an additional step in this endeavor: the analysis of the impact of online constructivist learning environments on students’ learning of biblical criticism methods.
Participants
The research team worked with a large, heterogeneous student body of about 150 students, most of them freshmen and sophomores, coming from diverse backgrounds: humanities, art, engineering, business, natural, health, and social sciences. This is a typical population for a survey class with a topic of general interest that targets a wide spectrum of students. We used a convenience sample resulting from voluntary participation in the study. The incentive to participate was the possibility of obtaining extra points for completing two short essays. A total of 105 students who answered both essays were retained for this study.
Out of the 105 cases, one case identified during data screening as a univariate outlier was dropped from the analysis. After dropping the univariate outlier, the analysis of multivariate outliers using Mahalanobis distance (Tabachnick & Fidell, 2001, p. 93) indicated one multivariate outlier (maximum Mahalanobis distance = 56.27 > 32.91 = critical chi-square value), leaving 103 cases for the final analysis.
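As a hedged illustration of that screening step (not the authors' code; the function name, alpha level, and data layout are assumptions), multivariate outliers can be flagged by comparing each case's squared Mahalanobis distance against a chi-square critical value:

```python
# Sketch (assumed, not the authors' code): multivariate outlier screening
# via Mahalanobis distance, following Tabachnick & Fidell (2001).
import numpy as np
from scipy import stats

def mahalanobis_outliers(X: np.ndarray, alpha: float = 0.001) -> np.ndarray:
    """X: (n_cases, n_variables) matrix of the scores entering the analysis."""
    diff = X - X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)   # squared distances
    critical = stats.chi2.ppf(1 - alpha, df=X.shape[1])  # the paper's cutoff was 32.91
    return d2 > critical                                 # True marks a multivariate outlier
```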
The student body was relatively balanced with respect to gender (56% female, 44% male) but quite unbalanced with respect to age (82% of the students being 20 years old or younger) and year in school (80% being freshmen or sophomores).
Procedure
The analysis of the gospel texts covers about the first third of the course. Its main goal is to scaffold the cognitive skills needed for academic interpretation of biblical texts, with the long-term goal of preparing students for the more complex, real-life, ill-structured problem solving they will face in both academic and professional life.
To operationalize students’ prior knowledge, the research team administered a set of pretest-posttest essays outside the course activity. Students’ participation was voluntary and rewarded with extra points counting toward their final grade. Because gospel analysis is the entry point of the course, the research team measured students’ entry-level skills during the first week of class by giving students a gospel passage and asking them to indicate its main point in a short essay. Starting with the second week of the course, students fully engaged in one method of biblical criticism using Cinema Hermeneutica, following the two scaffolding stages described in the presentation of the environment above. The culminating activity within the learning environment was to submit a formal essay interpreting a gospel text using the chosen perspective. A mid-term examination that included the interpretation of a gospel passage similar to the final task in Cinema Hermeneutica concluded the gospel part of the course. In the week following the mid-term exam, the research team administered a second short gospel passage, similar in length and complexity to the pretest one, to determine the posttest scores.
Task Please write 50-100 words on what you think is the main idea of the following story. I am not
looking for a right or wrong answer. I just want some examples of how students interpret texts at
the beginning of the course. You will receive 5 points for your efforts. Here's the story:
Text The land of a rich man produced abundantly. And he thought to himself, ‘What should I do, for I
have no place to store my crops?’ Then he said, ‘I will do this: I will pull down my barns and build
larger ones, and there I will store all my grain and my goods. And I will say to my soul, Soul, you
have ample goods laid up for many years; relax, eat, drink, be merry.’ But God said to him, ‘You
fool! This very night your life is being demanded of you. And the things you have prepared, whose
will they be?’
Demographic Data
Gender: 1-Female; 2-Male Age: Major:
You currently are: 1-Freshman; 2-Sophomore; 3-Junior; 4-Senior; 5-Post-baccalaureate; 6-Graduate; 7-
Other
The pretreatment and post-treatment essays were similar in complexity and length and were scored for content (max. 6 points) and argument quality (max. 5 points) with two customized scoring rubrics. The composed essay scores varied within a range of 2 to 11.
To address scoring reliability, the research team placed students’ essays in random order and deleted the identification information so that the graders could not link an essay to a respondent. When scoring the posttest essays, five randomly selected pretest essays inserted into the grading pool were re-scored to analyze the variance between the first and second rounds of scoring. The analysis indicated no significant difference between the first and second rounds of scoring.
Analysis
Prior knowledge was the first categorical variable considered in this exploratory study. Three levels of prior knowledge were derived from the composed pretreatment scores by treating a spread of one standard deviation around the mean as the medium prior knowledge level and the two remaining tails of the distribution as the low and high prior knowledge levels, respectively. For the student sample that participated in this study, the three prior knowledge levels were relatively well balanced: 30% low (composed scores less than 5), 44% medium (composed scores from 5 to 7), and 26% high (composed scores of 7 and higher).
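A small sketch of that banding rule (assumed, not the authors' code; the function name is ours and the cut points are only approximately those reported):

```python
# Sketch (assumed, not the authors' code): banding composed pretest scores into
# low / medium / high prior-knowledge levels around one SD of the mean.
import numpy as np

def prior_knowledge_level(pretest_scores):
    x = np.asarray(pretest_scores, dtype=float)
    lo = x.mean() - x.std(ddof=1)
    hi = x.mean() + x.std(ddof=1)
    # For this sample the cut points work out to roughly 5 and 7 on the 2-11 scale.
    return np.where(x < lo, "low", np.where(x >= hi, "high", "medium"))
```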
For major, the second categorical variable used in this study, the participants indicated about 25 different majors. These were recoded for the purposes of this study into four generic categories: 16% soft sciences (e.g., humanities, art, and fine art), 36% social sciences (including education), 35% hard sciences (e.g., engineering, natural sciences, business, and finance), and 13% undecided.
A series of two-way factorial ANOVAs with repeated measures and two quasi-experimental between-groups factors was used to analyze the impact of the treatment on students’ achievement in interpreting gospel texts. More specifically, the study focused on the overall impact of the treatment as well as on the specific impact of prior knowledge (3 levels) and major (4 levels) on students’ essay scores from pretest to posttest as a result of the use of the Cinema Hermeneutica online learning environment in conjunction with the regular class-based instruction.
Independence of observations was presumed; that is, the subjects’ responses to the posttest essay were assumed not to be influenced by their answers to the pretest essay. Kolmogorov-Smirnov tests for the pretest (K-S Z=0.87, p > 0.4) and posttest scores (K-S Z=0.71, p > 0.7), as well as the P-P plots for these two variables, indicated that normality is a robust assumption for this data set.
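A minimal sketch of this mixed design (an assumed setup, not the authors' code; the function name, column names, and long-format layout are ours), using the pingouin package's mixed-design ANOVA:

```python
# Sketch (assumed, not the authors' code): time (within) x group (between) mixed ANOVA.
import pandas as pd
import pingouin as pg

def time_by_group_anova(df: pd.DataFrame) -> pd.DataFrame:
    """df: long format with one row per essay, with columns 'subject',
    'time' (pretest/posttest), 'group' (prior-knowledge level or major),
    and 'score' (composed essay score)."""
    return pg.mixed_anova(data=df, dv="score", within="time",
                          subject="subject", between="group")

# The "Interaction" row of the returned table corresponds to effects such as
# F(2, 100) = 35.46, p < .001 for prior knowledge x time reported below.
```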
Research Results
Prior Knowledge
Results were analyzed using a two-way ANOVA with repeated measures on one factor. The Prior Knowledge X Time interaction was significant, F(2,100)=35.46, p < .001 (see Table 2).
The graphical representation of the estimated marginal means across time for the three prior knowledge groups suggested that students with low and medium prior knowledge benefited from the treatment, while students with high prior knowledge did not benefit from it (see Figure 4).
Tests for simple effects showed that all three prior knowledge groups displayed a significant difference across time. The low and medium prior knowledge groups displayed a significant increase across time, F(1,30)=60.54, p < .001 and F(1,44)=8.25, p < .05, respectively. On the other hand, the high prior knowledge group displayed a significant decrease from pretest to posttest, F(1,26)=16.81, p < .001. Post-hoc contrasts showed that at posttest the three groups did not differ significantly in essay scores, F(2,100)=.58, p > .55.
Figure 4. Significant TIME x PRIOR KNOWLEDGE interaction for the analysis of short-essay scores (estimated marginal means at pretest and posttest for the low (< 5), medium (5-7), and high (> 7) prior knowledge groups)
Major
Results were analyzed using a two-way ANOVA with repeated measures on one factor. The Major X Time interaction was significant, F(3,99)=4.92, p < .01 (see Table 3).
The graphical representation of the estimated marginal means across time for the four major groups suggested that students coming from the soft sciences (e.g., humanities, arts, fine arts) and the hard sciences (e.g., engineering, natural sciences, business, and finance) benefited from the treatment. On the other hand, students coming from the social sciences (including education) and undecided students showed a ceiling effect or a slight decrease (see Figure 5).
Figure 5. Significant TIME x MAJOR interaction obtained in the analysis of short-essay scores (estimated marginal means at pretest and posttest for the four major groups)
Tests for simple effects showed that two of the four major groups displayed a significant difference across time. The soft-sciences group (e.g., humanities, arts, fine arts) and the hard-sciences group (e.g., engineering, natural sciences, business, and finance) displayed significant increases across time, F(1,16)=39.72, p < .01 and F(1,35)=16.53, p < .01, respectively. The social-sciences group (including education) and the undecided group displayed no significant difference across time, F(1,36)=.006, p > .94 and F(1,12)=.11, p > .74, respectively. Post-hoc contrasts showed that at posttest the four groups did not differ significantly in essay scores, F(3,99)=1.93, p > .13.
large range of students’ individual needs. The findings from this exploratory research suggest some
potential factors to consider in the design and redesign of the structure and content of activities for the
online learning environment developed to sustain biblical criticism methods in a large survey course.
References
Box Price, E. (2004). Cognitive Complexity and the Learning Congregation. Religious Education, 99(4),
358-370.
Brelsford, T. (2005). Lessons for Religious Education from Cognitive Science of Religion. Religious
Education, 100(2), 174-191.
Brown, J.S., Collins, A., & Duguid, P. (1989). Situated Cognition and the Culture of Learning. Educational
Researcher, Jan-Feb, 32-42.
Gunn, D.M. (1993). Narrative Criticism. In: S.R. Haynes & S.L. McKenzie (Eds.). To Each its Own
Meaning. An Introduction to Biblical Criticism and Their Application, (171-196). Louisville, KY:
Westminster/John Knox Press.
Jacobson, M. J. & Spiro, R. J. (1995). Hypertext learning environments, cognitive flexibility, and the
transfer of complex knowledge: An empirical investigation. Journal of Educational Computing
Research, 12(4), 301-333.
Jonassen, D.H. & Grabowski, B.L. (1993). Handbook of Individual Differences. Learning &Instruction.
Hillsdale, NJ: Lawrence Erlbaum Associates, Publishers.
Martin, M.K., & Martinez de Pison, R. (2005). From knowledge to wisdom: A new challenge to the
educational milieu with implications for religious education. Religious Education, 100(2), 157-
173.
Miller, M.J. (1993). Reading the Bible Historically: The Historian’s Approach. In: S.R. Haynes & S.L.
McKenzie (Eds.). To Each its Own Meaning. An Introduction to Biblical Criticism and Their
Application, (11-28). Louisville, KY: Westminster/John Knox Press.
Schussler Fiorenza, E. (2001). Wisdom Ways. Introducing Feminist Biblical Interpretation. Maryknoll, NY:
Orbis Books.
Shapiro, A. & Niederhauser, D. (2004). Learning from Hypertext: Research Issues and Findings. In: D.H.
Jonassen (Ed.). Handbook of Research on Educational Communications and Technology (605-
620). Mahwah, NJ: Lawrence Erlbaum Associates, Publishers.
Soulen, R.N., & Soulen, K.R. (2001). Handbook of Biblical Criticism. Third Edition. Louisville, KY:
Westminster John Knox Press.
Spiro, R. J. & Jehng, J.-C. (1990). Cognitive flexibility and hypertext: Theory and technology for the
nonlinear and multidimensional traversal of complex subject matter. In: Nix, D. & Spiro, R. J.
(Eds). Cognition, education, and multimedia: Exploring ideas in high technology. (pp. 163-205).
Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.
Riley Estep, Jr., J. (2002) Spiritual formation as social: Toward a Vygotskyan developmental perspective.
Religious Education, 97(1), 141-164.
Tabachnick, B.G. & Fidell, L.S. (2001). Using Multivariate Statistics. Fourth Edition. Needham Heights,
MA: Allyn and Bacon.
Figure 1. Cinema Hermeneutica: job selection screen for those students that completed the counseling step
Figure 3. Cinema Hermeneutica: non-gospel activities screen with the final assignment for first scaffolding
stage
Promoting Theory Application through Cognitive Structuring in Online
Learning
Shujen L. Chang
University of Houston-Clear Lake
ABSTRACT
This study proposed an easy-to-implement instructional strategy, cognitive structuring assignments (CSA), for developing students’ ability to apply learning theories in an online learning environment. The effect of CSA on learning theory application was empirically investigated, and the CSA group significantly outperformed the control group in applying learning theories to various settings. This finding suggests that CSA may be an effective instructional strategy, and online instructional designers and instructors may incorporate more CSAs to advance students’ ability to apply learning theories in teaching and working situations.
Introduction
Learning theory application has been a focus of learning and instruction, especially in teacher
education (Driscoll, 2005; Gagne & Medsker, 1996). Common instructional strategies that enable cognitive
structuring for theory application include concept mapping, semantic relationship tests, and worked
examples (DeSimone et al., 2001; Jonassen & Wang, 1993; Ward & Sweller, 1990). Although these
strategies offer various advantages, several inherent weaknesses make them difficult to implement in an
online learning environment. Thus, this study proposed an instructional strategy, the cognitive structuring
assignment (CSA), which builds on the strengths of these common strategies while alleviating their
weaknesses, to facilitate learning theory application in the online learning environment.
Purpose
The purpose of this study was to investigate the effect of cognitive structuring assignments (CSA)
as an instructional strategy on learning theory application in an online learning environment. It was
hypothesized that the use of CSA would facilitate the processes of cognitive restructuring needed to
achieve better learning theory application.
Method
Research participants were 104 graduate students, 88 females and 16 males, at a public university
in the southwestern United States. The course was delivered entirely online with a required textbook.
Online course materials included an overview, an outline, activities, assignments, and quizzes. A bulletin
board and email, accessed through the course website, were used for class discussion and communication.
This study used a two-group quasi-experimental design. The control and treatment groups
consisted of 58 and 46 participants, respectively. The control group received no CSA, and the treatment
group received two CSAs, one at the beginning and one at the end of the semester.
The Instrument and Variables
The instrument, the CSA, was developed on the foundation of concept mapping, semantic relationship
tests, and worked examples, and aimed to assess a student's cognitive structure of learning theories after
studying them (DeSimone et al., 2001; Jonassen & Wang, 1993; Ward & Sweller, 1990). Students were asked
to describe the relationship between a given learning theory and a case of its application. Cronbach's
alpha reliability coefficients for the two CSAs were .72 and .67.
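As a rough illustration of the reliability computation reported above, Cronbach's alpha can be obtained from the item variances and the variance of the summed score. The following minimal Python sketch uses hypothetical scores for the two CSA scoring criteria; the data, sample size, and score range are assumptions introduced for illustration only, not the study's data.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
    n_items = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of the summed score
    return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical scores on the two CSA criteria (identification, explanation)
rng = np.random.default_rng(0)
scores = rng.integers(20, 51, size=(46, 2)).astype(float)
print(round(cronbach_alpha(scores), 2))
```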
The two dependent variables observed in this study were cognitive restructuring and learning
theory application. Cognitive restructuring referred to students' improvement from the 1st to the 2nd CSA.
The CSA was assessed on two criteria: 1) clear and accurate identification of the relationship, and 2) clear
and logical explanation of the identified relationship. Learning theory application referred to students'
performance in applying learning theories to the design of instructional strategies. It was assessed on two
criteria: 1) appropriate instructional methods for the given learning theory, and 2) explicit description of
the relationships between the instructional methods and the given learning theory.
Data Analysis
Nonparametric data analyses were employed because intact classes, that is, non-random treatment
assignment, were used in this study. The significance level of .05 was used for all statistical tests. Learners'
age, gender, prior experience in online courses, and English as native language were not significantly
correlated with any dependent variables and were dropped from subsequent analyses.
Results
Overall, the means of cognitive restructuring and learning theory application were 20.92
(SD=31.96) and 84.30 (SD=11.31) respectively on a 100-point scale. The control group consisted of 58
students, 51 females and 7 males, while the treatment (CSA) group consisted of 46 students, 37 females
and 9 males.
A Wilcoxon Signed Ranks test for two related samples showed that cognitive restructuring was
significantly correlated to learning theory application (Z=-5.90, p = .00, mean rank=24.00, sum of
ranks=1080.00). Further, a Mann-Whitney test for two independent samples indicated that the CSA group
significantly outperformed the control group on learning theory application (Mann-Whitney U= 631.50,
Z=-4.61, p = .00).
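For readers who wish to run this kind of analysis themselves, the sketch below shows how the two nonparametric tests named above can be computed with SciPy. The score arrays and group sizes are placeholders generated for illustration; they are not the study's data, and the resulting statistics will not match those reported here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical paired scores on the 1st and 2nd CSA for the treatment group
csa1 = rng.normal(60, 15, size=46)
csa2 = csa1 + rng.normal(20, 10, size=46)  # simulated improvement

# Wilcoxon Signed Ranks test for two related samples
w_stat, w_p = stats.wilcoxon(csa1, csa2)
print(f"Wilcoxon: W={w_stat:.2f}, p={w_p:.4f}")

# Hypothetical learning theory application scores for the two groups
control = rng.normal(80, 11, size=58)
treatment = rng.normal(88, 10, size=46)

# Mann-Whitney test for two independent samples
u_stat, u_p = stats.mannwhitneyu(control, treatment, alternative="two-sided")
print(f"Mann-Whitney: U={u_stat:.2f}, p={u_p:.4f}")
```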
References
DeSimone, C., Schmid, R., & McEwen, L. (2001). Supporting the learning process with collaborative
concept mapping using computer-based communication tools and processes. Educational
Research and Evaluation, 7(2-3), 263-283.
Driscoll, M. P. (2005). Psychology of Learning for Instruction (3rd ed.). Boston: Allyn & Bacon.
Gagne, R. M., & Medsker, K. L. (1996). The conditions of learning: Training applications. Fort Worth:
Harcourt Brace College.
Hair, J. F., Anderson, R. E., Tatham, R. L., & Black, W. C. (1998). Multivariate Data Analysis (5th ed.).
New Jersey: Prentice-Hall.
Jonassen, D. H., & Wang, S. (1993). Acquiring Structural Knowledge from Semantically Structured
Hypertext. Journal of Computer-Based Instruction, 20(1), 1-8.
Ward, M., & Sweller, J. (1990). Structuring effective worked examples. Cognition and Instruction, 7(1), 1-
39.
How Can We Facilitate Students’ In-depth Thinking and Interaction In
An Asynchronous Online Discussion Environment? A Case Study
Wing Sum Cheung
Nanyang Technological University
Khe Foon Hew
Indiana University
Abstract
In the last few years, there has been a proliferation of asynchronous online discussion forums,
which have opened up the possibilities for people to exchange ideas at any place and time. The literature
has documented that asynchronous online discussion has the following desirable characteristics: 1) it may
help enhance the participation of students who might be less willing to participate in traditional face-to-face
classroom settings due to shyness; and 2) it has the potential to encourage more thoughtful responses since
participants can take their own time in composing their thoughts. Nonetheless, despite the promise of
asynchronous online discussion to promote rich learning experiences, students do not always make use of
its potential. The purpose of this study is to examine how a group of Singapore students interacted with one
another, as well as the types of thinking skills (critical or creative thinking) and levels of information
processing they exhibited during an asynchronous online discussion, in an attempt to draw out certain
guidelines that could help facilitate students’ in-depth thinking and promote student-to-student interaction
in an online discussion environment.
Introduction
Asynchronous online discussion generally refers to the exchange of messages via computer
networks where participants need not be online simultaneously. Since asynchronous online discussion
allows records of a participant’s written messages to be kept in the virtual electronic ‘space’ for long
periods of time (Ganeva, 1999), participants can respond to the messages posted at any time they prefer and
view the messages many times and long after the messages have been posted. In this way, asynchronous
online discussion can resemble written communication (Ganeva, 1999). Most of the current asynchronous
online discussion forums are hypertext-based, which makes them dynamic environments, i.e., users can
manipulate the display of the content of the conference, and view the record of messages in sequenced or
‘threaded’ formats (sorted according to time of contribution, grouped by author, or clustered according to
topical links) (Ganeva, 1999). Asynchronous online discussion has the potential to improve the teaching
and learning experiences in traditional classroom settings. As Groeling (1999, p. 1) wrote, “With it,
scholars and educators have the potential to vastly expand the opportunities for students to interact outside
the classroom.” In brief, the literature has argued that asynchronous online discussion has the following
desirable characteristics (Groeling, 1999): 1) Asynchronous online discussion increases accessibility and
opportunities for interaction since it is available 24 hours a day and 7 days a week; 2) Asynchronous online
discussion can help enhance the participation of students who might be less willing to participate in
traditional face-to-face classroom settings due to shyness, language problems or gender; and 3)
Asynchronous online discussion has the potential to encourage more thoughtful and reflective responses
since participants can take their own time ordering and composing their thoughts.
Nonetheless, despite the promise of asynchronous online discussion to promote rich learning
experiences, students do not always make use of its potential (Fishman & Gomez, 1997; Guzdial, 1997). As
noted by Land & Dornisch (2002, p. 366): “Only a few students may contribute, discussions may be
controlled by a small number of authors, and interactive dialogue in which students present and respond to
views may be absent. Such problems might be related to student focus on completion of a task”.
Additionally, students may remain “lurkers” in asynchronous online discussion environments, due to the
lack of critical and creative thinking skills to make interesting postings. Therefore, in order to foster
productive dialogue, several methods have been suggested in the literature. These include the role of the
online moderator, the role of the participants, and the types of activity used during the online discussion
(Ahern, Peck & Laycock, 1992; Bodzin & Park, 2000; Rohfeld & Hiemstra, 1995). We argue that, besides
the aforementioned methods, certain practical guidelines could help facilitate students’ in-depth thinking
and promote student-to-student interaction in an asynchronous online discussion environment.
The purpose of this study is to examine how a group of Singapore students interacted with one
another, as well as the types of thinking skills (critical or creative thinking) and levels of information
processing they exhibited during an asynchronous online discussion, in an attempt to draw out certain
guidelines that could help facilitate students’ in-depth thinking and promote student-to-student interaction
in an online discussion environment.
Methodology
Participants
Thirty-eight student teachers (henceforth referred to as students) in Singapore participated in the
study. The participants were enrolled in a hypermedia design and development course at the National
Institute of Education (NIE), Singapore, in which they were required to design and develop instructional
hypermedia materials to be used in actual classroom settings. During the course, two asynchronous online
discussion sessions were held; each lasted four weeks. These discussions were conducted using
Blackboard, a Web-based course management system.
The overall objectives of the online discussions were: 1) to provide each student an opportunity to identify
design problems of their classmates’ hypermedia materials and give suggestions to solve the problems; and
2) to allow students to comment about their classmates’ ideas and suggestions.
Instrumentation
An interaction and thinking framework was developed to analyze the computer transcripts of the
students’ discussions. We defined interaction as the extent of information exchange among students. The
exchange of information can be conceptualized as having two levels: true interaction and quasi-interaction.
True interaction involves a three-step process: Person A communicates with B; a response from B; and
person A’s reply to B’s response. Quasi-interaction involves only two actions: person A communicates
with person B and a response from person B.
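To make the two interaction levels concrete, the sketch below classifies a simple exchange between two participants according to the definitions above. The message representation (a chronological list of sender-recipient pairs) is a hypothetical simplification introduced for illustration and is not part of the authors' coding instrument.

```python
from typing import List, Tuple

def classify_exchange(messages: List[Tuple[str, str]]) -> str:
    """Classify an A<->B exchange as true interaction, quasi-interaction, or neither.

    Each message is a (sender, recipient) pair, in chronological order.
    """
    if len(messages) < 2:
        return "no interaction"                  # a lone message drew no response
    a, b = messages[0]                           # step 1: A communicates with B
    step2 = messages[1] == (b, a)                # step 2: B responds to A
    step3 = len(messages) >= 3 and messages[2] == (a, b)  # step 3: A replies to B's response
    if step2 and step3:
        return "true interaction"
    if step2:
        return "quasi-interaction"
    return "no interaction"

print(classify_exchange([("A", "B"), ("B", "A"), ("A", "B")]))  # true interaction
print(classify_exchange([("A", "B"), ("B", "A")]))              # quasi-interaction
```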
We also studied two types of thinking skills: critical thinking and creative thinking. Critical
thinking is the ability to assess the reasonableness of ideas (Swartz & Parks, 1994), while creative thinking
is the skill to generate ideas or solutions (Jonassen, 1997). Next, we differentiated between ‘surface’ and ‘in-
depth’ levels of information processing in order to analyze the level at which these thinking skills occur.
The levels of information processing in this study were drawn from Henri (1992b), Swartz and Parks (1994),
and Newman, Johnson, Webb, and Cochrane (1997). Elements of ‘surface’ information processing for creative
thinking would include: 1) proposing ideas or solutions without offering any explanations, 2) merely
repeating what has been said, and 3) squashing new ideas, while ‘in-depth’ level of creative thinking refers
to: 1) proposing solutions with clear explanations, 2) adding new information to previous comments, and 3)
generating novel ideas or solutions.
To evaluate the level of critical thinking skills, we used the following ‘surface’ and ‘in-depth’
levels of information processing. ‘Surface’ level includes: 1) making conclusions or judgments without
offering justification, 2) sticking to prejudices or assumptions, 3) stating that one shares the conclusions or
judgments made by others without taking these further, and 4) failure to state the advantages or
disadvantages of a suggestion, conclusion or judgment. ‘In-depth’ level, on the other hand, involves: 1)
making conclusions or judgments supported by justification, 2) setting out the advantages or disadvantages
of a suggestion, conclusion or judgment, 3) stating that one shares the conclusions or judgments made by
others and supporting them with relevant facts, experience or personal comments, and 4) making valid
assumptions based on the available indicators. Table 1 provides a summary of the definitions and indicators
of our framework.
Students were also asked to keep reflection logs to describe their feelings, experiences and things
they had learnt during the asynchronous online discussion. In addition, focus group interviews were
conducted to elicit in-depth information about students’ perceptions and experiences in using asynchronous
online discussion.
Creative thinking: Generate possible problem solutions (*J)
Surface:
- Suggestions proposed are unclear and with little or no details or examples.
- Repeating someone else's suggestions without adding in new information or personal comments. (*N)
- Squashing, putting down new suggestions. (*N)
In-depth:
+ Suggestions proposed are clear and supported with appropriate details or relevant examples.
+ Adding on to someone else's suggestions with new information, personal experiences or relevant literature. (*N)
+ Generating unusual suggestions that nobody has thought of. (*SP, N)
Table 1. Framework for evaluating thinking skills and levels of information processing
* Adapted from the following sources: J – Jonassen (1997), H – Henri (1992b), SP - Swartz & Parks
(1994), N - Newman et al (1997)
suggestion by Lincoln and Guba (1985) that the unit of analysis be heuristic and able to stand by itself. This
selection is also consistent with Merriam’s (1998) recommendation that “communication of meaning” (p.
160) be the main focus. Therefore, in this study, each message paragraph was analyzed and if two
continuous paragraphs dealt with the same idea, they were counted as one message unit. If, on the other
hand, one paragraph contained two ideas, it was counted as two separate message units. Once the message
units were identified, the analysis moved into the second part, in which the interaction and thinking
framework was used to identify the various types of interaction, thinking skills, and levels of information
processing evident in the students' online discussion transcripts.
What types of thinking skills (critical or creative thinking) and levels of information processing did the
students exhibit during the online discussion?
Table 2 shows the frequency count of the different types of thinking skills and levels of
information processing. Examples from the data for the different types of thinking skills and levels of
information processing are summarized in Table 3.
Category of thinking skills    Frequency count
Creative thinking              73 (Surface level: 34; In-depth level: 39)
Creative thinking – surface
• "More graphics should be added to make it more interesting. You may have thought of adding more graphics and media to it."
(The above statement was classified as creative thinking – surface level because the author posed a solution, i.e., adding more graphics, but stopped short of offering any suggestion or elaboration as to the type of graphics to be used.)
Creative thinking – in-depth
• "I noticed that there are no buttons to allow learners to move from slides to slides. Just a suggestion…you might want to include a panel for these buttons. For example, a 'help' button is essential since this is a difficult topic."
(The above statement was categorized as creative thinking – in-depth because the suggestion given is clear and supported with an appropriate example.)
Critical thinking – surface
• "I find that there are too many empty (white) spaces on the presentation slides."
(This was classified as critical thinking – surface level of information processing since the author made his conclusion without giving any justification as to why it was not good to have too many empty spaces on a presentation slide.)
Critical thinking – in-depth
• "I feel that the choice of your illustrations are quite well chosen, except for the birds. I feel that the birds are distracting because of their movements and they don't blend well with the other illustrations."
(The above statement was coded as critical thinking – in-depth level of information processing because the author expressed a judgement and provided a plausible argument as to why his judgement was valid.)
Table 3. Examples of thinking skills and the varied levels of information processing
From Table 2, it can be seen that critical thinking formed the bulk of the students’ thinking skills.
This result reflects the way that the students tended to use the asynchronous online discussion forum to
judge the quality of their classmates’ hypermedia projects, rather than suggest possible solutions to help
improve the projects. Overall, 44.4% of all the thinking skills exhibited by the students were of surface
level information processing. Most of the surface-level thinking occurred because the students failed to
justify their judgments or comments, or proposed solutions with few details or explanations. Students
appeared to regard knowing "how to do" as more important than knowing "why they are doing it", a
finding that supports Lim and Tan's (2001) conclusion that student teachers in Singapore tended to prefer
instrumental understanding: knowing what to do without knowing the reasons.
study, as the discussion progressed, also began to “sound along the same lines”. There was not much new
insight or “new twist” in the students’ responses.
In summary, the analysis of the students’ asynchronous online transcripts has revealed the
following problems of thinking and interaction: Difficulty in keeping track of discussion threads,
procrastination or failure in responding to the message postings, failure in justifying the conclusions or
judgments made, proposing solutions with no details or explanations, and repeating ideas that have been
previously made without giving additional information or insight.
Implications
To alleviate the aforementioned problems of thinking and interaction, we propose the following
seven guidelines. With regard to thinking, we first propose that students be reminded not to merely repeat
a previously mentioned idea in the online discussion. Additional information (such as insights based on
personal experiences or knowledge) is to be provided. Second, students are to justify all conclusions or
judgments made. By doing so, students will progressively deepen their understanding as they reflect
on their statements and refine their initial conceptions. Third, students are to clarify all suggestions made
with the appropriate details or relevant examples, when others query them. This will make their suggestions
easily understood by others. And fourth, students are encouraged to use good questioning techniques, for
example a taxonomy of Socratic questions, to help them generate critical and creative postings. According
to Thoms and Junaid (1997, p. 2), these questions include:
1. Questions of clarification. These ask for verification or additional information of one point or main
idea.
2. Questions that probe assumptions. These questions ask the student for explanation or reliability of an
assumption.
3. Questions that probe reasons and evidence. This category of questions asks for additional examples,
reasons for making statements, or the process that led the student to his or her belief.
4. Questions about viewpoints. These ask the student whether there are alternatives to his or her viewpoint,
or ask for a comparison of similarities and differences between viewpoints.
5. Questions that probe implications and consequences. Finally, this category of questions helps the
student to describe the implication of what is being done, or the cause-and-effect of an action.
With regard to the facilitation of interaction, we propose the following rules. First, students have
to exercise caution when replying to any message in the online discussion. They have to make sure that
they reply to the correct thread in order to avoid any disorientation in the discussion. Second, students are
reminded to put forth only one idea in one message posting. This would give the online participants a clear
view of all the ideas under discussion and avoid responding to the wrong thread or idea. And third, students
are encouraged to reply to their classmates’ online enquiries within 48 hours, so as to avoid the problem of
delay between message postings. We summarize these seven rules in Table 4.
Thinking
1) Do not simply repeat someone's ideas in the online discussion. Please provide additional information or insight (e.g., based on personal experiences).
2) Please justify all conclusions or judgments made.
3) Clarify all suggestions made with the appropriate details or relevant examples.
4) Use a taxonomy of Socratic questions to help generate critical and creative postings. Such questions include:
• Questions of clarification
• Questions that probe assumptions
• Questions that probe reasons and evidence
• Questions about viewpoints
• Questions that probe implications and consequences
Interaction
1) Please reply to the appropriate thread.
2) Put only one idea in one message posting. Avoid putting forth more than one idea in a message posting.
3) Do not procrastinate in replying to someone's enquiries. Please reply within 48 hours.
Table 4. Guidelines to facilitate students’ in-depth thinking and interaction in an asynchronous online
discussion environment
References
Ahern, T.C., Peck, K., & Laycock, M. (1992). The effects of teacher discourse in computer-mediated
discussion. Journal of Educational Computing Research, 8 (3), 291-309.
Bodzin, A. & Park, J. (2000). Factors that influence asynchronous discourse with preservice teachers on a
public, web-based forum. Journal of computing in teacher education, 16(4), 22-30.
Fishman, B. J., & Gomez, L. M. (1997). How activities foster CMC tool use in classrooms. CSCL ’97
Proceedings.
Ganeva, I. (1999). Native and non-native speakers’ participation in educational asynchronous computer
conferencing: a case study. Unpublished master's thesis, Ontario Institute for Studies in Education,
University of Toronto.
Groeling, T. (1999). Virtual Discussion: Web-based Discussion Forums in Political Science. Paper
presented at the 1999 National convention of the American Political Science Association, Atlanta,
Georgia.
Gunawardena, C.N., Lowe, C.A., & Anderson, T. (1997). Analysis of a global online debate and the
development of an interaction analysis model for examining social construction of knowledge in
computer conferencing. Journal of Educational Computing Research, 17(4), 397-431.
Guzdial, M. (1997). Information ecology of collaborations in educational settings: Influence of tool. CSCL
’97 Proceedings.
Henri, F. (1992a). Formation à distance et téléconférence assistée par ordinateur: interactivité, quasi-
interactivité, ou monologue? Journal of Distance Education, 7(1), 5-24.
Henri, F. (1992b). Computer conferencing and content analysis. In A.R. Kaye (Ed.). Collaborative learning
through computer conferencing: The Najaden papers, 117-136. Berlin: Springer-Verlag.
Jonassen, D. H. (1997). Instructional design models for well-structured and ill-structured problem-solving
learning outcomes. Educational Technology Research and Development, 45(1), 65-94.
Jonassen, D.H., & Kwon, H. (2001). Communication patterns in computer mediated versus face-to-face
group problem solving. Educational Technology, Research & Development, 49(1), 35-51.
Land, S. M., & Dornisch, M. M. (2002). A case study of student use of asynchronous bulletin board
systems (BBS) to support reflection and evaluation. Journal of Educational Technology Systems,
30(4), 365-377.
Lim, K.M., & Tan, A. G. (2001). Student teachers' perception of the importance of theory and practice.
Paper presented at the meeting of the Australian Association for Research in Education,
Fremantle, Australia.
Newman, D.R., Johnson, C., Webb, B., & Cochrane, C. (1997). Evaluating the quality of learning in
computer supported cooperative learning. Journal of the American Society for Information Science, 48,
484-495.
Rohfeld, R.W., & Hiemstra, R. (1995). Moderating discussions in the electronic classroom. In Z.L. Berge
& M.P. Collins (Eds.). Computer mediated communication and the online classroom, volume III:
Distance learning (pp. 91-104). Cresskill, NJ: Hampton Press.
Swartz, R.J. & Parks, S. (1994). Infusing the teaching of critical and creative thinking into content
instruction. A lesson design handbook for the elementary grades. Critical Thinking Press & Software.
Thoms, K. J., & Junaid, N. (1997). Developing critical thinking skills in a technology-related class. (ERIC
Document Reproduction Service No. ED430526).
A Novice Online Instructor’s Online Teaching Experience: A Case
Study
Hee Jun Choi
Ji-Hye Park
University of Illinois at Urbana-Champaign
Teaching is one of the major responsibilities that faculty members perform at a university.
However, a large portion of candidates might get faculty jobs without specific knowledge of or experience
related to teaching. Teaching is often not an easy job for such faculty members. Moreover, the advent of
online educational environments might provide faculty members with even more challenges (Palloff &
Pratt, 1999). In an online environment, faculty members are required to develop competencies in teaching
online in addition to cultivating knowledge and skills for teaching in a traditional face-to-face classroom. For
successful online teaching, it is important to train faculty not only in the use of technology but also in the
art of teaching online (Palloff & Pratt, 2001).
Palloff and Pratt (2001) assert that “teaching in the cyberspace classroom requires that we move
beyond traditional models of pedagogy” (p.20). Online instructors need to apply a unique pedagogical
approach to the analysis, design, development, implementation, and evaluation stages in order to engender
an effective and efficient online learning environment. In other words, online instructors need to devise the
appropriate teaching and learning methods for this new environment in order to make the learning
experience of invisible learners effective. If online instructors try to apply the teaching approach used in a
traditional face-to-face teaching environment to the online environment, it would not be as effective. This
implies that the online learning environment requires a specific and unique educational approach in order to
make both the teaching and learning efficient and effective.
According to Moore and Kearsley (2005), online teaching is different from traditional classroom
teaching in terms of the following: not being able to see students’ reaction to the instruction; reliance on
technology for effective teaching; little peer support; little physical routine in attending classes; and the
presence of collaborative work (i.e., collaboration with the technical assistant, computer program support
staff, program coordinator, and so on). The additional competencies required for an online environment
might be a source of stress for many faculty members who have never experienced online teaching.
Issue Questions
Previous studies regarding online learning have focused on learner perspectives, such as student
learning outcomes, their perception of or satisfaction with online learning, and so forth. Only a few studies
have addressed the faculty perspective, which is one of the major factors in the success of online learning.
The few studies that claim that online teaching might be a burden to faculty members are based mainly on
the researchers' intuitions and reflections, without empirical evidence. Accordingly, this study intends to
explore whether online teaching is actually a burden to faculty members through the first online teaching
experience of a professor at a large university in the Midwestern United States. Consequently, this study
aims to help readers understand the online teaching environment by providing them with opportunities to
vicariously experience the challenges faced by a novice online instructor. It also may serve as an initial
step toward the more effective preparation of a new online instructor. To attain the purposes of this study,
the following questions are addressed:
1. What are the challenges that a novice online instructor faces when teaching her first online
course?
2. Are such challenges in online teaching a burden to faculty members?
The design of the courses could be characterized as modular, but they are not just courses broken
into a series of units. The structure consists of course sections that are divided into modules, and modules
are divided into learning cycles. A learning cycle includes the following elements: presentation of content,
application of content (an activity), and assessment/feedback. This design builds on the concepts of social
learning, metacognition, mental processing capacity, and systems thinking.
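One way to picture this hierarchy is as a simple nested data structure. The Python sketch below encodes only the elements named above (sections, modules, and learning cycles with presentation, application, and assessment/feedback); the class names and the example content are hypothetical and are not drawn from the program's actual course materials.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LearningCycle:
    presentation: str   # presentation of content, e.g., a streamed lecture
    application: str    # application of content (an individual or team activity)
    assessment: str     # assessment and feedback

@dataclass
class Module:
    title: str
    cycles: List[LearningCycle] = field(default_factory=list)

@dataclass
class CourseSection:
    title: str
    modules: List[Module] = field(default_factory=list)

# Hypothetical example of one course section
section = CourseSection(
    title="Section 1",
    modules=[Module(
        title="Module 1.1",
        cycles=[LearningCycle(
            presentation="Streamed PowerPoint-supported lecture (15-25 minutes)",
            application="Team activity submitted via WebBoard",
            assessment="Quiz with instructor feedback",
        )],
    )],
)
print(len(section.modules[0].cycles))  # -> 1
```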
The program uses asynchronous delivery, in which students can watch a streamed, PowerPoint-supported
lecture: students hear the instructor giving a lecture while seeing PowerPoint slides. These
lectures are usually fifteen to twenty-five minutes in length. Additionally, students are provided with
learning activities, to be completed either independently or as a team activity, depending on what the
professor requires. Students submit their work electronically in a variety of forms (i.e., email, Webboard,
etc.), according to the professor’s preference.
Synchronous instruction is also provided once per week, in a session lasting approximately one
hour. All students are expected to log onto a text chat system and communicate or interact with each other
and the professor. The professor uses live, streamed audio to talk to the students and to lead discussions.
For the purpose of saving faculty time, instructors are provided with a development assistant who is
responsible for the technical aspects of the course transformation. A team of graduate assistants, who have
technology experience, provides this assistance. It is estimated that development assistance averages fifteen
hours per week for sixteen weeks in each course. During the semester that faculty actually teach or deliver
a course, they are provided with a teaching assistant for ten hours per week.
Findings
The online class started officially precisely at 7 p.m. and ended around 8 p.m. The one-hour
synchronous session was usually spent on housekeeping issues, reminding students of upcoming
assignments and discussing proposed questions. In each session, students were provided with questions
drawn from reading materials and case studies, were broken into four teams, and were encouraged to
discuss those issues. After the group discussion, the class reconvened in the main chat space and shared
their discussions with the other teams.
During the five weeks of observations, the instructor always came to the chat space for the
synchronous session much earlier than the scheduled time and greeted each student by name as they
entered the chat space. She usually carried on conversations with students until class began. This behavior
of the instructor was unusual to me. An instructor with whom I had previously worked did not enter the chat
space earlier than the scheduled teaching time to communicate with students, even though he arrived at the
office early. He used to enter the chat space two or three minutes before the class began.
As for the instructor's individual greetings, one student expressed through the instant messenger
that she was very glad the instructor seemed to care about her personally. The student also said that
she tried not to miss the class in order to repay the instructor's interest in her. As an example, she had a
class during her husband’s birthday but delayed the birthday party in order to attend the class. She
confessed that she usually missed class when she had important family affairs.
The remaining four online classes that I observed were very similar to Michelle’s 5th online class.
Michelle’s legs were bouncing whenever she was anxious about students’ apathy, had a technical problem,
or made a little mistake. I asked myself, “What can I do for Michelle’s legs? What can I do to help
Michelle’s legs relax?”
According to interviews with Michelle and her teaching assistant, the biggest problem that
Michelle had in teaching the online class stemmed from deactivating the links on the master schedule, which
caused frustration among the students. Each course in the online program provides a master schedule for the
students. The master schedule includes the most critical information regarding the course (e.g., reading
materials, major content, additional resources, assignments, and due dates) on a weekly basis. Students can
go to hyperlinked pages by simply clicking on the links provided in the master schedule. Students are
normally informed of the due dates and assignment instructions in several places (e.g., the master schedule,
the syllabus, the actual assignment description pages). Before the course begins, the instructor, teaching
assistant, and development assistant are very cautious to make the information consistent in all locations.
In this course, some changes were made just before the class began. Because the changes were not prepared
early enough or with enough caution, they were reflected on the master schedule but not in the
syllabus. This oversight caused mass confusion among students in spite of many attempts to explain the
changes in class and via emails. To eliminate the source of confusion, the instructor decided to deactivate
all the links in the master schedule during the third week of the semester.
Apparently the students were not pleased with the change. One student made a comment “this
course was a bit of nightmare” in a group chat space, and the instructor found it while she was reviewing
transcripts of the team chats. The instructor personally approached the student and found that deactivating
the links on the master schedule was the major reason why she had negative feelings about the course. The
instructor expressed her feelings about the incident:
I think I was really kind of shocked because when I had access to the course I made an assumption
again that students, when they would navigate the course materials, would actually go to the
respective session for that particular week and that they would read lecture notes or they would
look at the power point presentations or they would somehow work on reading the chapters.
Then, they would follow all of the instructions for that particular session, module, and then
complete the learning assignments... And what this feedback told me is that students probably
won’t even use the course materials that had been already set up for them. But they were rather
just simply skipping ahead and looking at what are the cycles of assignments that are required for
a particular date and launching right into that and thereby bypassing or ignoring the bulk of the
materials that had been created for them. Some students appeared to be interested in only posting
their assignments to pass the course rather than understanding the contents of the course.
The TA also indicated that the master schedule incident was the greatest difficulty that the
instructor had experienced.
The greatest difficulty that seemed apparent to me was over student frustration…knowing the
syllabus was correct, we deactivated the Master Schedule to eliminate the source of
confusion…However, some students seemed not to like the deactivation of the links on the master
schedule. Some students expressed that the change in the master schedule was inconvenient
because they were accustomed to using the links on the master schedule to have access to the
assignments in the team chat. Some students wanted us to reactivate the links on the master
schedule. Realizing students’ dissatisfaction about the deactivation of the links on the master
schedule in the third week of the semester, we decided to reactivate the master schedule.
This incident implies that some of the students might have seen only the information related to the
assignments through the links on the master schedule, without going through all the course content from the
course homepage. The links on the master schedule might be very helpful for the students’ convenience.
However, it is questionable whether the links on the master schedule are helpful for the students’ learning
or not. Thus, I asked a graduate student who was a development assistant for the online courses her
thoughts about this issue. Her comments are as follows.
If I were the instructor, I would stick to deactivating the links on the master schedule. It will
induce students to navigate and read the whole materials in the course. Ultimately, deactivating
the links on the master might be helpful in students’ learning.
I asked the peer development assistant a further question about this issue.
Me: Students would feel inconvenient if the links on the master schedule continued to be
deactivated. This could affect the course evaluation at the end of the semester. Moreover, the
instructor is an assistant professor, not an associate professor. Can you stick to deactivating the
links on the master schedule in this situation?
Jane (An Alias): Well…. I am not sure…. I think that the course evaluation from only students has
a lot of problems. It cannot be reliable…. It might be the sorrows of an assistant professor …
The instructor reactivated the links on the master schedule, so that the students could easily access
the webpage that included the information related to the assignments. As a result, the conflict between the
instructor and the students surrounding the master schedule was resolved. Through this incident, the
administrative staff acquired valuable information on how the master schedule was actually used by some
students.
Another difficulty that the instructor experienced in the online course was student apathy.
According to interviews with the instructor and the teaching assistant, student apathy was one of the biggest
obstacles to engaging students in deeper discussion and to making students actively participate in the class
activities. The instructor noticed student apathy mainly during the synchronous sessions:
I think it is really challenging sometimes to know whether they are truly there mentally as well as
physically because I would imagine that there is some degree of invisibility attached to it. Not
only do I not see them and hear them, it’s quite possible that their name appears on my computer
screen but ultimately they might be doing housework or preparing meals or caring for children,
they might be engaging in one on one instant messaging with their peers. This might be a good
example… At the beginning of the class, I clarified the vague points related to the assignment and
then asked students if they have any further questions. At that time, there was not any question
about it. At the end of the class, a few students asked me what I already clarified…Although I saw
their names on my computer screen from a few minutes before the class began, they did not seem
to pay attention to my lecture. Actually, this kind of thing has happened several times so far.
The teaching assistant also indicated student apathy as a difficulty in online teaching, but saw it
mainly in the asynchronous discussions and interactions during the week.
The nature of this online environment does not allow for face-to-face interaction… what led this
feeling the most was the lack of interaction during the week between students and the
instructor…The students rarely responded to each other’s postings, even when they were posting
supplemental material that might have been of interest to their fellow students.
I tried to contact some students via the instant messenger to learn their views on this issue.
Fortunately, one student answered my question related to this issue:
Kelly (An Alias): I am taking the class at home. This can be a cause of the problem. For example,
when I was taking the online class, my children had a big fight. At that time, I had to let my
children stop fighting. As a result, I missed the half of the class.
Me: What do you think about students’ apathy in the asynchronous space?
Kelly: Frankly speaking, I usually used the asynchronous space when I need to post the
assignments and should participate in the asynchronous team discussion. As a matter of fact, there
were so many postings in the asynchronous space. Thus, it was impossible to read and answer all
of them.
Kelly: In fact, my peer confessed that she watched her favorite TV show during the synchronous
session. This is a big secret…
Me: My lips are zipped. Between you and me, isn’t the class boring?
Kelly: I think every class in the world is boring. (Smile)… Just Kidding! In my opinion, the
courses are well organized, and the instructor’s teaching is also great. The problem is that I am
tired when I take the class. I usually get up at 6: 00 A.M. and go to work around 7:30 A.M. … As
soon as I return home around 5:00 P.M., I have to make dinner. … After eating dinner and
cleaning dishes, I head to my computer to take the online class. One day, I was so tired that I was
dozing off during the class. As a matter of fact, it is not easy for me to submit the assignment by
the due date. … I should also take care of lots of family affairs. Do you know what I mean?
However, I need to finish this program successfully to be promoted at the company.
The instructor and her teaching assistant perceived that students were apathetic during the
synchronous work as well as the asynchronous work at the beginning of the course. In order to solve the
student apathy issue and encourage active participation, the instructor started to use more realistic case
studies as the topic of the synchronous session, starting in week four. The instructor believed that this
method was relatively successful in getting the students to actively participate in the course:
I found that when I actually embedded some cases that I drew from different sources to highlight a
reinforcer in content and get students to think more deeply about certain contents of the chapters,
that worked pretty well. So, the TA and I have discussed the possibility of having the students
actually use realistic cases for every week’s discussion topic. I realized that I need to give the
online students more stimuli or interests to encourage their active participation… In fact, my
offline students actively participated in the class even though I did not provide them with such
stimuli... I should have figured out the characteristics of online students in advance.
During the first three weeks of the course, the number of student responses in the synchronous
chat space was 108, 119, and 110, respectively. This number increased by 20-60 in the following
synchronous sessions of the semester. This supports the instructor’s perception of success in overcoming
student apathy.
The instructor indicated that it was also difficult to multi-task, for instance, rapidly reading
multiple messages from students, promptly typing the key points of comments, and giving verbal
feedback at the same time during the synchronous sessions. Instructors in a synchronous online
environment are required to have quick reading, typing, and surfing (navigating) skills and should
sometimes use such skills at nearly the same time. Michelle summarized her experience as follows:
I remembered that the very first evening, I introduced myself, reviewed the syllabus, reviewed kind
of general expectations, and then got them engaged a little bit in the content of the initial chapters.
And then to have them think a little bit more deeply about some of that content, I posted a few
questions on the chat space and there was this kind of awkward silence for a moment and I sat
there wondering if I was maybe not being effective. And then with the time delay, all of the sudden
the screen was just completely full. And then that was the matter of having to be really quick
visually to scan the screen full of comments that multiple students had typed in. Plus at the same
time to be processing that, not only from a mental standpoint but then being able to articulate
verbally almost at the same time that I was kind of trying to synthesize it mentally, and then also
tying in my own commentaries. So in lots of ways it is a very highly stimulating environment, but
one that requires you to be pretty fast in terms of your reading, synthesizing, and internalizing
skills. In the face-to-face class, I did not need to worry about these because I could just make
student ask me questions one by one. However, it was not easy to control this in the online
environment.
The teaching assistant also pointed out the fact that an instructor in a synchronous online
environment must be able to multi-task rapidly. While the instructor considered the synchronous
hour very challenging because it required extensive multi-tasking, the teaching assistant indicated that the
instructor was able to deal with it. The teaching assistant perceived that the instructor seemed to adjust to
the multi-tasking within a relatively short time:
She is so agile and adapts herself to her environment quite fast. This was very helpful in the online
class because she was able to quickly read comments posted by students and respond to them by
name. She is one of the most excellent facilitators that I have ever seen at the university. Frankly
speaking, another instructor who I have experienced had to spend much more time in adapting
himself to the multi-tasking environment. However, Michelle was a very fast learner.
Learning how to use the technologies and understanding the course framework were a challenge
and burden that Michelle faced as a novice online instructor. She had to understand how the online course
was designed and become familiar with the technology and learning system used in the course to make her
instruction more efficient and effective. As presented below, this was a challenge to her in preparing for the
online class:
I really had to get familiar with what was this going to look like and if I were a student what
would I see? What would the WebBoard be like? What would the syllabus be like? How would the
course sections be broken down? What would the learning modules involve? How would the
learning cycles be conducted? So that was kind of a huge project, almost literally going through
week by week, section by section, module by module, learning cycle by learning cycle to print out
all those materials, which created a binder, so I had a good sense of feel for the class.
Consequently, the online class required me to invest additional hours in preparing it.
The instructor learned most of the prerequisite skills for the online course before the course began,
mainly through a tutorial for new online instructors and teaching assistants. However, the tutorial did not
provide all of the answers. She came to the office to ask questions of either the coordinator or the
development assistants quite often. The exact number of her visits cannot be provided, since I was not
initially involved in this study and did not count the visits at the time. In spite of these efforts, during the
first synchronous session she could not remember how to go to the team chat space. In the third week of the
course, however, she was able to lead the synchronous session without asking the development assistant
any technical questions. Real experience seemed to be much more valuable than practice in a tutorial.
Michelle’s face-to-face course met three hours each week. However, the online course met
synchronously for only one hour each week. This might imply that an online class should use unique
instructional approaches that are different from those in a face-to-face course. However, the instructor
seems to have managed the online class with similar instructional approaches to those in her face-to-face
class. In the one-hour online class, the instructor tried to cover the same content that she addressed for three
hours in her face-to-face class, which was challenging to her:
I find it somewhat frustrating because of the one-hour session that we usually are limited to. And
one issue has been desired to want to continue the dialogue in the main chat room and I often feel
quite awkward in having to bring to closure usually at 10 after 8 or 8:15, we have even gone as
late as 8:30. So, that can be a little bit challenging as well.
Finally, she indicated difficulties in measuring student outcomes in the online environment. The
course had two exams: a mid-term and a final. These exams were intended to evaluate how much of the
content of the course students had learned. In the online environment, however, it was not easy to
administer an exam since students can easily access course materials and textbooks at home. She pointed
out this difficulty:
Exams are going to have to be open-book, open-text, open-note in an online class because I simply
cannot monitor them, the students in their home environments, to ensure that they are not
referring to such materials, whereas in a face-to-face format I can very easily make it a closed text
exam or I can maybe make the assignments a little bit more intensive where they actually have to
begin to apply and think at a slightly deeper level, because they won’t have access to notes and
materials which would be the case in an online format. Moreover, I did not have the privilege to
change the evaluation method because I took over the course for one semester while the original
instructor was on sabbatical. As a result, I had to administer exams that were not appropriate for
an online environment.
The instructor also indicated that online teaching involved a heavy workload overall, which became
one of the most critical challenges in teaching the course:
I do think that the workload is heavier in an online course because, without face-to-face contact as
in a typical face-to-face lecture/discussion type of course, I found myself spending considerable
time trying to make connections with students by responding to all of their individual emails and
postings, even to those that went into the teaching assistant account which were forwarded to me.
The online class required me to have much more asynchronous individual contact with students
than those in the face-to-face class. Not seeing each other had students have much more questions.
The invisible aspect of an online environment seems to yield much more questions and curiosity.
The online students asked me even trivial and private questions via emails or bulletin board.
The teaching assistant also indicated that the heavier workload in the online environment seemed
to be a big burden to Michelle, as follows:
She had great excitement and enthusiasm after the first night but now I would say she is exhausted
with it…. She told me that she was glad the semester was coming to an end… was looking
forward to the burden of this class to be over, so that she could resume her writing.
Although online learning has many benefits, Michelle, a novice online instructor, seemed to
consider online teaching to be a big burden that requires a heavy workload as well as flexibility.
Reflection
This study delineates the first online teaching experience of an instructor at a large university in the
Midwestern United States. According to the findings, online teaching generated some challenges for the
instructor, and such challenges did indeed cause her to consider online teaching burdensome. The
challenges that the new online instructor faced can be summarized as follows.
First, the instructor experienced difficulties in communicating with, interacting with, and facilitating
students. Because of some miscommunication with the students, she experienced student complaints and
had to modify her course management. In addition, because she experienced student apathy during the
synchronous sessions, she had to create new teaching materials to facilitate students’ interaction and
participation. Her experiences illustrate what many scholars have contended: that interactive
communication and facilitation are critical factors in accomplishing successful online teaching (e.g.,
Moore, 1989; Moore & Kearsley, 2005; Williams, 2003). In particular, Williams identified thirty general
competencies for distance education programs in higher education institutions. Eight out of the thirty
general competencies and three of the first five ranked competencies are related to communication and
interaction. Therefore, an online instructor, particularly a novice instructor, should be aware of the
importance of communication and interaction with students and thus prepare for enhanced facilitation
during course planning. Pairing experienced online instructors with novice online instructors would be
an effective strategy for advising novices about what does and does not work for effective communication
and interaction (Palloff & Pratt, 2001). In addition, exposing novice online instructors to a variety of case
studies would be another strategy to help them to establish more effective communication and interaction.
A second challenge was becoming accustomed to the technology and to the unique necessity of
multi-tasking skills during synchronous sessions. As the study subject’s experience indicated, online
instructors need to have certain technology-related competencies such as basic technical knowledge,
technology access knowledge, software knowledge, and multimedia knowledge (Williams, 2003). Even
though Michelle thoroughly completed the tutorial developed for new instructors and teaching assistants,
she needed real practice (i.e., actual synchronous sessions) to get used to the technology and environment.
In this vein, it is also meaningful to establish a faculty development laboratory as a place to try out and
practice the technology, as Barker and Dickson (1993) suggest.
The instructor also expressed difficulties in organizing the one-hour synchronous sessions because
she felt they were too short to cover the content or to measure student outcomes. To overcome these
problems, the instructor needs to use different teaching strategies and to develop alternative assessments,
such as portfolios, projects, and problem-solving activities. When an instructor teaches a one-hour session
as well as a three-hour session on the same topic, it is certain that the instructor will use different strategies
to attain the same learning outcome. We need to pay attention to the argument that the role of instructors
should shift from a "sage on the stage" to a "guide on the side" in order to design and manage an online course
effectively (Moore & Kearsley, 2005; Palloff & Pratt, 2001).
Ultimately, the challenges that this new online instructor faced made her think that online teaching
involved a heavy workload, and these challenges and the consequent workload left her exhausted with
online teaching. As a result, the instructor seemed to have a more negative than positive impression of
online teaching. Apparently, online teaching was a burden to this new online instructor.
If the new online instructor had had training regarding the pedagogical issues of online teaching
and vicarious experiences through experienced online instructors, she could have been better prepared and
had a different impression about online teaching. This implies that training for online instructors should be
designed with more focus on the pedagogical issues of online teaching and on vicarious experiences with
the actual practices rather than on technical issues.
Many instructors think that teaching online is merely a change of environment and apply the same
methods from traditional classroom teaching to the online teaching environment, especially in the design,
development, and delivery of content. As this study shows, however, it is evident that instructor roles and
teaching strategies are different in online environments compared to the traditional classroom environment.
Online instructors should develop not only their technical skills, but also the appropriate teaching strategies
for an online environment, in order to minimize the challenges that they face. Online instructors cannot be
expected to know these strategies intuitively or automatically (Palloff & Pratt, 2001). Institutions offering
online courses and hiring inexperienced online instructors should provide them with appropriate training
and extensive support so that the instructors can better understand the new teaching environment and
design and deliver more effective online courses. This study is also an effort to help new online instructors
to understand the online environment through the vicarious online experience.
This study, however, relies mainly on one instructor's experience at a single university. New online
instructors may face more, or different, kinds of challenges. Further case studies with different
instructors and at different institutions will be able to provide more diverse and more practical information
about instructors teaching an online course for the first time.
References
Barker, B. O., & Dickson, M. W. (1993). Aspects of successful practice for working with college faculty in
distance learning programs. Ed Journal, 8(2), 1-6.
Moore, M. G. (1989). Three types of interaction. The American Journal of Distance Education, 3(1), 1-6.
Moore, M. G., & Kearsley, G. (2005). Distance education: A systems view (2nd ed.). Belmont, CA:
Wadsworth Publishing Company.
Palloff, R., & Pratt, K. (1999). Building learning communities in cyberspace. San Francisco: Jossey-Bass.
Palloff, R., & Pratt, K. (2001). Lessons learned from the cyberspace classroom. San Francisco: Jossey-
Bass.
Smith, T. C. (2005). Fifty-one competencies for online instruction. The Journal of Educators Online, 2(2).
Retrieved September 27, 2005, from http://www.thejeo.com/Ted%20Smith%20Final.pdf
Williams, P. E. (2003). Roles and competencies for distance education programs in higher education
institutions. The American Journal of Distance Education, 17(1), 45-57.
How can Technology Help Improve the Quality of Blackboard Faculty
Training and Encourage Faculty to Use Blackboard?
Doris Choy
Judy Xiao
John Iliff
College of Staten Island
Abstract
Since course management systems were introduced to colleges and universities many years ago, faculty
members who are highly motivated and interested in the systems have adopted them into their curriculum.
However, there are still many who have not utilized these tools in their courses. Some faculty members
feel that course management systems do not fit their courses, while others feel that there is not enough
training. What could be done to motivate those faculty members to give the systems a try? What kind of
training would serve them better? The purpose of this study is to ascertain how to encourage more faculty
members to use course management systems and how to improve the quality of faculty training. The results
from focus groups showed that small-group training, rather than large classroom-style sessions, combined
with anytime-anyplace online tutorial support, might be the solution.
Literature Review
Course management systems, such as Blackboard and WebCT, are defined as “software packages
designed to help educators create quality online courses.” These systems are increasingly popular among
colleges and universities in the United States. For students, Blackboard and WebCT allow interaction
with the instructor and other students at any time and from any place. Students are able to access course
information 24/7 as long as they have a computer and an Internet connection. Accordingly, students have shown
positive attitudes towards course management systems (Basile & D'Aquila, 2002). For faculty, the
systems help with many routine classroom management tasks, such as maintaining online grade books and sending
electronic mail and announcements to students. The systems also help to promote effective
communication between students and faculty members by providing both synchronous and asynchronous
interaction possibilities. The pre-developed frameworks within these systems save faculty members' time
and resources, since they do not need to develop their course web sites from scratch. With an increasingly
diverse student population comes a greater variety of learning styles and preferences, and adopting new
technology can better meet the learning needs of 21st century students.
Many colleges and universities enjoyed a rush of early adopters of course management systems by
faculty who were termed compassionate pioneers (Feist, 2003). However, this rush quickly turned to a trickle
as a majority of faculty resisted adoption. Bennett and Bennett (2003) claimed that 80% of public
4-year colleges make course management tools available to their faculty members, but only 20% of the
faculty in those institutions have adopted the systems. The reluctance is caused by a variety of factors. Faculty
members reported that they do not want to take on the systems because they lack the time; because training is a
one-shot session with no follow-up or ongoing support; and because training is not hands-on and offers no
opportunities to practice (Feist, 2003). Robinson (2004) also suggested that the main factor in low
faculty usage is the lack of effective individualized training. The gap between current course management
system availability (80%) and usage (20%) shows that there is a need to investigate how to improve the
faculty usage rate. What are the needs of the 80% of faculty members who are not utilizing the available
resources?
This study was conducted at an urban public university campus, where the percentage of usage is similar
to the results presented by Bennett and Bennett (2003). Blackboard has been the campus course
management system since the Fall 2001 semester. In Fall 2004, about 50% of faculty members had received
some form of formal training, but only about 15% of the available courses employed Blackboard. Faculty
members explained that they did not adopt Blackboard because immediate support was not available, because
they could not recall what they had learned in training, and because Blackboard features did not seem
applicable to their courses.
Methodology
The purpose of this study is to find out how to motivate faculty members to utilize Blackboard
in their courses. The faculty can be divided into two groups: current users and non-users. First, the study
investigates the needs of non-users and attempts to develop training that could motivate them to try
Blackboard. Second, it is important to find out what the needs of current users are and what could be done
to encourage their continued use of the system. Therefore, there are two main research questions in this
study:
1. What could be done to motivate the faculty members who are reluctant to use Blackboard in
their classes?
2. How can we provide better support to the current Blackboard users so they could explore
other features in Blackboard that they have not tapped into yet?
In order to collect information from faculty members who had previous experience using
Blackboard, e-mail invitations were sent to all faculty members who had used Blackboard in their
courses. The e-mail explained the purpose of the study and invited faculty members to participate in a focus
group to discuss their experiences in training for and using Blackboard. Among those who responded, faculty
members with various experiences and backgrounds were selected to join the focus group session. The
focus group session lasted for about two hours, with an additional hour of informal conversations between
the participants and the researchers individually. All the conversations were recorded and transcribed for
data analysis.
Based on the two main research questions, the following questions were derived as the guiding
questions for the focus group session:
a. From your experience, what are the strengths and weaknesses of Blackboard?
b. What did you do when you encountered problems in using Blackboard?
c. How did you learn Blackboard? What do you think is the most effective method?
d. What would you expect from Blackboard faculty training in the future?
e. What changes would you recommend to improve the overall quality of the faculty training
program?
f. What could be done to motivate more faculty members to use Blackboard?
In order to collect in-depth information from the faculty members, a focus group approach was adopted for
this study. Focus groups help to generate individual as well as small-group in-depth qualitative data that is
useful for understanding the needs of faculty members (Krueger & Casey, 2000). All recordings from the
sessions were transcribed and coded, and qualitative analysis was carried out to identify patterns (Strauss &
Corbin, 1998).
Participants
There were seven participants in the focus group. Faculty members came from different
departments on campus including: English (2 faculty), Education (1), Computer Science (1), Business (1)
and Nursing (2). The participants’ experience with Blackboard and their level of competency with the
system also varied. One of the English faculty members was a veteran Blackboard user who had helped to
choose Blackboard over other course management systems when it was first introduced to the university.
Two other faculty members were first-time Blackboard users, while the remaining participants' experience
ranged from two to five semesters. Among the group, there were four male professors and three female
professors. The ranks of the faculty members were: two tenured faculty, three assistant professors, and two
adjunct professors. Adjunct faculty members were also invited to the focus group because they teach about
30 percent of courses; they also have access to Blackboard and are invited to participate in training sessions.
However, few adjunct faculty members chose to participate in the training session at the beginning of the
semester, so the average usage of Blackboard by adjuncts is lower than that of full-time faculty. The two
adjunct faculty members were from different departments; one had used Blackboard for three semesters,
while the other was a first-time user. The participants in the focus group thus represented the different types
of faculty members found on a typical college campus, so a range of viewpoints could be heard, yielding
valuable results.
Results
The focus group believed that, in order to motivate those who are not yet using Blackboard,
training has to be developed and conducted differently from the existing classroom-setting approach of
large groups (20-25 faculty members per session). The changes can be divided into two main categories:
facilitation and support.
• Building an online community of Blackboard users. Faculty members can post their questions and
all users can discuss their experiences through online collaborations. Faculty members can also
provide suggestions to develop tutorials in topics of their interests. Furthermore, they can
showcase their Blackboard course website and share their ideas with colleagues.
Discussions
As the results of the focus group suggested, faculty members are highly interested in having better
support for Blackboard after the initial training. There are many valuable tools in Blackboard that would fit
different courses and settings; however, it is impossible to demonstrate all of the features in one training
session, and it is not feasible for faculty members to attend multiple training sessions. Therefore,
anytime-anyplace online tutorials became the main recommendation that emerged from the study.
Changes in technology have made the development of high-quality online training materials
significantly easier. With the help of software such as Macromedia Flash, Captivate, or Camtasia,
instructional designers can capture click-by-click mouse actions on the screen and transform them into an
animated tutorial. Audio materials, text captions, and narration can be incorporated. To enhance the
interactivity of the video, reflective questions can be added to prompt learners' responses. Moreover, by
using the navigation bar, learners can control the speed of the tutorial and learn at their own pace. The
completed animated tutorial can be compressed to a manageable file size and published to the Internet.
By using the software mentioned above, the researchers are looking into the possibility of
developing a series of online tutorials on many of the valuable features in Blackboard. The expected result
is to have an extensive collection of animated online tutorials that would show the faculty members the
step-by-step procedures for each feature. Faculty members will be able to access the online tutorials at
anytime, anyplace and search for specific tutorials that are applicable to their courses. They can also review
the animated tutorial at their own pace and practice the features in their courses.
Conclusions
In order to motivate faculty members who are not using Blackboard, there is a need to revise the
training method. With small-group facilitation and anytime-anyplace online tutorial support, faculty
members will be more likely to explore Blackboard. The small-group setting could be tailored to the needs
of specific faculty members, and relevant features could be demonstrated in the session. The setting would
also encourage participants to interact more frequently with the trainer and the other faculty members. The
online tutorials would be the next step that could be implemented to provide faculty with continued support.
Many instructional designers and trainers agree that conducting individualized training is not
feasible for a large number of faculty members. However, this focus group study suggests that
small-group facilitation, together with the support of online tutorials, could be effective in motivating
faculty to adopt Blackboard. This study is not trying to convince every faculty member to use Blackboard,
but it is important that multiple training formats be developed to engage faculty with various learning
styles, and that faculty be given the opportunity to see and explore the potential benefits of adopting this
technology into their curriculum.
References
Basile, A. & D’Aquila, J. M. (2002). An experimental analysis of computer-mediated instruction and
student attitudes in a principles of financial accounting course. Journal of Education for Business,
77(3), 137-43.
Bennett, J., & Bennett, L. (2003). A review of factors that influence the diffusion of innovation when
structuring a faculty training program. Internet and Higher Education, 6, 53-63. Retrieved July 1,
2004, from ScienceDirect (Elsevier Science).
Feist, L. (2003). Removing barriers to professional development. THE Journal, 11, 30-35. Retrieved May
20, 2004 from EBSCO Academic Search Premier.
Krueger, R. A., & Casey, M. A. (2000). Focus Groups: A Practical Guide for Applied Research. Sage
Publications.
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic Inquiry. Sage Publications.
Patton, M. Q. (1990). Qualitative Evaluation and Research Methods. Sage Publications.
Robinson, P. (2004). Individualized Training Brings Mainstream Faculty on Board. Distance Education
Report, 8(4), 1-6.
Strauss, A. L., & Corbin, J. M. (1998). Basics of Qualitative Research: Techniques and Procedures for
Developing Grounded Theory. Sage Publications.
Trends, Issues, and Guidelines for the Quarterly Review of Distance
Education (QRDE)
Daniel Eastmond
Western Governors University
Charles Schlosser
Michael Simonson
Nova Southeastern University
Introduction
The Quarterly Review of Distance Education (QRDE) is in its sixth year of publication. The
QRDE is an official publication of the Association for Educational Communications and Technology
(AECT). The Quarterly Review of Distance Education is a rigorously refereed journal publishing articles,
research briefs, reviews, and editorials dealing with the theories, research, and practices of distance
education. The QRDE publishes articles that utilize various methodologies that permit generalizable results
which help guide the practice of the field of distance education in the public and private sectors. The QRDE
publishes full-length manuscripts as well as research briefs, editorials, reviews of programs and scholarly
works, and columns.
The QRDE defines distance education as institutionally based, formal education, where the
learning group is separated and where interactive technologies are used to unite the learning group. This
definition of distance education is recognized by AECT.
An analysis of the first five volumes (five years) of the QRDE has been completed and is
summarized in this paper and presentation. The review of volumes 1 – 5 was conducted by QRDE editors.
First, each issue of each volume was analyzed. The results of this process are reported in Tables 1-5. Next,
the guidelines for submission of articles are presented and a general commentary on the first five years of
the QRDE is provided. Finally, the book review section of the QRDE is discussed and significant issues
and trends are provided.
Table 1 – Volume 1
Number of Articles (Issues 1-4; mean): 5, 4, 5, 5; ~5
Number of Pages/Article (Issues 1-4; mean): 10.8, 12.2, 8.2, 10.8; ~10.5
Keywords: Culture, Systems, Professional Development, Future, Interaction, Student Support, Evaluation,
Learning Environments, Technologies, Instructional Design, Motivation, Theory

Table 2 – Volume 2
Number of Articles (Issues 1-4; mean): 6, 5, 4, 6; ~5
Number of Pages/Article (Issues 1-4; mean): 10.8, 15.2, 11.5, 12; ~12.4
Keywords: Evaluation, Instructional Design, Student Support, Technologies, Theory, Learning Environments,
Instruction
Approach: Research = ~38%; Foundation/Theory = 57%; Evaluation = ~5%

Table 3 – Volume 3 – 2003
Number of Articles (Issues 1-4; mean): 9, 6, 6, 6; ~6
Number of Pages/Article (Issues 1-4; mean): 12.7, 14.1, 11.2, 11.5; ~12.4
Approach: Research = ~30%; Foundation/Theory = ~70%

Table 4 – Volume 4
Number of Articles (Issues 1-4; mean): 5, 5, 13, 9; ~8
Number of Pages/Article (Issues 1-4; mean): 11, 15.6, 11.3, 10.2; ~12
Approach: Research = ~31%; Foundation/Theory = ~31%; Evaluations = 18%; Training = 2%; Design = 18%

Table 5 – Volume 5
Number of Articles (Issues 1-4; mean): 6, 6, 5, 5; ~5.5
Number of Pages/Article (Issues 1-4; mean): 9.7, 10.7, 10.7, 12; ~10.8
Keywords: Design, Learning Objects, Training, SCORM, Student Participation, Mentors, Evaluation,
Learning Communities, Interaction, Confidence, Attitudes, Attrition/Completion, Blended Courses
Approach: Research = ~41%; Foundation/Theory = ~32%; Evaluation = ~9%; Training = ~4%; Design = ~14%
The QRDE Directions to Contributors are straightforward and entirely ordinary. In brief, they specify that:
• Four copies of the manuscript are to be submitted
• Manuscripts should be between 10 and 30 double-spaced pages
• APA style should be observed
• Names, affiliations, and other identifying information should be on a separate page
• A brief (approximately 100 words) abstract should be included
• Graphics should be in a separate file
• The printed documents should also be submitted on a CD or PC-formatted floppy using Microsoft Word.
A separate version, saved as an .rtf file, should also be included.
The Directions note that manuscripts are reviewed by at least three consulting editors, and the process usually takes
3 to 4 months. The Directions close with the name and address of the editor, Mike Simonson, to whom manuscripts
are sent.
What happens next is rather more interesting: submissions are acknowledged immediately, they are sent to
three reviewers within one week of receipt, and reviewers are directed to complete their review within one month.
So, six to eight weeks after submission, Mike has received all the reviews, which normally break down this way: 10
percent of the articles are accepted without changes, 20 percent are rejected, and 70 percent are returned to the
contributor for revision. Of those 70 percent, some are resubmitted within days, while some are resubmitted…well,
never. The eventual acceptance rate is approximately 50 percent. Accepted papers are assigned to an issue that will
be published in about nine months.
But, before the manuscript is ready for publication, there is much to be done by the editorial staff. After
Mike gathers the accepted manuscripts (usually five articles, two book reviews, and two columns), he puts all the
files on a CD and FedExes them to me. I edit all the manuscripts, ensuring that they are appropriately formatted and
that spelling, grammar, and sentence structure are acceptable, reading carefully for factual errors and staying alert to
less-than-elegant writing.
When all manuscripts have been edited, the issue is uploaded to the publisher’s server and downloaded by
the publisher’s editor, who invariably identifies errors and omissions. This issue is then typeset, saved as .pdf files,
and reposted to the server. At that point, QRDE’s editorial assistant, Jack Daugherty, sends the articles (as email
attachments) to their authors, with a request to carefully examine the document, address any shortcomings identified
by the editor, indicate any additional changes, and return them to me. Then, I amend each of the articles, Jack and I
give the issue one more read, and I again post it to the publisher’s server. The publisher’s editor makes all the
required corrections, and the issue is printed and mailed.
Sometimes, the whole process goes without a hitch, but minor glitches are common and fall into several
categories:
• Missing information. Amazingly often, manuscripts have incomplete reference lists. And, not
uncommonly, contributors fail to include complete contact information.
• Incorrect information. Again, it's amazing how often reference lists lack important information
such as city of publication, page numbers, and so on.
• Incorrect format. A fair number of our contributors seem to have not consulted the APA
publication manual when preparing their manuscript. This is most apparent in the citation of
sources and preparation of reference lists.
• Strange styles. Use of Word styles other than “normal” can slow the editing process considerably.
These styles must be undone, which tends to change the look of the document in unfortunate
ways, and the process can be aggravating.
An interesting and rewarding variation in the usual process of editing the Quarterly Review is the guest-
edited issue. The journal has averaged one of these special-theme issues each year, and guest editors Gary Anglin,
Les Moller, Atsusi Hirumi, Ryan Watkins, and Lya Visser have helped produce some of our strongest issues. The
guest editors identify the authors and the central theme to be addressed. They collect the manuscripts, conduct a
first edit of each, write an introduction to the issue, and send the package to me for further editing. Working with
those guest editors and all the other contributors is, for me, the most rewarding aspect of editing the Quarterly
Review.
Themes
Having read each of the reviews anew, what follows are the book review editor's observations of the types
of books the QRDE has addressed and what they tell us about trends and issues. First, an important category of book,
the survey textbook, proclaims that distance education is an established field of practice. These books are written
primarily to prepare student scholars and practitioners by outlining theory, research, definitions, and practices. To
support the instructional use of these texts, the authors have prepared additional resources such as websites, videos,
and PowerPoint presentations. An example of such a book review is Karen Murphy’s 2000 review of Simonson, et
al.’s Teaching and learning at a distance, which provides a foundational textbook for graduate study.
Another book category involves case studies, particularly from a single institution. In these books,
pioneers report their involvement in distance education activities. Such books have been written about Temple
University and the British Open University, or have been developed from conference proceedings. Book reviewers often
criticize these books as portraying a fragmented view of practice that is usually not generalizable to other contexts.
One important exception was portrayed in Mark Hawkes's 2005 review of the edited book by Duffy and Kirkley,
Learner-centered theory and practice in distance education, where the authors intentionally brought together selected
professionals through grant support to educate them about problem-based learning; afterwards the participants
applied these principles to initiatives at institutions across the country and reported back the results.
A common book genre deals with roles – that of student, instructor, trainer, manager, designer,
administrator, and technologist. These texts seek to inform practitioners with best practices, frameworks, guides,
and resources. One such example is Byron Burnham's 2001 review of Managing technological change by Bates. A
similar type of book the Journal has reviewed falls into the “how to” category. These books cover curriculum
design, e-moderating, e-learning design, learning community formation, and teaching via various distance delivery
systems. Kim Dooley’s 2000 review of Cyrs’s (2000) Teaching at a distance with merging technologies presents an
example; the book aims to prepare instructors to teach via interactive television.
Other books deal with specialized areas of distance education, such as those about online writing centers,
blended learning, online nursing programs, libraries, copyright, assessment, and online science labs. An example
was Phil Schmidt's review of Linn et al.'s edited volume Internet environments for science education. Similarly, there are
books written about distance education in specific contexts: high schools, higher education, corporate training,
consortia, and international settings. An example is Ross Perkins's 2004 review of Perraton's Open and distance
learning in the developing world. This book presents the quiet host of distance initiatives worldwide as innovative
responses to educational challenges at all levels.
Finally, the Journal reviews books that are not directly written about distance education but have strong
implications for practitioners. An example of such a review is Charlotte Farr's 2002 critique of Brown and Duguid's
The Social Life of Information – a book which posits that information technology is only useful in social contexts.
Such books explore communications, societal matters, global affairs, organizational development – usually taking a
technology focus, such as the impact of the Internet on education.
Observations
I couldn't help but feel after reading these reviews that distance educators are truly renaissance people. To
effectively practice their craft and keep up with the field, they must read broadly, as the range of book reviews
depicts. Perhaps trendy, but nonetheless a reflection of an explosion in the field, the delivery of distance education
via the Internet in U.S. higher education and corporate training contexts predominates in this literature. Missing are texts about
other delivery systems, reports from other countries, and books about other contexts (e.g., military, medicine,
government, etc.).
References
Burnham, B. (2001). [Review of the book Managing technological change]. Quarterly Review of Distance
Education, 2(2), 175-178.
Dooley, K. (2000). [Review of the book Teaching at a distance with merging technologies: An instructional systems
approach]. Quarterly Review of Distance Education, 1(2), 169-172.
Farr, C. (2002). [Review of the book The social life of information]. Quarterly Review of Distance Education, 3(4),
465-468.
Hawkes, M. (2005). [Review of the book Learner-centered theory and practice in distance education: Cases from
higher education]. Quarterly Review of Distance Education, 6(3), 275-279.
Murphy, K. (2000). [Review of the book Teaching and learning at a distance: Foundations of distance education].
Quarterly Review of Distance Education, 1(2), 173-176.
Perkins, R. (2004). [Review of the book Open and distance learning in the developing world]. Quarterly Review of
Distance Education, 5(3), 215-218.
Schmidt, P. (in press). [Review of the book Internet environments for science education]. Quarterly Review of
Distance Education.
Exemplary Technology Use: Teachers' Perceptions of Critical Factors
Peggy A. Ertmer
Anne Ottenbreit-Leftwich
Cindy S. York
Purdue University
Exemplary technology-using teachers are defined as those who employ technology in learner-centered,
constructivist environments as opposed to traditional teacher-directed environments (Ertmer, Gopalakrishnan, &
Ross, 2001; Newby, Stepich, Lehman, & Russell, 2000). In general, a constructivist learning environment engages
students in authentic, collaborative tasks, based on their interests. Within this type of environment, technology is
used as a tool to support learners’ engagement with the content, ultimately prompting them to use higher-level
thinking skills (Becker, 1994; Ertmer et al., 2001). According to Berg, Benz, Lasley, and Raisch (1998), this is due,
in part, to technology’s ability to provide students with the tools “to actively process new information, to transform
it, and to ‘make it their own’” (p. 120).
that develop through years of experience.
While researchers have delineated a number of important characteristics of exemplary technology-using
teachers, it is unclear whether any of these characteristics are essential for teachers to become exemplary. For
example, while 75% of the exemplary users in the Hadley and Sheingold (1993) study had extensive teaching
experience (more than 13 years), only 59% of the participants in the Ertmer et al. study (2001) had this many years.
Additionally, while 50% to 75% of the participants in Becker’s study (1994) had accumulated a large number of
credits beyond the bachelor's degree, only 35% of the participants in the Ertmer et al. (2001) study had reached this
level of education. This suggests that either these “requirements” have gradually evolved as technology has become
more embedded in our lives, or that these types of characteristics are not essential to exemplary technology use. It is
important to determine which enablers, if any, have the potential to exert the strongest influence over teachers’
abilities to use technology in exemplary ways so that pre- and in-service teacher educators can support the most
fruitful paths to accomplished use.
Methods
An online anonymous survey was used to explore the perceptions of exemplary technology-using teachers
regarding the factors that influenced their technology integration success. Participants were selected from five
Midwestern technology educator award programs. The award winners were emailed an invitation to participate in the
study, including a link to an online survey that was available via a secure server. Both quantitative (correlations, t-tests)
and qualitative (pattern seeking) analysis methods were used to examine teachers’ perceptions of the factors that
influenced their technology integration success.
Procedures
The study was designed and implemented by a research team consisting of two doctoral students and one
faculty member from the Educational Technology program at a large Midwestern university. All three researchers
had a background in K-12 education and had taught courses related to technology integration for pre-service
teachers. In addition, one of the doctoral students was a previous recipient of an exemplary technology teacher
award.
The researchers collected email addresses from five award program websites and established a database of
possible participants. The sample consisted of recipients of exemplary technology-using teacher awards from five
different organizations within the Midwest, selected due to the researchers’ familiarity with the programs and
organizations. These organizations included the Michigan Consortium for Outstanding Achievements in Teaching with
Technology (MCOATT), Michigan Association for Computer Users in Learning (MACUL), Ohio SchoolNet (OSN),
Illinois Computer Educators (ICE), and Indiana Computer Educators (ICE). In general, participants were nominated for
the award based on criteria related to their ability to use technology in innovative ways and to encourage meaningful
student use. From the initial sample of 48 educators, 25 responded to the survey for a 52% return rate. Identified
participants were emailed twice, once for the initial invitation and once as a reminder. Our final sample included
teachers who ranged in years of teaching experience from 3 to 32 years, with an average of 16 years. The majority of
educators were female (n=16) and had completed their master's degrees (n=20). About half of the participants (n=12) had
been teaching 13 years or less, and all participants rated themselves as having very high (n=16) or high (n=9) computer
skills.
Survey Instrument
The 18-item survey included six demographic questions, two Likert-scale items (consisting of 20
subcomponents), eight open-ended items, and one checklist item (consisting of nine subcomponents). For example,
participants were asked to “describe your most memorable or most useful professional development experience,” and
“If you could put your finger on one thing that influenced you the most in terms of integrating technology in your
classroom, what would that one thing be?” In addition, participants rated their perceptions of the importance of both
intrinsic (e.g., inner drive, beliefs, attitudes) and extrinsic (e.g., professional development, resources, and support)
factors on a 5-point Likert scale (from 1, not applicable; 2, not influential; to 5, extremely influential).
The survey was developed after reviewing similar surveys in the literature (Bullock, 2004; Hadley &
Sheingold, 1993; Iding, Crosby, & Speitel, 2002; Lumpe & Chambers, 2001) as a means of establishing construct
validity. Expert reviewers, including an Educational Technology faculty member and an elementary school principal,
provided suggestions for improvement. The final survey instrument incorporated these changes, including wording and
specific details to assure that the items were relevant to exemplary technology-using teachers, thus assuring some
measure of face validity. The survey had a Cronbach's alpha of 0.76, suggesting that the survey was moderately reliable.
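For readers who want to check such a reliability estimate against their own survey data, the short Python sketch below computes Cronbach's alpha from a respondents-by-items matrix of Likert ratings. It is only an illustration of the standard formula, alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores); the function name and the ratings are hypothetical and are not drawn from this study.

    import numpy as np

    def cronbach_alpha(scores):
        # scores: rows = respondents, columns = survey items (Likert ratings)
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                           # number of items
        item_var = scores.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = scores.sum(axis=1).var(ddof=1)    # variance of respondents' total scores
        return (k / (k - 1)) * (1 - item_var / total_var)

    # Hypothetical ratings from six respondents on four 5-point items
    ratings = [[4, 5, 4, 5],
               [3, 4, 4, 3],
               [5, 5, 4, 4],
               [2, 3, 3, 2],
               [4, 4, 5, 4],
               [3, 3, 4, 3]]
    print(round(cronbach_alpha(ratings), 2))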
Data Analysis
In order to answer our first research question regarding teachers’ perceptions of the factors that most
influenced their technology integration success, we calculated means and standard deviations for each of the factors
included on the survey and then rank ordered them from highest to lowest.
To determine whether intrinsic or extrinsic factors were perceived as playing a more influential role, a
paired samples t-test was used to compare teachers’ perceptions of the importance of extrinsic factors (e.g.,
professional development; influential people; administrative, parental, peer, and technology support; Internet,
hardware, and software access) versus intrinsic factors (e.g., inner drive, personal beliefs, commitment, confidence,
and previous success with technology). Triangulation data were provided through teachers’ responses to the survey
question: “If you could put your finger on one thing that influenced you the most in terms of integrating technology in
your classroom, what would that one thing be?”
Pearson product correlations were calculated to determine the relationships among different teacher
characteristics (e.g., gender, highest degree earned, years of teaching experience, and current level of teaching
assignment) and their perceptions of the importance of intrinsic vs. extrinsic enablers. In addition, an independent t-test
was used to examine whether teachers, with more or less years of teaching experience, differed significantly in their
perceptions of the importance of intrinsic and extrinsic enablers.
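The three procedures named above (a paired-samples t-test, Pearson product-moment correlations, and an independent-samples t-test comparing teachers with more and less than 13 years of experience) can all be run with standard statistical software. The following Python/SciPy sketch is offered only as an illustration of the analysis steps; the arrays hold invented ratings, not the study's data.

    import numpy as np
    from scipy import stats

    # Hypothetical per-teacher mean ratings (5-point scale) and years of experience
    intrinsic = np.array([4.6, 4.8, 4.2, 4.9, 4.4, 4.7, 4.3, 4.5])
    extrinsic = np.array([3.9, 4.1, 3.5, 4.2, 3.6, 4.0, 3.8, 3.7])
    years     = np.array([20, 8, 15, 30, 5, 12, 25, 18])

    # Paired-samples t-test: intrinsic vs. extrinsic ratings from the same teachers
    t_paired, p_paired = stats.ttest_rel(intrinsic, extrinsic)

    # Pearson correlation: years of teaching experience vs. intrinsic ratings
    r, p_r = stats.pearsonr(years, intrinsic)

    # Independent-samples t-test: more experienced (>13 years) vs. less experienced teachers
    t_ind, p_ind = stats.ttest_ind(intrinsic[years > 13], intrinsic[years <= 13])

    print(f"paired t = {t_paired:.2f} (p = {p_paired:.3f})")
    print(f"Pearson r = {r:.2f} (p = {p_r:.3f})")
    print(f"independent t = {t_ind:.2f} (p = {p_ind:.3f})")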
Results
When teachers were asked to rate the level of influence of each enabler on their success as exemplary
technology-using teachers, inner drive and personal beliefs (M=4.84) were rated the most influential, while pre-service
education was rated the least influential (M=2.69). Teachers were given the option of responding “not applicable” if a
specific factor did not apply to them. Those data were then removed from our calculations, in effect reducing the number
of respondents for that particular factor. For example, note that pre-service education was rated as ‘not applicable’ by
nine participants (see Table 1). This may be due to the fact that many of our participants completed their teacher
education programs prior to the integration of technology into the college classroom.
A paired-samples t-test was conducted to determine if the difference between teachers’ ratings of the
influence of intrinsic and extrinsic factors was significant. The mean rating for intrinsic factors (M=4.51, SD=0.31)
was significantly higher [t(24) = 7.23, p < .001] than the mean rating for extrinsic factors (M=3.86, SD=0.51),
suggesting that teachers perceived intrinsic factors to be relatively more influential than extrinsic factors in their
ability to become successful technology-using teachers. This is supported by teachers’ responses to the open-ended
survey items. When asked what most influenced their use of technology, the majority of teachers described how they
were committed to using technology because it increased their ability to enhance student learning. One teacher
wrote, “Seeing my students succeed when using it. The more success they had, the more I wanted to use it.”
Another teacher indicated that the most influential factor in using technology was, “the desire to engage students as
active learners and the belief that technology is the tool to achieve that desire.”
Pearson product correlation coefficients indicated no significant relationships between 1) teachers’ levels of
computer proficiency or 2) the number of credit hours earned after a bachelor’s degree and the perceived importance
of specific internal or external factors. However, years of teaching experience was significantly correlated, at the .05
level, with teachers’ perceptions of the importance of professional development (r=.43), commitment to using
technology (r=.47), and the influence of previous success (r=.41). In other words, the longer teachers had been
teaching, the more important these enablers were perceived to be (see Table 2). In addition, females tended to rate
personal beliefs as significantly more influential than did males (r=.59). In addition, females rated technology
support (r=.49) and access to hardware (r=.40) as more influential to their success than males did.
An independent-samples t-test indicated that teachers with more experience (years > 13) rated intrinsic
factors as being significantly more influential (p = .016) than did teachers with less experience (years ≤ 13).
Experienced teachers (n=13) rated intrinsic factors as “extremely” influential (M=4.65), while less experienced
teachers (n=12) rated them as “moderately” influential (M=4.36). While teachers with more experience also rated
extrinsic factors (M=4.05) as more influential than did teachers with less experience (M=3.67), the difference was
not significant (p=.059). In general, teachers with more experience rated more factors as being moderately or extremely
influential. For example, every teacher in the more experienced category rated “commitment to using computers to
enhance student learning” as being extremely influential (M=5), while teachers with less experience rated it as
moderately influential (M=4.5).
Discussion
The results from this study suggest that the factors that most strongly affect teachers’ ability to be effective
technology users are intrinsic factors such as confidence and commitment, as opposed to extrinsic factors such as
resources and time. That is, even when resources and time are limited, exemplary teachers achieve effective use, quite
possibly due to their strong beliefs, personal visions, and commitment to using technology. As noted by Zhao and Frank
(2003), “… most factors do not directly influence technology uses in a linear fashion; rather, their influence is mediated
or filtered by teachers’ perceptions” (p. 817). This is also similar to what Becker (1994) and Hadley and Sheingold
(1993) reported: that is, the exemplary teachers in their studies described problems with resources as being less
severe than did other teachers. Perhaps because of their confidence, or previous successes with technology,
exemplary technology-using teachers are able to devise more ways to overcome obstacles. Based on previous
literature (Ertmer, 1999; Ertmer et al., 1999; Marcinkiewicz, 1993; Sheingold & Hadley, 1990), as well as the results of
this study, intrinsic belief systems appear to be a strong, if not the primary, contributing factor in teachers’ efforts to use
technology. This suggests the importance of providing teachers with opportunities to share their stories of successful
technology integration with their peers and to reflect on their own beliefs within a supportive and collaborative
environment (Sandholtz, Ringstaff, & Dwyer, 1997; Zhao & Frank, 2003). In this way it is anticipated that all teachers,
including those who are faced with a limited amount of resources, can find ways to use their resources to improve
student learning based on their strong personal commitments and pedagogical beliefs about the power of technology to
enhance learning.
In general, teachers in this study rated intrinsic factors as significantly more influential than extrinsic factors on
their decisions to use technology. This was supported by teachers’ responses to the open-ended survey question in which
they described the most influential factor as being their strong commitment to helping students learn. This result is
similar to that described by Ertmer et al. (1999; 2001) and others (Dexter, Anderson, & Becker, 1999; Sheingold &
Hadley, 1990). While novice technology users may base initial adoption decisions on their own goals and needs, as noted
by Zhao and Frank (2003), more accomplished users appear to focus more on their students’ needs, especially when
making classroom implementation decisions (DuFour, 2000). The results of this study suggest that, as teachers progress
from novice to accomplished users, it may be beneficial to provide opportunities for them to observe the direct impacts
on student learning obtained by more accomplished users.
Teachers with more experience tended to rate more factors as being highly influential than teachers with less
experience. In addition, results from this study suggest that the longer one has been teaching, the more important
professional development, commitment to improving student learning, and previous successes are perceived to be to
one’s current technology success. These results may be explained by the fact that teachers who entered the teaching
profession prior to the integration of technology into teacher education programs are more likely to be self-taught
(Hadley & Sheingold, 1993). It is likely that the majority of these teachers learned their skills through their own
initiative and on their own time, attending professional development workshops to learn new skills and slowly
gaining confidence as they gradually achieved more success. This finding is supported by the rating given by more
experienced teachers to the factor of personal commitment (M=5), which may be due to the time and effort they had
previously invested to effectively integrate technology, as well as their ongoing commitment to remain current with
technological advances. This finding is further supported by the lack of perceived influence that pre-service
education had on exemplary use (M=2.69).
One of the largest differences between these groups of teachers was the relatively higher influence that
technology support was perceived to have by teachers with more experience (M=4.0) compared to teachers with less experience (M=3.1). Similar
to the results described earlier, this could be attributed to the fact that teachers who had been teaching longer
required more support due to having had less formal training with technology. Teachers with fewer years of
teaching experience may not have needed as much technology support.
Finally, all the teachers in this study rated professional development as one of the more influential extrinsic
factors (M=4.44). As noted earlier, for teachers who entered the teaching profession prior to the introduction of
technology into pre-service teacher education, professional development may provide the most accessible and affordable
means to develop these skills. Even for newer teachers who had technology training in their teacher education programs,
professional development enables them to continue to update and refine their skills. Furthermore, after having gained a
better handle on classroom management and curricular needs, newer teachers are also in a better position to learn how to
apply these skills during professional development programs.
Limitations
Results of this study are limited by the small sample size and the use of five different technology-award
programs to identify our participants. While award criteria were similar, this may have biased our sample by eliminating
additional potential participants. Future research should draw from a larger sample from the national population, in order
to increase the generalizability of the results. In addition, survey results would be better understood if follow-up
interviews or observations had been conducted. Finally, the instrument was not as reliable as the research team would
have liked it to be; a more reliable instrument would enhance the validity of the study.
Conclusion / Implications
The overarching purpose of this study was to identify effective methods for preparing teachers to use
technology in an exemplary manner. When teachers were asked to indicate the professional growth opportunities in
which they preferred to participate, they responded most frequently with “workshops/seminars” (n=20; 80%) and
“conferences” (n=19; 76%), while “group training with a technology coordinator/aide” was rated the lowest (n=6; 24%).
In order to increase the number of exemplary technology-using teachers in our schools, we might consider encouraging
interested teachers to collaborate with other exemplary technology-using teachers. Teachers in this study recommended
starting with a simple idea and expanding on that idea through a brainstorming session. Many teachers, especially those
who participated in this study, have already developed innovative and effective ideas for integrating technology in the
classroom. Through interaction with these exemplary models, whether through conferences, workshops, or online
mentoring opportunities, new technology-using teachers can find the kind of collegial support they desire and need.
This study highlighted teachers’ strong beliefs that exemplary technology use is founded on their own internal
beliefs and commitment, but is also supported by important extrinsic factors (professional development, technology
support) that enable them to translate that vision into practice. Educators and teacher trainers need to be aware of the
important influence that teachers’ beliefs and personal commitment have on teachers’ practice and incorporate strategies
into their professional development programs that address these beliefs and increase teachers’ commitment. Asking
teachers to share their stories and to reflect on their technology integration experiences is one potential method for
highlighting the possibilities of technology, while positively shaping their personal beliefs about those benefits.
References
Becker, H. J. (1994). How exemplary computer-using teachers differ from other teachers: Implications for realizing
the potential of computers in schools. Journal of Research on Computing in Education, 26, 291-321.
Becker, H. J. (2000). Findings from the teaching, learning, and computing survey:
Is Larry Cuban right? Education Policy Analysis Archives, 8(51). [Electronic Version]. Available:
http://epaa.asu.edu/epaa/v8n51/
Berg, S., Benz, C. R., Lasley II, T. J., & Raisch, C. D. (1998). Exemplary technology use in elementary classrooms.
Journal of Research on Computing in Education, 31, 111-22.
Bullock, D. (2004). Moving from theory to practice: An examination of the factors that preservice teachers
encounter as they attempt to gain experience teaching with technology during field placement experience.
Journal of Technology and Teacher Education, 12, 211-37.
Dexter, S. L., Anderson, R. E., & Becker, H. J. (1999). Teachers' views of computers as catalysts for changes in
their teaching practice. Journal of Research on Computing in Education, 31, 221-38.
DuFour, R. (2000). Change that counts. Journal of Staff Development, 21(4), 72-73.
Ertmer, P. (1999). Addressing first- and second-order barriers to change: Strategies for technology implementation.
Educational Technology Research and Development, 47(4), 47–61.
Ertmer, P., Addison, P., Lane, M., Ross, E., & Woods, D. (1999). Examining teachers' beliefs about the role of
technology in the elementary classroom. Journal of Research on Computing in Education, 32(1), 54-72.
Ertmer, P. A., Gopalakrishnan, S., & Ross, E. M. (2001). Technology-using teachers: Comparing perceptions of
exemplary technology use to best practice. Journal of Research on Technology in Education, 33.
[Electronic Version]. Available: http://www.iste.org/jrte/33/5/ertmer.html
Guha, S. (2003). Are we all technically prepared? Teachers' perspective on the causes of comfort or discomfort in
using computers at elementary grade teaching. Information Technology in Childhood Education, 317-349.
Hadley, M., & Sheingold, K. (1993). Commonalties and distinctive patterns in teachers’ integration of computers.
American Journal of Education, 101(3), 261–315.
Iding, M. K., Crosby, M. E., & Speitel, T. (2002). Teachers and technology: Beliefs and practices. International
Journal of Instructional Media, 29(2), 153-170.
Lumpe, A. T., & Chambers, E. (2001). Assessing teachers' context beliefs about technology use. Journal of
Research on Technology in Education, 34(1), 93-107.
Marcinkiewicz, H. R. (1993). Computers and teachers: Factors influencing computer use in the classroom. Journal
of Research on Computing in Education, 26, 220-37.
Newby, T. J., Stepich, D. A., Lehman, J. D., & Russell, J. D. (2000). Instructional technology for teaching and
learning. Upper Saddle River, NJ: Merrill.
Russell, M., Bebell, D., O’Dwyer, L., & O’Connor, K. (2003). Teachers’ beliefs about and use of technology:
Enhancing the use of technology for new and veteran teachers. Boston, MA: Boston College, Technology
and Assessment Study Collaborative. [Electronic Version]. Available:
http://www.intasc.org/PDF/useit_r5.pdf
Sandholtz, J. H., Ringstaff, C., & Dwyer, D. C. (1990). Teaching in high tech environments: Classroom
management revisited, first-fourth year findings. Apple Classrooms of Tomorrow Research Report Number
10. [Electronic Version]. Available:
http://www.research.apple.com/Research/proj/acot/full/acotRpt10full.html
Sheingold, K., & Hadley, M. (1990). Accomplished teachers: Integrating computers into classroom practice. New
York: Center for Children and Technology, Bank Street College.
Wang, L., Ertmer, P., & Newby, T. (2004). Increasing Preservice Teachers’ Self-Efficacy Beliefs for Technology
Integration. Journal of Research on Technology in Education, 36(3), 231-44.
Zhao, Y., & Frank, K. A. (2003). An ecological analysis of factors affecting technology use in schools. American
Educational Research Journal, 40(4), 807-840.
Impact and Perceived Value of Peer Feedback in Online Learning
Environments
Peggy A. Ertmer
Jennifer C. Richardson
Brian Belland
Denise Camin
Patrick Connolly
Glen Coulthard
Kimfong (Jason) Lei
Christopher Mong
Purdue University
Feedback has been demonstrated to play an important role in instruction (Mory, 2004; Topping, 1998), with
many learning theorists positing that feedback is essential to students’ learning (Driscoll, 2000). Current views hold
that the purpose of instructional feedback is to provide students with information they can use to confirm what they
already know or to change their existing knowledge and beliefs (Mory). Higgins, Hartley, and Skelton (2002) noted
that feedback that is meaningful, of high quality, and timely helps students become actively and cognitively engaged
in the content under study, as well as in the learning environment in which they are studying.
Compared to a traditional classroom, feedback may play an even more important role in an online
environment (Lynch, 2002; Palloff & Pratt, 2001). That is, students in an online course are more likely to disconnect
from the material or environment due to a lack of feedback than students attending a lecture-formatted course.
Instructor feedback is often cited as the catalyst for student learning in online environments, while lack of feedback
is most often cited as the reason for withdrawing from online courses (Ko & Rosen, 2001; Lynch; Palloff & Pratt).
Because of the importance of feedback in online environments, a number of recommendations have been
made for increasing its effectiveness. Notar, Wilson, and Ross (2002) specifically called for feedback that was
“diagnostic and prescriptive, formative and iterative, and involving both peers and group assessment” (p. 646).
According to these authors, feedback should focus on improving the skills needed for the construction of end
products, more than on the end products themselves. While students agree that feedback needs to contain a
formative aspect, they also desire summative comments. As Schwartz and White (cited in Mory, 2004) reported,
students expect feedback in an online environment to be: 1) prompt, timely, and thorough; 2) ongoing, both formative
(about online discussions) and summative (about grades); 3) constructive, supportive, and substantive; 4) specific,
objective, and individual; and 5) consistent.
Research has shown that the quality of student discussion responses can be increased through the use of
constructive feedback that is prompt, consistent, and ongoing (Ertmer & Stepich, 2004). Discussions without
guidance or feedback can be ineffective and inefficient, yet significant instructor time is required to provide
meaningful feedback on students' individual postings. Debowski (2002) noted that while online instructors often
provide learners with relevant and helpful examples of the content being studied, they are much less likely to offer
relevant and useful feedback.
One possible solution is for students to provide feedback to each other. As noted by Maor (2003), feedback
"can no longer be considered the sole responsibility of the instructor because there is a much larger focus on
dialogue…[and] the joint construction of knowledge" (p. 128). Depending upon how peer feedback is structured,
instructors could be spared from evaluating large numbers of student postings, yet still provide as many instances of
formative and summative feedback as they deem necessary. Students, on the other hand, would still receive the
feedback they require in order to assess their progress in the online environment. While “peer feedback might not be
of the high quality expected from a professional staff member, its greater immediacy, frequency, and volume
compensate for this” (Topping, 1998, p. 255).
In addition to the benefits of receiving adequate feedback, students may also benefit from giving peer
feedback. Liu, Lin, Chiu, and Yuan (2001) proposed that, when asked to offer feedback to peers, students progress
beyond the cognitive processes required for completing a given task, as they must now “read, compare, or question
ideas, suggest modifications, or even reflect on how well one’s own work is compared with others” (p. 248).
McConnell (2002) also suggested that collaborative assessment moves students away from dependence on
instructors as the only, or major, source of judgment about the quality of learning to a “more autonomous and
independent situation where each individual develops the experience, know-how, and skills to assess their own
learning” (p. 89). Thus, students are offered the opportunity not only to reflect on the work of their peers, but also
on their own work.
Although peer feedback can add value to the instructional process, it is not without its challenges. These
challenges relate to a wide range of issues, including implementation, students’ anxiety over giving and receiving
feedback (especially negative feedback), and reliability, to name a few. According to Palloff and Pratt (1999), “The
ability to give meaningful feedback, which helps others think about the work they have produced, is not a naturally
acquired skill” (p. 123). In terms of implementation, Topping (1998) noted that “both assessors and assessees might
experience initial anxiety about the process” (p. 256), but suggested that this may be mitigated by asking students to
provide positive feedback before providing any negative feedback. Topping also suggested that learners may
perceive peer feedback to be invalid, thus causing low-performing students to refuse to accept negative feedback as
accurate. These concerns over accuracy and validity may, in fact, be justified, based on the tendency for students to
either inflate or deflate scores (Topping, 1998).
It is unclear whether challenges related to giving and receiving peer feedback in a traditional environment
will be exacerbated or mitigated when applied within the online environment. Tunison and Noonan (2001) reported
that many students found it difficult to communicate complex ideas in an online environment, and that their ability
to express their questions clearly and comprehend detailed explanations was limited by the lack of face-to-face
interaction. Arbaugh (2000) reported that while student participation in online course discussions tends to be more
equal and at a higher level than in traditional settings, this interaction may not be as effective as face-to-face
interaction—at least not until participants achieve a level of comfort with each other. If peer feedback is to be
beneficial to all members of the learning community, these are issues that must be addressed (Preece, 2001).
According to Mory (2004), “Although there has been progress in determining ways in which feedback can
best be used under certain conditions, there are still many areas in which the feedback literature is not consistent and
yet other areas that have been left unexplored” (p. 771). For example, little work has been done that examines the
role or impact of feedback in online learning environments in which learners construct their own knowledge, based
on prior experiences and peer interactions. The purpose of this study was to examine the perceived value and impact
of peer feedback on students’ postings in an online learning environment. Specifically, the research questions
included:
1. What is the impact of peer feedback on the quality of students’ postings in an online environment? Can
quality be maintained and/or increased through the use of peer feedback?
2. How do students’ perceptions of the value of receiving peer feedback compare to the perceived value
of receiving instructor feedback?
3. What are students’ perceptions of the value of giving peer feedback?
4. What aspects of the peer feedback process do students perceive as being particularly useful or
challenging?
Methods
To determine the viability of either supplanting or supplementing formative instructor assessments with
peer feedback in an online environment, we examined the use of peer feedback during a semester-long, graduate-
level online course in the College of Education at a large Midwestern university. Using a mixed-methods approach,
data were collected through participant interviews, scored ratings of students’ weekly discussion postings, and
responses to both entry and exit survey questionnaires. Changes in scored postings were used to answer our research
question regarding the impact of peer feedback on the quality of students’ postings. Survey results captured
students’ overall perceptions of giving and receiving feedback, while interviews provided insights into individual
perceptions and personal experiences with the feedback process, in general, and the peer feedback process,
specifically.
Role of researchers
The researchers in this study included two faculty members and seven graduate students (one female/six
male) in the educational technology program in the College of Education. All had experience in online learning
environments, and all were familiar with the scoring rubric (based on Bloom’s taxonomy) used by the participants in
this study.
Participants
The participants in the study were 15 graduate students (10 female, 5 male) enrolled in an online
technology integration course during the spring semester of 2005. Eight of the participants were administrators, such
as technology directors or principals, and three additional students were former or current teachers. Of those
pursuing a graduate degree, five were masters and nine were doctoral students. The human subjects review board
deemed this study exempt under university guidelines.
Data collection
Researchers’ ratings of discussion postings, pre- and post-surveys, and student interviews comprised the
primary data sources. Course documents (e.g., syllabus, assignment descriptions), and students’ peer ratings of
discussion postings constituted secondary data sources.
Discussion postings. In order to assure consistency of scoring of students’ online postings, the research
team scored all discussion postings, using the same rubric students had used. While these were not the scores that
students received during the course, they provided a better indication of the changing quality of their responses. That
is, because students’ postings were rated by many different peers (each with their own interpretations of how to
apply the rubric), it was important, for research purposes, to use a more consistent measure of quality. Furthermore,
the students were not required to score each posting that a peer had made to a discussion question (DQ) but rather only the two required
postings, thus making the data set incomplete.
Two researchers rated all of the student postings. In order to assure that the scoring was not influenced by
the timing of the posts (with later posts automatically receiving higher scores), all evidence of DQ numbers,
posting dates, and times was removed from these documents. To assure consistency in scoring, the two raters scored
a complete set of postings (n = 59) from a single randomly selected discussion question. Working from separate
printed copies, the raters scored the first ten postings independently and then verbally discussed their scores. After
securing agreement on the first ten postings, the raters independently scored the next ten postings. Upon completion,
the raters compared their results, tallied the number of disputed scores, and then discussed their differences. The
raters proceeded with this process until all 59 postings were completed. The final results showed 86.44% agreement
between the two raters. Following this, the two researchers divided and independently rated the remaining sixteen
discussion questions, containing anywhere from 38 to 81 postings each.
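As a rough illustration of the agreement check described above, the following Python sketch computes simple percent agreement between two raters; the function and the rubric scores shown are hypothetical and are not drawn from the study's data.

def percent_agreement(rater_a, rater_b):
    """Return the proportion of postings on which two raters gave the same rubric score."""
    assert len(rater_a) == len(rater_b), "Both raters must score the same set of postings"
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Hypothetical 0-2 rubric scores for one batch of ten postings
rater_a = [2, 1, 1, 0, 2, 1, 1, 2, 0, 1]
rater_b = [2, 1, 2, 0, 2, 1, 1, 2, 0, 1]
print(f"Agreement: {percent_agreement(rater_a, rater_b):.2%}")  # prints "Agreement: 90.00%"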
Pre- and post-surveys. At the end of week 5, students completed a survey (13 Likert-style items; 5 open-
ended questions) in which they rated their level of agreement (from 1-strongly disagree, to 5-strongly agree) on the
importance of various aspects of feedback (e.g., timeliness, quality, quantity) and the extent to which the feedback
they had received, from the instructor, met these criteria. Students described their typical responses to receiving
positive and negative feedback (e.g., “When I receive feedback that is below my expectations, I tend to ...” and “The
feedback in this course, has changed my postings in the following ways …”) and their ideas regarding the most
effective feedback methods in an online course. The initial survey served as a pre-measure of students’ perceptions,
as students completed it prior to giving or receiving peer feedback. In week 16, students completed a post-survey in
which they rated the importance of peer and instructor feedback and commented on the value of both giving and
receiving peer feedback. Additional survey items were used to triangulate results from the student interviews.
Interviews. Participant interviews were conducted in order to obtain more detail about individual issues
arising from the peer feedback process (e.g., “How easy or hard is it to use Bloom’s taxonomy as a scoring rubric?”
“How do you feel about peers evaluating your postings?”). Each member of the research team interviewed two
participants via telephone or in person. The interviews lasted 20 to 30 minutes, were recorded electronically, and
then transcribed. Once completed, the interview transcriptions were sent to the participants for member-checking to
ensure accuracy and completeness.
Data analysis
In order to determine the impact of peer feedback on the quality of students’ postings, we compared the
average scores obtained on postings prior to the use of peer feedback (weeks 3-5) to those obtained during the peer
feedback process (weeks 7-13), using a paired sample t-test. T-tests were also used to compare students’ ratings, on
the pre- and post-surveys, of the value of peer and instructor feedback. These results were then triangulated with
ratings collected during participant interviews, conducted several weeks after the peer feedback process had started.
Participants’ perceptions of the value of the process were compared across open-ended survey questions and
interview responses. After a set of standardized analysis codes had been selected, NUD*IST qualitative analysis
software was used to identify recurring themes and patterns across the interview data.
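The paired comparison of posting scores described above can be sketched in Python with scipy; the per-student averages below are invented placeholders, not the study's data.

import numpy as np
from scipy import stats

# Hypothetical per-student mean posting scores (0-2 rubric) during the instructor-feedback
# weeks (3-5) and the peer-feedback weeks (7-13); values are illustrative only.
instructor_phase = np.array([1.2, 1.5, 1.0, 1.4, 1.3, 1.6, 1.1, 1.3, 1.5, 1.2, 1.4, 1.0, 1.3, 1.5, 1.2])
peer_phase = np.array([1.3, 1.4, 1.1, 1.4, 1.2, 1.6, 1.2, 1.3, 1.4, 1.3, 1.4, 1.1, 1.3, 1.5, 1.3])

t_stat, p_value = stats.ttest_rel(instructor_phase, peer_phase)  # paired-sample t-test
print(f"t({len(peer_phase) - 1}) = {t_stat:.2f}, p = {p_value:.3f}")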
Validity concerns were addressed through the triangulation of data sources, member-checking of the
transcribed interviews, and pattern-matching through coding and discussion with other members of the research
team. The use of a standardized interview protocol served to increase reliability, as did the raters’ previous experience
with Bloom’s taxonomy. The use of multiple interviewers and evaluators helped reduce interviewer bias.
Results
Perceived value and impact of peer feedback
At the beginning of the course, students believed that feedback in an online course was “slightly more
important” than in a traditional course (M=3.6/5.0) and thought that feedback should be timely (M=3.8) and of high
quality (M=3.9). Students considered the quantity of feedback to be less important (M=3.3) than quality. By the end
of the course, students’ perceptions of the importance of feedback in an online course had significantly increased
(M=4.7; t[11]=2.24; p=.05), as had their expectations that feedback should be timely (M=4.3; t[11]=3.32; p=.007).
(Note: Only 12/15 pre-surveys were returned.)
A paired t-test indicated no significant difference (t[14]=.29; p=.77) in the quality of students’ postings on
discussion questions in which they received instructor feedback (weeks 3-5, M=1.31) compared to those on which
they received peer feedback (weeks 7-13; M=1.33). Thus, although the quality of students’ postings did not improve
with peer feedback, neither did it decrease, suggesting that peer feedback may be effective in maintaining quality of
postings, once a particular level has been reached.
While specific changes in the quality of postings were not evident as a result of peer feedback, interview
comments suggested that students (n=8) used information obtained from the feedback process to improve the quality
of their postings.
Yes, it has impacted on my own posts. Because I remember the first time I got feedback [it said] "it is
important to give an example." And so I try to put more examples in my answers.
Somebody scored me on a 2, and one gave me a 1 because they didn’t think I got to the higher levels of
Bloom’s taxonomy; one did, one didn’t. You know, you sit down and you say, “Well maybe there’s
something I need to improve in how I write my answers so they could clearly see that I’m hitting that, so I
now throw in words like, “In evaluating this concept, I believe…” I tend to use clearer terms to help them
identify where I believe my thinking process is.
At the beginning of the course, students rated instructor feedback as more
important (M=4.3) than peer feedback (M=3.3). In general, students disagreed with the statement that they would
rather receive feedback from their peers than from the instructor (M=2.0). They explained that the instructor was
more knowledgeable and thus, should oversee scores that peers provide. By the end of the semester, students’
perceptions of the value of instructor feedback (M=4.6) did not significantly change; furthermore, it was still
perceived as being more important than peer feedback (M=3.7). A paired t-test [t(11) = 3.19] showed this difference,
between the perceived values of instructor and peer feedback, to be significant at the .009 level. Interview comments
provided additional insights into reasons why students preferred instructor feedback. For example, students
expressed concerns about potential biases in peers’ evaluations due to the fact that it was required (n=3), that not
everyone was motivated to provide quality feedback (n=5), or that it took a great deal of time to give quality
feedback (n=4). One student noted:
The feedback was kind of superficial. You just kind of go through the motions—at least the stuff I’ve
gotten back. There’s not really any real substance to it. If the person did not score at the highest level,
[peers should] identify something that would take them to the next level or the highest level.
Additional comments, while still describing benefits of peer feedback, pointed to the previous experience,
unbiased approach, and general expertise of the instructor:
… It is good to know everybody else’s opinion. [And] I guess it can help you [move] to some other
directions that might lead you to some more questions, but overall, it is not really going to change my
perspective on the question.
I like the peer feedback better, in the sense of how it makes me feel. But as far as valuing what they're
saying about me, I would value [instructor's] feedback more. Her grading was a little harder than what my
peers has been, but it was probably more on target.
As noted above, even though students preferred instructor feedback, the majority of them (n=13) still
valued the peer feedback process and many described important aspects of the process (e.g., anonymous format;
relative weight given to it). As noted by one student:
This experience is more in-depth, and I would have to say, more positive [than in other courses], because if
peer feedback is the sole source of feedback that we are getting [it] … has to be more thorough and more
comprehensive. Previous peer feedback experiences I've had were coupled with feedback from the
instructor, and were seen more as a secondary measure. In this instance, as a primary measure, it has been a
lot more valuable.
Additional benefits to receiving peer feedback included receiving confirmation that their ideas were
meaningful to others as well as having opportunities to profit from the insights of their peers, who could offer a
variety of perspectives that the instructor could not provide.
It’s nice to get some validation that what you had to say was important to somebody else, that they got
something from it.
My impressions are that it is very beneficial to learning in that peers often have different perspectives than
the instructor, and there are significantly more of them, and they can provide a lot of insight and ideas that
the instructor might not have noticed. Peers are more often on the same level and may be able to explain
things in a manner that makes more sense than the instructor might have.
people who maybe don’t have [the same experience], we’re actually reinforcing our own learning much
more strongly. So we’re gaining.
However, as with receiving peer feedback, students perceived difficulties with the process. The main
concerns for giving feedback related to being consistent and fair (n=4). For example, one student commented, “I
think peer feedback is good, but in some respects, I don’t know if I’m really qualified to give a grade to anybody.”
Particularly worrisome to some students was having to give a 0-score. In fact, some students simply would not do
this.
I am not sure if I could give a 0 to anyone because I don't feel that I have the power to say, "That's not a good
idea."
Even though I don’t know them, I don’t think I’d give them a 0, no.
This is supported by the peer feedback data; in approximately 160 peer-rated postings, peers gave a 0-score
only 7 times (4%). Still, a few students (n = 4) indicated that the issue was not one of assigning a low score but of
being a conscientious educator. These students believed that a low score provided a teachable moment, providing
the opportunity to offer constructive criticism and suggestions for improvement. Overall, the majority of students (n
= 8) felt that the benefits of providing peer feedback outweighed the costs. While specific benefits related to
learning how to improve the quality of their own posts as well as their feedback to others, the main cost related to
the time needed to do a good job. Still, students described the time commitment as appropriate to a graduate course,
as well as relevant to their future careers, as noted by one student: “Skills associated with peer evaluation are going
to carry on much longer than the course.”
Summary
Though participants’ perceptions of the importance of feedback in an online course significantly increased
from the beginning to the end of the course, students continued to believe that instructor feedback was more
important than peer feedback. Furthermore, despite seeing no quantitative improvement in the quality of students’
postings during the peer feedback process, interview data suggested that participants valued the peer feedback
process and benefited from both giving and receiving peer feedback.
Discussion
Value and impact of feedback in an online environment
Results from this study highlight the importance of feedback in an online environment and support the
assumption that students’ postings can reach, and be sustained at, a high level of quality through a combination of
instructor and peer feedback. In general, students’ postings, across 17 discussion questions, averaged 1.32 on a 2-
point “quality” scale. While we expected that the quality of students’ postings might gradually improve over the
semester, as was demonstrated in a similar study by Ertmer and Stepich (2004), our results showed no significant
improvement in students’ postings from the beginning to the end of the course. We suspect that a number of factors
may have mediated students’ efforts to achieve high quality postings. First, the online course was structured such
that students were required to submit two postings (for grading) each week: an “initial” post to the weekly
discussion question, as well as one response to another student. Additional postings were not required, nor did
students expect them to be scored for quality. Therefore, once the initial and follow-up postings were made in a
specific forum, students had little motivation to strive for high quality with any additional postings. Furthermore,
scoring postings with a grading rubric that allowed for only two meaningful levels of quality may not have provided
enough room for growth, thus causing a ceiling effect to occur. Since students started out with relatively high scores
on their two required posts, there was little opportunity to demonstrate improvement in these scores during the
semester. In the future, it might be important to include a scoring rubric that allowed for more variation among
scores. The disadvantage to this, however, is that as the scale becomes more finely gradated, it becomes increasingly
difficult to differentiate among the various levels of quality.
Another reason students may not have demonstrated increased quality in their postings relates to the
discussion starters used. In this course, many of the discussion starters, especially those developed by student
leaders, were not particularly conducive to high-level responses. For example, student leaders tended to ask their
peers to provide examples of current issues they faced in their classrooms or schools (e.g. how to integrate
technology, how to cope with security issues, how to apply distance learning opportunities in the classroom). While
these types of discussions might be expected to stimulate responses related to the application level on Bloom’s
taxonomy (score = 1 point), they would not readily engender responses related to analysis, synthesis, or evaluation
(score = 2 points). As Black (2005) noted, "most online discussion consists of sharing and comparing information,
with little evidence of critical analysis or higher order thinking. Such findings serve to remind us that it is not the
technology itself but the manner in which it is applied that is most critical” (p. 19). Thus, it is important for
instructors not only to facilitate meaningful online discussions but also to craft discussion questions in ways that
give students the opportunity to demonstrate higher-order thinking.
Communication in online courses serves many functions, only some of which are specifically content-
focused (Ko & Rossen, 2001; Palloff & Pratt, 1999, 2001). However, in this study, we rated every response posted in
17 different discussion forums, including responses that were intended solely for interpersonal or motivational
purposes. While these types of postings serve important roles, they would not be likely to receive a high-quality
score, based on Bloom’s taxonomy. Given this, we considered scoring only the required posts in each forum;
however, it was difficult to determine, post-hoc, which postings students intended to “count” as their required two
postings. Additionally, this would have reduced the total number of analyzed postings from 778 to 160, which
would have greatly limited our ability to measure changes in posting quality. In the future, it will be important to
clarify exactly how many postings will be scored in a discussion forum while also leaving room for students to make
additional postings that serve to build a sense of community and trust.
One of the potential advantages to using peer feedback, as noted by Topping (1998), is the increased
timeliness in receiving feedback. However, in this study, students’ feedback was channeled through the instructor,
thus causing a delay in delivery time—initially taking as long as two weeks. The significantly higher rating, at the
end of the course, of the importance of timeliness of feedback may have been in reaction to the perceived delay in
receiving peer feedback. This lag time, then, may have cancelled out one of the proposed benefits of peer feedback,
that is, increasing the timeliness of receiving feedback.
Still, despite these logistical problems, the majority of students indicated that peer feedback positively
impacted the quality of their discussion postings. They described a number of specific benefits from receiving peer
feedback including recognition of their ideas, access to multiple perspectives, and receiving a greater quantity of
feedback than would have been received from the instructor alone. Students also noted positive aspects of the peer
feedback process, including the ability to provide anonymous feedback and the ability to receive a grade that
reflected the average score given by two different peers.
In addition to impacting the quality of their discussion postings, students also described how peer feedback
helped them improve the quality of the feedback they, in turn, provided to others. In other words, after receiving
initial peer feedback, some students realized they had not been as in-depth or constructive as they could have been in
providing feedback to others and thus improved the quality of their own feedback. Ko and Rossen (2001) noted that
the ability to “cross-check” one’s understanding is an essential step in the learning process.
feedback, and that it be delivered in a timely manner. Although the survey results indicated that student ideas about
the value of peer and instructor feedback did not change over the course of the semester, interview comments helped
us determine where the specific strengths and weaknesses of the feedback process occurred. While many of the
strengths seemed to be related to the inherent value of participating in the feedback process (e.g., reflection during
the feedback process, improving posts and feedback), weaknesses seemed to be associated, at least to some extent,
with the logistics of the process (e.g., time delay from providing feedback to receiving feedback). Perhaps if
instructors can simplify the logistics involved in giving and receiving peer feedback, and can somehow assure the
importance and validity of peers’ responses, students will be able to appreciate and accrue the potential benefits.
Furthermore, if the use of peer feedback can decrease an instructor’s workload in an online course while continuing
to maintain a high quality of postings, this may offer a viable alternative, or at least a reasonable supplement, to
facilitating learning in an online course. That is, by addressing these logistical issues, it may be possible to increase
both the efficiency and effectiveness of the process, as well as the perceived value for the participants. As
summarized by one student:
I think that if it were developed a little more, I think it would be really effective. It seemed kind of OK I
think right now it’s of value to the person evaluating, but I don’t really think it’s much of a value to the
person receiving it. It’s kind of like, “Ok great.” But I think that maybe if it wasn’t every week, and
maybe in a different format than these discussion postings, the peer evaluation would work great. … That’s
my opinion. I think it’s a good beginning, but I think it could be built much more.
References
Arbaugh, J. B. (2000). How classroom environment and student engagement affect learning in Internet-based MBA
courses. Business Communication Quarterly, 63(4), 9-26.
Black, A. (2005). The use of asynchronous discussion: Creating a text of talk. Contemporary Issues in Technology
and Teacher Education [Online serial], 5(1). Retrieved October 3, 2005, from
http://www.citejournal.org/vol5/iss1/languagearts/article1.cfm
Debowski, S. (2002). Modeling and feedback: Providing constructive guidance through a web medium. Retrieved
February 20, 2005, from http://lsn.curtin.edu.au/tlf/tlf2002/debowski.html
Driscoll, M. (2000). Psychology of learning for instruction (2nd ed.). Boston: Allyn and Bacon.
Dunlap, J. C. (2005). Workload reduction in online courses: Getting some shuteye. Performance and Improvement,
44(5), 18-25.
Ertmer, P. A., & Stepich, D. A. (2004). Examining the relationship between higher-order learning and students’
perceived sense of community in an online learning environment. Proceedings of the 10th Australian World
Wide Web conference, Gold Coast, Australia.
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer
conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105.
Henderson, T., Rada, R., & Chen, C. (1997). Quality management of student-student evaluations. Journal of
Educational Computing Research, 17, 199-215.
Higgins, R., Hartley, P., & Skelton, A. (2002). The conscientious consumer: Reconsidering the role of assessment
feedback in student learning. Studies in Higher Education, 27(1), 53-64.
Juwah, C. (2003). Using peer assessment to develop skills and capabilities [Electronic version]. United States
Distance Learning Association Journal, 17(1), Article 04. Retrieved January 31, 2005, from
http://www.usdla.org/journal/JAN03_Issue/article04.html
King, A., Staffieri, A., & Adelgais, A. (1998). Mutual peer tutoring: Effects of structuring tutorial interaction to
scaffold peer learning. Journal of Educational Psychology, 90, 134-152.
Ko, S., & Rossen, S. (2001). Teaching online: A practical guide. Boston: Houghton-Mifflin.
Liu, E. Z., Lin, S.S., Chiu, C., & Yuan, S. (2001). Web-based peer review: The learner as both adapter and reviewer.
IEEE Transactions on Education, 44, 246-251.
Lynch, M. M. (2002). The online educator: A guide to creating the virtual classroom. New York: Routledge.
Maor, D. (2003). The teacher’s role in developing interaction and reflection in an online learning community.
Education Media International, 40, 127-137.
McConnell, D. (2002). The experience of collaborative assessment in e-learning. Studies in Continuing Education,
24(1), 73-92.
Mory, E. H. (2004). Feedback research revisited. In D. H. Jonassen (Ed.), Handbook of research on educational
communications and technology (pp. 745-783). Mahwah, NJ: Lawrence Erlbaum.
Notar, C. E., Wilson, J. D., & Ross, K. G. (2002). Distant learning for the development of higher-level cognitive
skills. Education, 122, 642-650.
Palloff, R. M., & Pratt, K. (1999). Building learning communities in cyberspace. San Francisco: Jossey-Bass.
Palloff, R. M., & Pratt, K. (2001). Lessons from the cyberspace classroom: The realities of online teaching. San
Francisco: Jossey-Bass.
Preece, J. (2001). Online communities: Designing usability, supporting sociability. New York: Wiley.
Topping, K. (1998). Peer assessment between students in colleges and universities. Review of Educational Research,
68, 249-276.
Online Follow-up to Professional Development for Texas School Librarians:
The Value of a Collaborative Learning Environment
Marybeth Green
University of Texas at San Antonio
Lauren Cifuentes
Texas A&M University
Abstract
The purpose of this study was to examine the effects of online follow-up and collaboration on attitudes
towards a professional development program and on completion of an online follow-up course added to face-to-face
professional development for librarians in 12 Texas school districts. This study used a posttest-only control group
experimental design with self-selected participants.
At the beginning of the 2004-2005 school year, school librarians participated in a face-to-face workshop
during inservice training. The workshop dealt with the process of creating a TAKS Support Plan, a plan for the
library to support weaknesses on the Texas Assessment of Knowledge and Skills (TAKS) at their school. At the
conclusion of the workshop, school librarians
were given the opportunity to participate in an eight-week online follow-up course that supported implementation of
inservice themes. School librarians were stratified by level of service and socioeconomic school status and were
randomly assigned to one of three environments: two experimental environments, (a) Collaborative Follow-up and
(b) Noncollaborative Follow-up, and a control environment, No Collaboration/No Follow-up. The
experimental environments were given additional information and support in an online course to aid the creation of
their TAKS Support Plan.
Results indicate that professional development that included online collaboration and follow-up produced more
positive attitudes than professional development with no collaboration or follow-up.
Logistic regression revealed that the likelihood of completion could be predicted by membership in professional
development condition. The likelihood of completion by participants in the Collaborative Follow-up condition was
significantly greater than that of participants in the NonCollaborative Follow-up and No Collaboration/No Follow-up
conditions. No difference was found between the completion rates of the NonCollaborative Follow-up and No
Collaboration/No Follow-up conditions.
Introduction
Follow-up to face-to-face professional development has long been established as essential to sustaining
educator change. Initial enthusiasm for content presented in a professional development workshop may be
reassuring to organizers, but has relatively little influence on educator learning. The need for follow-up to
professional development has been summarized as follows: “Without continuing encouragement and support [upon
completion of workshops and courses], the average educator has a remarkable capacity for reverting back to old
practices under a new name” (Beeby, 1980, p. 466).
The ultimate goal for educator professional development is educator learning that promotes changes in the
educators’ knowledge, understanding, behaviors, skills, values, and beliefs. The process of implementing change,
however, can be very threatening: it challenges educators’ accepted pedagogical beliefs and philosophies, requires
educators to adopt and use new practices, and replaces familiar materials and resources with those that are
unfamiliar. Through well-structured follow-up over time, educators are given opportunities to grapple with change, to
engage in discussions regarding beliefs and assumptions about issues related to practice, to build competence in new
tasks or strategies and to try new roles and create new structures in safe conditions.
Follow-up conditions founded on collaborative learning philosophies enhance educators’ capacities to
adapt to and implement change. Collaboration removes feelings of educator isolation and the sense that educator
learning is solely an individual responsibility. In collaborative learning conditions, multiple opportunities are
provided for discourse so that educators can learn through and from each other in a learning community. Discourse
becomes a tool for reflecting critically on practice and on the impact educators have on students’ learning.
The growth of powerful network and communication technologies in schools creates new opportunities for
follow-up and collaboration to support professional development. These technologies enable online professional
development with a high degree of communication and interactivity among educators spread across vast geographic
landscapes. However, there is enormous diversity in instructional design among the various offerings. At one end of
the scale are courses offering rich opportunities for interaction with content, with other students, and with the
instructor, while at the other end are those offering only interaction with the content.
Therefore, the researchers asked the following questions:
1. Is there a significant difference in attitudes towards the professional development program among professional
development environments that include online Collaborative Follow-up, online NonCollaborative Follow-up, and No
Collaboration/No Follow-up?
2. Does the likelihood of completing a follow-up course to a face-to-face workshop for school librarians differ
among online professional development conditions including Collaborative Follow-up, NonCollaborative Follow-up,
and No Collaboration/No Follow-up?
Methodology
Participants
Participants were drawn from the population of school librarians in 12 school districts in Texas. These
districts represented several of the largest districts in Texas whose library services directors have been active in the
Texas Library Association. This created a population of 812 school librarians. Participants’ experience ranged from
librarians in their first year of practice to librarians with more than thirty years of service. School librarians’ level of
service ranged across elementary, middle, and high schools, representing the distribution in the
field.
Procedure
At the beginning of the 2004-2005 school year, school librarians in 12 K-12 school districts across the state
of Texas participated in a face-to-face workshop in their district presented by the district’s Library Services Director
or his or her designee. The workshop focused on training librarians to create a plan for the library to support weaknesses
on the Texas Assessment of Knowledge and Skills (TAKS), the state standardized test, at their school. One of the
researchers trained the Library Services Directors or their designee during the summer so that all workshops were
consistent. Each presenter used an agenda and a PowerPoint presentation developed by one of the researchers.
During the workshop, school librarians were trained on:
• the need for a TAKS Support Plan,
• how to obtain and read the report of student weaknesses at their school provided by the Texas Education
Agency,
• the components of the plan and
• resources available to help them complete the plan.
At the end of the face-to-face workshop, school librarians were offered the opportunity to continue working
on creating a TAKS Support Plan through an online follow-up course sponsored by the study. The online follow-up
course was divided into six modules, each requiring approximately one hour online per week.
A total of 444 school librarians indicated an interest in participating in the online course. These participants
were stratified by level of service (elementary, middle, high) and by socioeconomic status of the school. They were
then randomly assigned to one of three online conditions: Collaborative follow-up, NonCollaborative follow-up and
No Collaboration/No Follow-up. They were enrolled in an online continuing education course matching their
treatment condition at the Continuing Education division of Instructional Technology Services at Texas A&M.
WebCT Vista was the course management software used to deliver the courses.
Of the 444 who indicated an interest, 278 actually entered the course. The Collaborative Follow-up
environment had 94 participants. The Noncollaborative Follow-up environment had 96 participants. The No
Collaboration/No Follow-up environment had 88 participants.
Treatment Environments
All follow-up learning took place in separate WebCT Vista courses that were configured to support the
three conditions. In those conditions which received follow-up support, course modules were released weekly using
the selective release tool within WebCT Vista. Table 1 illustrates the attributes of the various treatment conditions.
Discussion questions in the Collaborative Follow-up environment and journal questions in the NonCollaborative
condition were the same.
Table 1. Treatment Conditions for the Creating a TAKS Support Plan Online Professional Development
All conditions (Collaborative Follow-up, NonCollaborative Follow-up, and No Collaboration/No Follow-up):
participate in initial workshop; logon to WebCT Vista; upload TAKS Support Plan through WebCT Vista.
NonCollaborative Follow-up also included: cueing messages; follow-up readings; weekly module questions
completed independently and submitted to the instructor through the Assignment tool; email to course instructor.
Collaborative Follow-up also included: follow-up readings; online discussion of weekly module questions with
peers; email to peers or course instructor; viewing and discussing peers’ TAKS Support Plans; online chat with peers.
In a typical week in the collaborative environment, a school librarian might log on and take part in the following
activities online:
• Check for announcements
• Read the objectives for the week
• Read feedback from instructor on previous week’s TAKS Plan section submission
• Read the journal articles and PowerPoints chosen to extend understanding and support writing weekly
TAKS Support plan assignment.
• Read email.
• Read and participate in the weekly discussion.
• Participate in a chat.
In a typical week, a school librarian in the Noncollaborative environment might log on and take part in the following
online activities:
• Check for course announcements
• Read the objectives for the week
• Read feedback from instructor on previous week’s TAKS Plan section submission.
• Read the journal articles and PowerPoints chosen to extend understanding and support writing weekly
TAKS Support Plan assignment.
• Read weekly cueing messages in the form of announcements and messages.
Instrumentation
Attitudes towards the Professional Development Program, an instrument developed by the researcher,
measured course satisfaction in five categories drawn from Guskey’s (2000) professional development evaluation
model: (a) participant reactions, (b) participant learning, (c) participant’s use of new skills, (d) organizational
culture, and (e) student outcomes. “Participant reactions” was intended to assess whether participants felt that the
program was well organized, that time was well spent, and that the activities were useful.
“Participant learning” was intended to assess how well the participants felt they had learned the concepts, ideas
and/or pedagogies included in the professional development program. “Participants’ use of new skills” was intended
to assess the extent to which participants planned to implement the new concepts, ideas, and/or pedagogies from the
professional development program in their educational situation. This survey depended on participants’ self-report
on the implementation items. “Organizational culture” assessed the participants’ perception of support by their
school for their plan. “Student outcomes” measured the extent to which librarians believed that their TAKS Support
Plan would impact student performance on the TAKS. The survey included 15 items regarding participation in the
overall professional development program. This survey used a 5-point Likert scale to indicate the degree to which
participants agreed or disagreed with each item. Higher scores correspond with a more positive response. Mean survey
responses ranged from 1 to 5. In reporting scores, mean ratings of 1.0-2.0 were classified as very negative, 2.01-2.99 were
classified as mildly negative, 3.0 was classified as neutral, 3.01-4.0 were classified as mildly positive, and 4.01 to
5.0 were classified as very positive. A coefficient alpha was generated to determine the relationship between
individual test items and the test as a whole. The coefficient for the total test was .92.
Course completion was measured by completion of all six parts of the TAKS Support Plan. Plans that met this
criterion were given a 1, and plans that were not completed were given a 0.
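The internal-consistency coefficient reported above (alpha = .92) could be computed along the following lines; this Python sketch uses a standard coefficient alpha formula, and the response matrix is a hypothetical stand-in for the actual survey data.

import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for an (n_respondents x n_items) matrix of Likert responses."""
    items = np.asarray(items, dtype=float)
    n_items = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - item_variances / total_variance)

# Hypothetical 5-point Likert responses from six participants on four survey items
responses = [[4, 5, 4, 4],
             [3, 4, 3, 4],
             [5, 5, 4, 5],
             [2, 3, 2, 3],
             [4, 4, 4, 4],
             [3, 3, 3, 2]]
print(f"Coefficient alpha = {cronbach_alpha(responses):.2f}")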
Results
The following section presents the results of the statistical analysis on attitudes towards the professional
development program and completion.
Attitudes
Overall, all groups reported mildly positive attitudes, with a mean of 3.71. The highest mean ratings were reported by
participants in the Collaborative Follow-up environment (3.94). Participants in the NonCollaborative Follow-up
environment had the next highest ratings (3.77). The participants in the No Collaboration/No Follow-up
environment had the least positive ratings (3.36). Means and standard deviations for the Attitudes Towards the
Professional Development Survey Items overall and by environment are presented in Table 2.
One-way analysis of variance (ANOVA) was conducted to determine if there were significant differences
in attitudes towards the professional development program among the environments. Type of professional
development environment (Collaborative Follow-up, NonCollaborative Follow-up, and No Collaboration/No
Follow-up) was used as the independent variable, and mean scores from the Attitudes towards the Professional
Development Program Survey were used as the dependent variable. Results from the ANOVA are presented in Table
3. A statistically significant difference was found among the three types of professional development environment on
attitudes towards the professional development program, F(2, 201) = 10.098, p = .001.
Table 2. Mean and Standard Deviations for Attitudes Towards the Professional Development Survey Items Overall
and By Environment
Collaborative NonCollaborative No Collaboration/
Overall Follow-up Follow-up No Follow-up
Mean SD Mean SD Mean SD Mean SD
3.71 .77 3.94 .68 3.77 .68 3.36 .84
Table 3. Results of the ANOVA on Attitudes towards the Professional Development Program
Source df SS MS F p
Attitudes
Between groups 2 1.199 .599 10.098 .001
Within groups 201 11.932 .059
Total 203 13.131
Post hoc Tukey HSD Tests indicated that the attitudes of the Collaborative Follow-up participants differed
significantly from those of the No Collaboration/No Follow-up participants (p < .001). Likewise, a significant
difference was found between the attitudes of the NonCollaborative Follow-up and the No Collaboration/No
Follow-up participants (p < .007).
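The one-way ANOVA and post hoc Tukey HSD comparisons reported above might be run as in the Python sketch below, using scipy and statsmodels; the attitude scores are hypothetical stand-ins, not the survey data.

import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical mean attitude scores for a handful of participants in each environment
collab = [4.1, 3.9, 4.2, 3.8, 4.0, 3.7]
noncollab = [3.8, 3.6, 3.9, 3.7, 3.8, 3.5]
no_followup = [3.4, 3.2, 3.5, 3.3, 3.1, 3.6]

f_stat, p_value = stats.f_oneway(collab, noncollab, no_followup)  # one-way ANOVA
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")

scores = np.array(collab + noncollab + no_followup)
groups = (["Collaborative"] * len(collab) + ["NonCollaborative"] * len(noncollab)
          + ["NoFollowUp"] * len(no_followup))
print(pairwise_tukeyhsd(scores, groups))  # pairwise post hoc comparisons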
Completion
Table 4 presents the frequencies and percentages of completion by online professional development
environment. Binary logistic regression was used to estimate the likelihood of course completion by membership in
professional development condition, with No Collaboration/No Follow-up as the referent condition. Differences in
course completion were significantly predicted, χ2(2) = 14.474, p < .001. Membership in the Collaborative
Follow-up condition was significantly associated with a greater likelihood of course completion when No
Collaboration/No Follow-up participants were the referent group (OR = 3.186); librarians in the Collaborative
Follow-up condition were about three times as likely to complete as librarians in the No Collaboration/No
Follow-up condition. There was no significant difference in the likelihood of completion between participants in the
NonCollaborative Follow-up condition and participants in the No Collaboration/No Follow-up condition.
A second logistic regression was conducted using a contrast variable to determine whether there was a
difference between the Collaborative Follow-up and the NonCollaborative Follow-up conditions in predicting the
likelihood of course completion. Participants in the Collaborative Follow-up condition were significantly more
likely to complete than participants in the NonCollaborative Follow-up condition (OR = 0.544, p < .05); librarians in
the NonCollaborative Follow-up condition were about 45% less likely to complete than librarians in the
Collaborative Follow-up condition.
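The binary logistic regression and odds-ratio reading described above might be carried out as in the following Python sketch with statsmodels; the group sizes mirror those reported in the paper, but the completion rates used to simulate the outcome are invented for illustration.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
conditions = ["Collaborative"] * 94 + ["NonCollaborative"] * 96 + ["NoFollowUp"] * 88
# Hypothetical completion probabilities used only to simulate example data
assumed_rates = {"Collaborative": 0.65, "NonCollaborative": 0.40, "NoFollowUp": 0.37}
df = pd.DataFrame({"condition": conditions})
df["completed"] = [rng.binomial(1, assumed_rates[c]) for c in df["condition"]]

# No Collaboration/No Follow-up serves as the referent condition
model = smf.logit("completed ~ C(condition, Treatment(reference='NoFollowUp'))", data=df).fit()
print(np.exp(model.params))  # exponentiated coefficients are odds ratios relative to the referent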
Discussion
School librarians whose environment included follow-up reported attitudes that were significantly more
positive than the school librarians whose environment did not include follow-up. This finding supports previous
theory and research that asserts that educators learn best when professional development learning is sustained over
time through follow-up (Garet et al., 2001; Showers et al., 1987). Traditional professional development programs
based on standalone workshops are not as well received by educators as professional development programs that
include follow-up to support the ongoing process of educator change. Such follow-up enhances educators’ feelings
of competence (Guskey, 2000). Educators value professional development that enhances their effectiveness with
students (Fullan & Stiegelbauer, 1991). Professional development programs that result in educators developing the
knowledge and skills that improve student outcomes are rated favorably by educators (Guskey, 2000). Conversely,
professional development programs that fail to develop the requisite knowledge and skills are viewed negatively and
considered a waste of time (Lindstrom & Speck, 2004).
The effects of follow-up and collaboration were also instrumental in enabling librarians to complete their
plans. As TAKS weaknesses at the schools were identified, information was provided to school librarians to support
the creation of their plan. Individual needs were addressed through feedback from the instructor and peers.
Distributing the course over time allowed librarians to reflect on and embrace a purposeful role that moved
beyond the traditional role of supporting English and language arts and began to address those areas where students
performed weakly on the TAKS. Providing opportunities for collaboration with other school librarians within the
course was the second factor contributing to the differences in completion rates. Ideally librarians work within a
school community as a member of the instructional team. However, since there is usually only one librarian on a
campus, many librarians feel isolated in dealing with issues related to their practice. Providing a network of support
through peers enabled librarians to gain from different perspectives as well as to work through the issues related to
their plan.
Conclusions
Educators face a constant challenge to maintain their proficiency with effective teaching and learning
practices. Daily, they must tackle new curriculums, pedagogies, technologies, and an increasingly diverse student
body. Professional development becomes a critical component in enabling schools to meet these challenges. Yet,
millions of dollars have been allocated for professional development with little to show for the money. Previous
research demonstrates that professional development aligned with traditional methods will not yield the results that
are needed to address the broader problems that are facing schools in the United States today.
Online delivery of professional development is a fast-growing industry especially for populations of
educators with limited access to professional development directed to their special needs. Course management
systems such as WebCT offer increasingly sophisticated platforms that provide many of the affordances of face-to-
face instruction. This research demonstrates that professional development aligned with two research-based
strategies, online follow-up and online collaboration, supports professional development completion.
References
Beeby, C. E. (1980). The thesis of the stages fourteen years later. International Review of Education, 26(4),
451-474.
Blase, J., & Blase, J. (2004). Handbook of instructional leadership: How successful principals promote teaching and
learning (2nd ed.). Thousand Oaks, CA: Corwin Press.
Cannings, T. (2003). Online professional development for teachers. Media & Methods, 39(4), 14, 16.
Chrislip, D. D. (2002). The collaborative leadership fieldbook. San Francisco: Jossey-Bass.
Fullan, M., & Stiegelbauer, S. (1991). The new meaning of educational change. London: Cassell.
Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001, Winter). What makes professional
development effective? Results from a national sample of teachers. American Educational Research
Journal, 38(4), 915-945.
Garmston, R. J. (2003). Group wise. Journal of Staff Development, 24(4), 65-66.
Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin Press.
Joyce, B. and Showers, B. (1988). Student achievement through staff development. New York: Longman.
Joyce, B., Showers, B., & Rolheiser-Bennett, C. (1987). Staff development and student learning: A synthesis of
research on models for teaching. Educational Leadership, 45 (2), 11-15.
Killion, J. P. (2000). Log on to learn. Journal of Staff Development, 21(3), 48-53.
Lieberman, A. (1995). Practices that support teacher development. Phi Delta Kappan, 76(8), 591-596.
Lindstrom, P. H., & Speck, M. (2004). The principal as professional development leader. Thousand Oaks, CA:
Corwin Press.
Schrum, L. (2001, May). Professional development by design: Distance education for educators. Retrieved July 22,
2004, from PBS Teacherline Web site
http://teacherline.pbs.org/teacherline/resources/articles/article_may01.cfm
Showers, B., Joyce, B., & Bennett, B. (1987). Synthesis of research on staff development: A framework for future
study and a state-of-the-art analysis. Educational Leadership, 45(3), 77-87.
Smith, B. L., & MacGregor, J. T. (1992). What is collaborative learning? In Collaborative learning: A sourcebook
for higher education (pp. 9-22). National Center on Postsecondary Teaching, Learning, and Assessment,
Pennsylvania State University.
A Cross-Media and Cross-Cultural Study on the Effect of Student-Generated
Visualization as a Study Strategy
Yi-Chuan Jane Hsieh
Ching Yun University
Jung-Li, 320, Taiwan
Lauren Cifuentes
Texas A&M University
Abstract
We compared student-generated visualization as a study strategy with an unguided study strategy for middle
school science concept learning. The relative effectiveness of visualization on paper and on computers and the
differential impact of visualization training for Taiwanese and Texan 8th grade students were also investigated. We
analyzed data collected in school settings quantitatively and qualitatively. The results showed that Taiwanese and
Texan students who received visualization workshops and constructed visualizations on paper or on computers
during study time scored significantly higher on a comprehension posttest than those students who applied an
unguided study strategy. Overall, Taiwanese students scored higher than Texan students, but there was no
interaction effect of visualization and cultural background on test scores.
Theoretical Background
Constructivist learning theory contends that learning occurs when people actively construct their own
knowledge and think reflectively about concepts (Lee, 1997). Constructivist pedagogy focuses on "developing the
skills of the learner to construct (and reconstruct) plans in response to the situational demands and opportunities"
(Duffy & Jonassen, 1991, p. 9). In such a theoretical framework, educators advocate that students should be
encouraged to generate their own visual representations of knowledge, instead of passively receiving ready-made
illustrations in textbooks or from instructors (Cifuentes & Hsieh, 2004a, 2004b, 2004c; Hall, Bailey, & Tillman,
1997; Schwartz, 1993).
Student-generated visualization refers to graphical or pictorial representations of content showing the
sequential, causal, comparative, chronological, oppositional, categorical, or hierarchical relationships among
concepts, whether hand drawn on paper or created by students on computers (Cifuentes & Hsieh, 2004a, 2004b,
2004c; Wileman, 1993). Students’ study notes are visualizations if the information is presented in the form of
diagrams, matrices, charts, trees, tables, graphs, pyramids, causal chains, timelines, or direct representations
(Cifuentes & Hsieh, 2001; Schwartz, 1993; Tufte, 1990). Previous studies have provided evidence for the
effectiveness of student-generated visualization in the improvement of students’ test performances (Cifuentes &
Hsieh, 2004a, 2004b, 2004c; Gobert & Clement, 1999; Hall, Bailey, & Tillman, 1997).
Technology may play a role in the impact of visualization on learning. Computer graphics software allows
students to draw and paint objects and visually organize and represent what they know. The use of computers in the
externalization of students’ knowledge structures enriches the individuals’ mental models for organizing, retrieving,
and using knowledge (Williamson, Jr., 1999). Computer-based visualization tools can be regarded as “mindtools” to
extend and reorganize learners' cognitive structures during learning (Jonassen, 2000; Lajoie, 1993). Computer-
generated graphics created by learners offer several advantages over pen and paper, such as ease of subsequent
revision and the generation of sophisticated looking graphics by students with undeveloped artistic skills.
Culture may also play a role in the impact of visualization on learning. Researchers such as Dunn et al.
(1990), Hillard (1989), More (1990), and Vasquez (1990) have indicated that different cultural groups differ in their
sensory modality strength. Asian students have been found to have a stronger preference for visual learning than Anglo
students. The use of different language writing systems has an impact on Western and Chinese learners’ modes of
representation of information. Western learners tend to be verbalizers who “consider the information they read, see,
or listen to, in words or verbal associations” whereas Chinese learners tend to be imagers who “experience fluent
spontaneous and frequent pictorial mental pictures when they read, see, listen to or consider information” (Riding,
1994, p. 48). Studies indicate that verbalizers prefer text, while imagers prefer pictorial information during their
learning processes (Riding & Douglas, 1993). Asian students may favor the use of visualization as a study strategy
over other cultural groups since the visualization task falls into their visual learning preference. Because of cultural
and language differences, the use of student-generated visualization as a study strategy may differentially affect
science concept learning for Taiwanese and American students.
Objectives
We sought evidence regarding the effect of student-generated visualization on paper and on
computers during study time on Taiwanese and Texan 8th grade students' science concept learning. To provide
such evidence, we compared student-generated visualization as a study strategy with an unguided study strategy. The
relative effectiveness of visualization on paper and on computers and the differential effects of visualization
training for Taiwanese and Texan students were also investigated.
Methods
Mixed methods were applied, including both experimental and naturalistic analyses of data collected in a
school setting. The school teachers delivered the visualization training, and the researchers participated in the
management of the treatments of students in the context of the students’ typical class periods. Both the teachers and
the researchers kept reflective journals during the week of the experiment. Following the treatments, students’
scores on a comprehension posttest were compared across groups. In addition, journal entries were used to gain
understanding of the natural classroom learning environment that contributed to or diminished the generalizability of
the findings (Shulman, 1997). Content analysis procedures, as described by Emerson, Fretz, and Shaw (1995),
were applied to the journal entries. During and upon completion of data collection, we used a two-phase process
of content analysis, open coding followed by focused coding, to analyze the data.
Participants
The original sample consisted of 105 Texan eighth graders and 70 Taiwanese
eighth graders (13-14 years old) from rural public middle schools. However, several participants were absent for
part of the treatment or were absent for testing, and only 92 Texan 8th graders and 60 Taiwanese 8th graders
completed the entire study.
Design
A posttest-only control group design was used in the study. The first independent variable, “treatment,”
had three levels—(1) the control group that received a non-visualization workshop and applied an unguided study
strategy during study time, (2) the experimental/paper group that received the paper-form visualization workshop
training and visualized on paper during study time, and (3) the experimental/computer group that received the
computer-based visualization workshop training and visualized on computers during study time. The second
independent variable, “cultural background,” contained two levels: Texan and Taiwanese. The dependent variable
was a comprehension posttest score for science concepts studied after treatment.
Six Texan 8th grade intact science classes were randomly assigned to three treatment groups: the control
group, the experimental/paper group, and the experimental/computer group. Further, two Taiwanese 8th grade intact
science classes were randomly assigned to two treatment groups: the control group, and the experimental/paper
group. Computer labs were not available to the Taiwanese students in the middle school; therefore, there was no
experimental/computer group for the Taiwanese participants.
All participants received the given treatments as part of their curricular activity in their science classes.
Chi-square tests showed no significant differences among groups in terms of their gender, age, and prior knowledge
of content on the comprehension test. Texan classes contained no Asian students and Taiwanese classes were
exclusively Asian. Texan groups did not differ significantly in their ethnicity or frequency of using a computer at
school and at home.
Procedures
Students in the control groups watched science-related videotapes that did not address visualization as a
study strategy. The experimental/paper groups attended 100-minute visualization workshops in which they learned
how to visualize on paper in their regular classrooms, and the experimental/computer group attended a 100-minute
computer-based visualization workshop in which they learned how to visualize on computers in the computer lab.
In the visualization workshops students were instructed to recognize cause-effect, sequence, and
comparison-contrast relationships in text and use visual conventions to represent those text structures. The
instructor first modeled visualization processes and then scaffolded the learners (with advice and examples), guided
them during practice, and gradually tapered off support and guidance until each student visualized alone. Students
practiced using path diagrams to represent causal relationships (Schwartz, 1993), matrices to compare one kind of
concept to another (Cifuentes, 1992), and flow charts to point out stages in a chain of events (Wileman, 1993;
Cifuentes & Hsieh, 2001).
After the workshops, the control groups were given several science essays for unguided and independent
study. The experimental/paper groups and the experimental/computer group were given the same science essays to
study as were the control groups. However, they were asked to use their learned visualization skills to create visual
representations that showed interrelationships among concepts on paper or on computers during their study time.
Students were given a comprehension posttest to measure their understanding of the concepts in the texts when the
prescribed study time was over.
Five science essays were excerpted from a Taiwanese biology textbook for 9th graders. The contents were
translated into English with reference to the American biology textbook, Biology: the web of life (Strauss &
Lisowski, 1998), adopted for the 9-12 grade Texas science curriculum. The illustrations were eliminated in order to
create a text-only document. These higher-level reading passages were used to ensure a high level of difficulty and
a lack of prior student exposure to the content.
Thirty test items that measured students' understanding of the biological concepts addressed in the science
essays were derived from the Taiwanese Test Bank for Middle School Biology. The items were reviewed and
validated by three Texan content experts as appropriate for this study. The comprehension posttest had a reliability
coefficient of 0.71 (coefficient alpha).
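For readers unfamiliar with the statistic, coefficient alpha for a k-item test is a function of the item variances and the
total-score variance. A standard formulation, offered here only as a reminder and not as the authors' own computation, is
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} s_i^{2}}{s_X^{2}}\right),
where s_i^2 is the variance of item i and s_X^2 is the variance of the total test score; with k = 30 items here, the
reported value was \alpha = 0.71.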
Results
The two-way ANOVA results indicated that there was a significant main effect of type of treatment,
F(1, 114) = 38.893, p < .05. Students from both cultural backgrounds who received the paper-form visualization
workshops and constructed visualizations on paper during study time scored significantly better on the
comprehension posttest than did students who received the non-visualization workshops and applied the unguided
study strategy during study time (see Table 1).
Table 1. Two-way ANOVA Summary Table for the Effects of Type of Treatment and
Cultural Background on Comprehension Posttest Scores
* p < .05.
The experimental/paper groups yielded a mean score of 56.5 (SD = 10.74), whereas the control groups
produced a mean score of 44.34 (SD = 11.702). Across cultures, students' learning of visualizing skills and
generation of visualizations on paper during study time had a positive effect on science concept learning with
a large effect size (d = 1.08) (see Table 2).
Table 2. Means and Standard Deviations on Comprehension Posttest Scores Across Cultures
Groups N Mean Standard Deviation Skewness Kurtosis
Control Groups 58 44.34 11.702 0.009 -0.592
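As a check on the reported effect size, Cohen's d can be recomputed from the reported means and standard
deviations. Assuming a simple (unweighted) pooled standard deviation, the across-culture comparison works out as
SD_{pooled} = \sqrt{(10.74^{2} + 11.702^{2})/2} \approx 11.23, \qquad d = (56.5 - 44.34)/11.23 \approx 1.08,
which agrees with the value reported above. The same formula reproduces, to within rounding, the Texan pairwise
effect sizes reported below (approximately 1.5 for paper versus control and 1.2 for computer versus control).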
The one-way ANOVA result indicated that there was a significant difference among the three treatment
groups on the mean scores of the comprehension posttest, F(2, 89) = 20.363, p < .05. The Texan experimental/paper
and Texan experimental/computer groups both produced higher scores on the comprehension posttest than did the
Texan control group. The Tukey HSD post hoc test revealed two pairs of Texan groups whose
means differed significantly from each other at the p < .05 level. The Texan experimental/paper group and the Texan
experimental/computer group scored significantly higher than did the Texan control group. However, the students
in the experimental/paper group performed as well as those students who were in the experimental/computer group
(see Table 3).
Table 3. One-way ANOVA Summary Table for the Effect of Type of Medium for
Generating Visualization on the Comprehension Posttest Scores
Source Sum of Squares df Mean Square F-ratio p
Between Groups 3899.571 2 1949.786 20.363 0.000*
Within Groups 8521.929 89 95.752
Total 12421.500 91
* p < .05.
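The entries in Table 3 are internally consistent: each mean square is the corresponding sum of squares divided by its
degrees of freedom, and the F-ratio is the ratio of the two mean squares,
MS_{between} = 3899.571 / 2 = 1949.786, \qquad MS_{within} = 8521.929 / 89 = 95.752, \qquad
F = 1949.786 / 95.752 \approx 20.363.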
Compared with the mean score of the Texan experimental/computer group, 51.676 (SD = 8.832), and the mean
score of the Texan experimental/paper group, 54.950 (SD = 8.775), the Texan control group yielded a lower mean
score of 39.375 (SD = 11.730). Cohen's d indicated a large positive effect size (d = 1.51) for the pairwise comparison
between the Texan experimental/paper group and the Texan control group.
Additionally, Cohen's d for the pairwise comparison between the Texan students in the
experimental/computer group and those in the control group was 1.20, a large positive effect size. The
effect sizes indicated that the treatments that trained students to visualize on paper and on computers both had
positive effects on the Texan 8th grade students' comprehension of science concepts.
Additionally, cultural background had a significant main effect on these scores, F(1, 114) = 10.303, p < .05
(Table 1). Overall, the Taiwanese students significantly outperformed the Texan students on the comprehension
posttest. The Taiwanese participants produced a mean score of 53.508 (SD = 11.947), compared with a mean score
of 47.431 (SD = 12.884) for the Texan students.
There was no interaction effect of treatment by cultural background, F(1, 114) = 2.696, p = 0.103
(see Table 1). The effect of changing levels of treatment (the control group vs. the experimental/paper group) did
not depend on cultural background, and the effect of changing levels of cultural background (the Texans
vs. the Taiwanese) likewise did not depend on the type of treatment.
Qualitative Findings
Because the degree to which students attended to the learning tasks is an important factor for learning
(Bransford, 2000), it was essential to understand both the way that students used their study time, and their
classroom behaviors. The Texan experimental/paper group was found to be more attentive to the learning tasks than
the Texan control group and the Texan experimental/computer group. Furthermore, the Taiwanese participants were
more conscious of their cognitive processes than their Texan counterparts. Concentration and persistence
characterized the learning process of the Taiwanese students in the experimental/paper group.
Several factors might account for the overall better performance of the Taiwanese participants. First, the
Taiwanese students often felt shame or guilt about poor learning. Many Taiwanese participants expressed that they
would force themselves to study regardless of whether the assigned reading material was challenging, boring, or
interesting.
Moreover, the Taiwanese science teacher tended to give specific directions and close guidance, and was
strict with the students regarding behavior. There was an authoritative relationship between the teacher and the
students. Discipline problems were not commonly found in the Taiwanese classrooms during the course of this
study. However, the Texan science teacher in this study spent a lot of time on classroom management, especially
with the Texan experimental/computer group. With that group, the teacher spent most of the class time roaming to
control student behaviors while the researcher led students in the workshop. In the Texan experimental/paper group,
the teacher spent less time on classroom management. The comparison between the need for classroom
management in the two Texan experimental groups indicated that computers might be distracting to the students. In
order to curb the Texan students' disruptive or offensive behaviors in class, student conduct was graded.
American students generally did not comply with the teacher's directions to the extent that Taiwanese students
did, highlighting a cultural difference (Biggs, 1994; Bond, 1991; Hess & Azuma, 1991).
Conclusion
Student-generated visualization had a positive impact on both Texan and Taiwanese 8th graders’ science
concept learning, but it did not impact one cultural group more than the other. Visualization as a study strategy had
positive effects whether students used paper or computers to generate their visualizations. But students who used
computers to visualize did not outperform those who used paper on the science comprehension posttest. These
findings extend the accumulating evidence on the effectiveness of student-generated visualization in the
improvement of students' concept understanding and further our understanding of cross-cultural learning.
Student-generated visualization on paper and on computers provides a means for students to construct meanings of
science content.
However, an orientation to visualization skills is necessary to prepare students for using visual techniques
to represent interrelationships that are causal, sequential, and comparative. With instructor and researcher
guidance, students who construct visualizations during study time can process the information accurately and
feel more confident in their own knowledge. It is suggested that educators help young, unsophisticated learners
develop expertise in how to learn, so that they can use that expertise to construct useful knowledge within each
subject domain.
References
Biggs, J. (1994). Asian learners through Western eyes: An astigmatic paradox. Australian and New Zealand Journal
of Vocational Education, 2(Part 2), 40-63.
Bond, M. H. (1991). Beyond the Chinese face: Insights from psychology. Hong Kong: Oxford University Press.
Bransford, J. D. (Ed.). (2000). How people learn: Brain, mind, experience, and school. Committee on
Developments in the Science of Learning and Committee on Learning Research and Educational Practice,
Commission on Behavioral and Social Sciences and Education, National Research Council. Washington, DC.
Cifuentes, L. (1992). The effects of instruction in visualization as a study strategy. Unpublished doctoral
dissertation, University of North Carolina, Chapel Hill.
Cifuentes, L., & Hsieh, Y. C. (2001). Computer graphics for student engagement in science learning. TechTrends,
45(5), 21-23.
Cifuentes, L., & Hsieh, Y. C. (2004a). Visualization for middle school students' engagement in science learning.
Journal of Computers in Mathematics and Science Teaching, 23(2), 109-137.
Cifuentes, L., & Hsieh, Y. C. (2004b). Visualization for construction of meaning during study time: A quantitative
analysis. International Journal of Instructional Media, 30(3), 263-273.
Cifuentes, L., & Hsieh, Y. C. (2004c). Visualization for construction of meaning during study time: A qualitative
analysis. International Journal of Instructional Media, 30(4), 407-417.
Duffy, T. M., & Jonassen, D. H. (1991). Constructivism: New implications for instructional technology?
Educational Technology, 31(5), 7-12.
Dunn, R., Gemake, J., Jalali, F., Zenhausern, R., Quinn, P. & Zenhausern, P. (1990). Cross-cultural differences in
the learning styles of fourth-, fifth, and sixth-grade students of African, Chinese, Greek, and Mexican
heritage. Journal of Multicultural Counseling Development, 18, 68-93.
Gobert, J. D., & Clement, J. J. (1999). Effects of student-generated diagrams versus student-generated summaries on
conceptual understanding of causal and dynamic knowledge in plate tectonics. Journal of Research in
Science Teaching, 36(1), 39-53.
Hall, V., Bailey, J., & Tillman, C. (1997). Can student-generated illustrations be worth ten thousand words?
Journal of Educational Psychology, 89(4), 677-681.
Hess, R. D., & Azuma, M. (1991). Cultural support for schooling: Contrasts between Japan and the United States.
Educational Researcher, 20(9), 2-8.
Hillard, A. G., III. (1989). Teachers and cultural styles in a pluralistic society. NEA Today, 7(6), 65-69.
Jonassen, D. H. (1996). Computers in the classroom: Mindtools for critical thinking. Englewood Cliffs: Prentice-
Hall.
Jonassen, D. H. (2000). Computers as mindtools for schools: Engaging critical thinking. Upper Saddle River:
Merrill.
Lajoie, S. P. (1993). Computer environments as cognitive tools for enhancing learning. In S. P. Lajoie & S. J. Derry
(Eds.). Computers as cognitive tools (pp.261-288). New Jersey: Lawrence Erlbaum Associates.
Lee, P-L. H. (1997). Integrating concept mapping and metacognitive methods in a hypermedia environment for
learning science. Unpublished doctoral dissertation, Purdue University, West Lafayette, IN.
More, A. J. (1990). Learning styles of Native Americans and Asians. Paper presented at the Annual Meeting of the
American Psychology Association, 98th, Boston, MA. (ERIC Document Reproduction Service No. ED 330
535).
Riding, R. J. (1994). Cognitive styles analysis. Birmingham, England: Learning and Training Technology.
Riding, R. J., & Douglas, G. (1993). The effect of learning style and mode of presentation on learning performance.
British Journal of Educational Psychology, 63, 273-279.
Schwartz, D. L. (1993). The construction and analogical transfer of symbolic visualization. Journal of Research in
Science Teaching, 30(10), 1309-1325.
Shulman, L.S. (1997). Disciplines of inquiry in education: A new overview. In R.M. Jaeger (Ed.), Complementary
methods for research in education (pp. 3-29). Washington, D.C.: American Educational Research
Association.
Strauss, E., & Lisowski, M. (1998). Biology: The web of life. Menlo Park, CA: Scott Foresman-Addison Wesley.
Tufte, E. R. (1990). Envisioning information. Cheshire, CT: Graphics Press.
Vasquez, J. A. (1990). Teaching to the distinctive traits of minority students. The Clearing House, 63, 299-304.
Wileman, R. E. (1993). Visual communicating. Englewood Cliffs: Educational Technology Publications.
Williamson, J. W., Jr. (1999). Mental models of teaching: Case study of selected pre-service teachers enrolled in an
introductory educational technology course. Unpublished doctoral dissertation, University of Georgia,
Athens.
Self-Regulation in Web-Based Distance Education: From a Requirement to an
Accomplishment
Haihong Hu
Florida State University
In web-based distance learning courses, individuals are able to participate at their convenience with little to
no supervision. The learner control inherent in these courses is usually considered a positive feature that enhances
motivation (Reeves, 1993). However, research has shown that learner control is associated with a number of
negative outcomes, such as less time spent on task and the use of poor learning strategies (K. G. Brown, 2001;
Williams, 1993). Bernt & Bugbee (1993) found that distance learning students who have not completed college are
at risk because they lack metacognitive or executive skills for approaching coursework and taking examinations. A
number of researchers (Keller, 1999; McMahon & Oliver, 2001; Zimmerman, 2000) have proposed utilizing self-
regulatory strategies, particularly goal setting (Keller, 1999) and self-evaluation (McMahon & Oliver, 2001), to
promote online learners’ motivation and ultimately learning. Some other researchers (Ley & Young, 2001) have
even provided specific guidelines for implementing these strategies in an online setting. Due to the importance of
completion and achievement in distance education, strategies that promote motivation and learning warrant
investigation.
Self-Regulated Learning
What is Self-Regulated Learning?
Zimmerman (1990) defines self-regulated learning with three distinctive features: learners’ application of
self-regulated learning strategies, their sensitivity to self-evaluative feedback about learning effectiveness, and their
self-generated motivational processes. He (Zimmerman, 1998) differentiates academic self-regulation from a mental
ability, such as intelligence, or an academic skill, such as reading proficiency. He suggests that it is a “self-directive
process through which learners transform their mental abilities into academic skills” (Zimmerman, 1998, p. 2).
From a social cognitive point of view, self-regulatory processes and beliefs consist of three cyclical phases:
forethought, performance or volitional control, and self-reflection (Zimmerman, 1998, 2000). According to
Zimmerman (1998), the forethought phase happens before efforts to learn and sets the stage for learning.
Performance or volitional control processes occur during learning efforts and concern concentration and
performance. Self-reflection processes take place after learning efforts and affect learners' reactions to that
experience. These self-reactions complete the self-regulatory cycle by influencing forethought for
subsequent learning efforts (Zimmerman, 1998).
Similarly, from an information processing point of view, Butler and Winne (1995) hold that self-regulated
learners engage in the cognitive processes central to self-regulation reflectively, flexibly, and recursively.
They continually reformulate their learning activities as they plan, monitor, and modify
their engagement in learning tasks (Butler & Winne, 1995, as cited in Butler, 1998).
Self-Regulated Learners
“All learners try to self-regulate their academic learning and performance in some way”, but there are
remarkable differences among students (Zimmerman, 1998, p. 6).
Expert learners are strategic, self-regulated, and reflective (Ertmer & Newby, 1996). They demonstrate
planfulness, control, and reflection; they are aware of the knowledge and skills they have, or are missing, and use
appropriate strategies to actively apply or acquire them.
Less effective self-regulated learners often have trouble monitoring and regulating their cognition. They
are not aware of their loss of attention and comprehension, and they do not self-evaluate their comprehension. They
often set distal and global goals, which can interfere with their learning. They sometimes have problems regulating
their motivation and affect for learning. They may doubt their ability to succeed in studying, and they may have high
test anxiety. Their high level of anxiety may cause them to use simple cognitive strategies, such as memorization,
rather than deeper processing strategies for learning. They often compare themselves with other students instead of
with their own previous performance (Pintrich, 1995).
Therefore, students' level of self-regulation ultimately determines whether their learning experiences will
become destructive or fulfilling. Once established, personal cycles of self-regulation, whether skillful or naïve, are
difficult to change without interventions that address their inherent qualities. Awareness of the importance of
self-regulation is the foundation for students to assume responsibility for their own academic achievement
(Zimmerman, 1998).
All learners are self-regulated to some degree, but effective learners are distinguished by their awareness of
the relationship between self-regulatory strategies and learning outcomes and their use of these strategies to reach
their academic goals (Zimmerman, 1990). Systematic use of metacognitive, motivational, cognitive and/or
behavioral strategies is a key feature of most self-regulated learners.
environment. He divided self-regulatory strategies into four subcategories of metacognitive, cognitive, self-
management and motivational strategies. Planning and monitoring (including self-evaluation) are classified in the
metacognitive category. Results showed that metacognitive and motivational strategies significantly influenced the
prediction of achievement, while cognitive and self-management strategies did not show significant effects.
Metacognitive and motivational strategies were the most influential on achievement in a computer-networked
hypertext/hypermedia learning environment. The researcher also suggested that learners should develop
metacognitive and motivational strategies before they study in such a learning environment and that instructional
designers might consider integrating metacognitive and motivational strategies into computer-networked
hypertext/hypermedia learning environments.
Sankaran & Bui (2001) investigated how learning strategies and motivation influence performance in Web
and face-to-face lecture settings of a business information systems course. Learning strategies and motivation beliefs
were measured using a survey instrument, and learning performance by test scores. The researchers found that using
either a deep or a surface learning strategy leads to equivalent positive performance, but an undirected strategy, such as
cramming for exams at the last minute, affects performance negatively. While motivation is significantly correlated
with performance in both web and face-to-face settings, the relationship is stronger in the Web setting. High
motivation is related to the use of a deep learning strategy, and low motivation to an undirected strategy.
Greene, Dillon, & Crynes (2003) conducted a study to examine student performance and approaches to study in a
CD-ROM version of a chemical engineering course. This study consisted of three phases. Phase 1 was a formative
evaluation of the CD-ROM approach, and the result supported the validity of the CD-ROM based instruction. In
phase two, the researchers interviewed both successful and less successful students in the course to examine any
differences in their strategies for learning the content. Researchers found differences consistent with a surface versus
deep approach to studying. During the third phase, the researchers used an Approaches to Learning Instrument with
a new group of students to determine the factors that contribute to success in the CD-ROM version of the course.
Results illustrated that deep cognitive engagement and motivation, defined in terms of goals and self-efficacy, were
significant predictors of success based on two indices of course performance.
Sankaran and Bui (2001) summarize that "students who choose distance education need a high level of
motivation if they are to complete the course work successfully. During their studies, they often have to work by
themselves with little or no opportunities for face to face interaction. They will have to deal with more abstract and
ambiguous situations than someone taking a lecture class. They need to be efficient in time management, be
responsible and in control of their studies and maintain an image of self-worth and self-efficacy. They should see the
value of education and be able to postpone current enjoyments and cope with interruption life frequently entails.”(p.
193). It is not surprising that Greene et al. (2003) suggested that "although technology provides opportunities for learners
to learn in increasingly independent environments, educators need to prepare students to learn independently using
newer electronic technologies" (p. 2). In addition, the researchers recommended that instruction should promote more
metacognitive processing throughout the modules, and that designers of more independent learning environments
must also incorporate approaches that help students learn how to learn, because the growth of distributed learning in
education will continue to place more responsibility for learning upon the learner.
could make “nonorganizers” aware of the structure and embedded processes within the distance education format,
and the unique aspects of themselves as learners that may assist or hinder their academic progress within that format.
It was expected (Atman, 1990) that training in self-regulatory skills might enhance the potential for "nonorganizers"
to become academically successful.
Similarly, to address students' withdrawal from the distance program offered by the Department of
Instructional & Performance Technology (IPT) at Boise State University, Chyung (2001) applied the
Organizational Elements Model (OEM) (Kaufman, 1988, 2000) and the ARCS model (Keller, 1987) to redesign the
online instruction, reducing the attrition rate from 44% in fall 1996 to 15% by the end of the 1999-2000 academic
year. Several items within the interventions based on the ARCS model were designed to
enhance the goal- and performance-oriented characteristics of online learners. They stated weekly goals clearly and
explained why it was important to achieve the goals; defined clearly how learners would be assessed; offered specific
guidance on how to successfully achieve the goals; and provided concrete and constructive feedback in a timely
manner on how learners were doing.
Online instruction requires students to develop a stronger sense of competence through completing self-
directed assignments (Parker, 2003). Self-regulated learning strategies are critical in assisting students to cultivate
the self-management skills, which are necessary in independent study. In addition, it has been illustrated that self-
motivation and even student persistence can be enhanced with well-designed online instruction having an emphasis
on goal-related behaviors. Since students must be responsible for their own time management, skill building, and
eventual academic success, web-based instruction lends itself to the belief that self-regulated learning strategies
might be one of the answers to the problems of high attrition rates and low motivation.
Kauffman (2003) conducted a study to investigate strategies teachers can use to improve self-regulated
learning in a web-based setting. In this study self-regulated learning is defined as "a learner's intentional efforts to
manage and direct complex learning activities" and consists of three major components: "cognitive
strategy use, metacognitive processing, and motivational beliefs". These three components were operationalized
as "note-taking methods (cognitive component), self-monitoring prompts (metacognitive component),
and self-efficacy building statements (motivation component)". One hundred nineteen participants were randomly
assigned to the cells of a 2x2x2 design. Students took notes in either a matrix or a free-form method from a web site
that teaches educational measurement, and they either were or were not provided with self-monitoring prompts
and self-efficacy building statements. Findings revealed that note-taking method had the strongest impact on both the
amount of information gathered and achievement. Moreover, both academic self-efficacy building statements and
self-monitoring prompts displayed modest effects on achievement.
Principle 1: Promote Learners’ Metacognitive Awareness of Their Behavior, Motivation, and Cognition
First, it is suggested (Pintrich, 1995) that for students to become self-regulated learners, it is necessary that
they become more aware of their behavior, motivation, and cognition by reflecting on these aspects of their
learning. Self-reflection is a fairly hard task for most individuals. Students are required to have better awareness of
their own behavior, motivation, and cognition (Pintrich, 1995). Hattie, Biggs, and Purdie (1996) also recommended
that training for complex learning strategies should promote a high degree of learner activity and metacognitive
awareness. The researchers (Hattie et al., 1996) suggest that students need to know not only what those strategies
are, but also the conditional knowledge, the how, when, where, and why of their use, which enables them to
make use of the strategies effectively. It is impossible for learners to observe their own behaviors, detect flaws, and
modify their learning strategies if they do not have a keen perception of their own motivation and cognition.
Metacognitive awareness is the prerequisite for learners to be able to learn from and utilize the self-regulated
learning strategies that interventions are intended to teach.
It is suggested (Pintrich, 1995) that standardized assessment instruments, such as the Motivated Strategies
for Learning Questionnaire (Pintrich, Smith, Garcia, & McKeachie, 1991) or the Learning and Study Strategies
Inventory (Weinstein & Palmer, 2002), can be used to provide students with an overview of their motivational
beliefs and learning strategies.
Principle 3: Provide Ample Opportunities for Learners to Practice Self-Regulated Learning Strategies and Feedback
about Strategy Effectiveness.
Third, two crucial elements are practice of self-regulatory strategies and feedback on strategy effectiveness.
These mechanisms facilitate learning and motivation by communicating learning progress, and they also enhance
strategy transfer and maintenance (Schunk & Zimmerman, 1998).
Pintrich (1995) also pointed out the need for students to practice self-regulatory learning strategies. He
commented that developing into a self-regulating learner could not be accomplished in a short period of time.
Students need time and opportunity to cultivate their self-regulatory strategies. Classroom tasks can be adapted as
opportunities for student self-regulation. Other tasks that college students deal with should also be arranged to
provide them with opportunities for self-regulation. For example, a professor can provide students with “a selection
of essay questions or topics within a given list”, which “allows students some control over their work while
preserving integrity of the curriculum content” (Pintrich, 1995, p. 9).
Ertmer and Newby (1996) echo this notion by claiming that it is not sufficient to simply inform students what
expert learners know or even to display the actions that expert learners take, because a great deal of what experts know
and do is neither observable nor easily available to the student. Even if a student completely understands the expert
learning process declaratively, extensive practice is still needed to implement it automatically and
effectively. Therefore, extensive long-term practice and feedback are considered vital for the development of expert
learning. Many years of performing metacognitive and regulatory skills in the context of meaningful
learning activities are the only way to develop expertise in learning (Ertmer & Newby, 1996).
Principle 4: Incorporate Motivational Processes, Especially Positive Beliefs, into Instruction.
Fourth, motivation plays an important role in developing self-regulated learning. To participate in self-
regulation requires that students have the motivation or willingness to learn over extensive periods (Schunk &
Zimmerman, 1998). Motivation is essential because it plays a mediating role in the effect of self-regulated learning
strategies.
It is necessary for students to have positive motivational beliefs (Pintrich, 1995). Having a mastery
orientation and a focus on learning and understanding the material is much more facilitative of self-regulated
learning. Another motivational belief that promotes self-regulated learning is positive self-efficacy for learning.
Self-efficacy beliefs should be neither excessively pessimistic nor overly optimistic. Students should have a
reasonably accurate and positive sense that they can learn and master course material (Pintrich, 1995). Providing
information about strategy value also serves a valuable motivational function (Schunk & Rice, 1992, as cited in
Schunk & Zimmerman, 1998).
Ironically, Hattie et al. (1996) pointed out that affect is much more open to change by intervention. Study
skill training is usually more effective in enhancing attitudes than in improving study skills. In addition, the most
striking improvements in the affective domain happen with attribution training. In attribution training, students are
taught to change their attributions for success and failure from maladaptive ones (success due to effort, failure due to
lack of ability) to adaptive ones (success due to ability, failure due to lack of effort). When the purpose
is to change students’ attributions, teachers should stress the importance of using appropriate strategies
systematically. Teachers’ feedback to students about their use of strategies will probably affect their attributions
more than will feedback concerning either ability or effort (Hattie et al., 1996).
Principle 5: Integrate Self-Reflective Practice Systematically with Instruction on Self-Regulated Learning Strategies.
Finally, across interventions there is an emphasis on self-reflective practice, “where student practice skills
and reflect on their performance”. Self-reflective practice often is built into the instructional procedure with
independent practice or time for self-reflection (Schunk & Zimmerman, 1998).
Schunk and Zimmerman (1998) also pointed out that, ideally, self-reflective practice offers students
opportunities to review their learning process and the effectiveness of their strategies, modify their approach as needed,
and make alterations to environmental and social factors to set up a setting conducive to learning. The need for self-
reflective practice may vary across settings. Self-reflective practice may be less imperative where feedback is
provided frequently and self-assessment is clear-cut. In environments with less structure, student self-reflection may
serve more helpful functions. Systematic forethought, such as adopting a learning goal orientation, prepares a student
for the most effective forms of self-reflection, for instance a strategy attribution rather than a fixed-ability attribution.
As a result, it is suggested (Schunk & Zimmerman, 1998) that "self-reflection can be systematically developed by
training in forethought and performance or volitional control" (p. 230).
the likelihood of transfer by persuading students to be metacognitive and reflective about their strategy use in not
only the study skill course, but also other disciplinary courses.
Similarly, Osman and Hannafin (1992) discuss detached content-independent strategies (DCIS), which are
generic strategies taught independently, without relationship to particular lesson content. DCIS methods focus on
diverse contexts and lesson content for applying strategies during training, and support skills that are applicable to
various academic subjects and learning tasks. Therefore, they are necessary when strategy generalization is
preferred. The major purpose of DCIS approaches is to assist students to become independent learners gradually.
Some of the implications for designing DCIS instruction (Osman & Hannafin, 1992) include: 1) ensuring
that metacognitive strategies do not demand too many cognitive resources; 2) using explicit versus implicit
strategies differently for younger versus older learners, and using higher-order strategies for more mature learners and
those with relevant prior knowledge; 3) detaching metacognitive training and using varied lesson content when far
transfer is desired; 4) stressing not only knowledge about strategies, but also methods for maintaining and
transferring strategies; 5) focusing on learner characteristic variables in addition to task and strategy variables; 6)
encouraging learners to share their experience by describing their learning processes, evaluating their performance,
and providing feedback to each other; and 7) integrating matching strategies that include both transferable strategies
and direct manipulation of the lesson content.
Ley & Young (2001) also suggested principles for embedding support in instruction to promote regulation
in less expert learners. These principles were considered suitable for supporting self-regulation regardless of content,
media, or a specific population, and they could be utilized systematically in diverse settings including print-based,
instructor-led instruction, and synchronous or asynchronous Web-based instruction. These four principles (Ley &
Young, 2001) are based on research on self-regulation components and can be adapted for adjunct self-regulated
learning strategy training: 1) directing learners to arrange an effective learning environment; 2) teaching learners to
organize instruction and activities to support cognitive and metacognitive processes; 3) guiding learners to use
instructional goals and feedback as opportunities for monitoring; 4) educating learners to seek opportunities for
continuous evaluation and chances to self evaluate.
Summary
In a learner-centered environment, such as web-based online instruction, learners do not automatically
possess the metacognitive skills required to make independent judgments and selections about how to learn (C.
Brown, Hedberg, & Harper, 1994). However, the work of Hofer et al. (1998) suggests that an intervention targeting a
range of cognitive and motivational components can benefit college students who need help with metacognitive
skills. A meta-analysis (Hattie et al., 1996) recommended that strategy training should be a balanced system in
which "an individual's abilities, insights, and sense of responsibility" (p. 131) are brought into full play, so as to
employ appropriate strategies to tackle the task at hand most effectively. If this is the case and the intervention is
effective, then the withdrawal of social support (Schunk & Zimmerman, 1998), such as the use of mentors in web-
based online courses, may become possible as students become more competent in self-regulation.
Based on this review of instructional design principles, a self-regulated learning strategy training program, made
up of adjunct web-based online instruction on self-regulated learning and opportunities for self-reflective practice
of self-regulated learning strategies, might help improve distance learners' metacognitive skills.
In the author's vision, learners' existing motivational issues can be addressed throughout the whole intervention, and
the instruction can be adapted to facilitate students' revision of problematic self-regulated learning processes. The
intervention can be a semester-long multistrategy program that teaches a range of cognitive, metacognitive, and
motivational strategies so that students have both the "skill" and the "will" to use the strategies properly. The
instruction can draw on the principles of modeling and self-explanation and contain ample opportunities for
practice and feedback. Every step of the intervention should serve the purpose of promoting learners'
metacognitive awareness. With the assistance of an intervention like this, it might be possible to accomplish the task
of preparing web-based distance learners with the self-regulated learning skills required for academic success.
References
Atman, C. S. (1990). Psychological type elements and goal accomplishment style: Implications for distance
education. In Contemporary issues in American distance education. Oxford: Pergamon Press.
Azevedo, R., Comley, J. G., Thomas, L., Seibert, D., & Tron, M. (2003, April). Online process
scaffolding and students' self-regulated learning with hypermedia. Paper presented at the meeting of the American
Educational Research Association, Chicago, IL.
Bernt, M. D., & Bugbee, A. C. (1993). Study practices and attitudes related to academic success in a distance
learning programme. Distance Education, 14(1), 97-112.
Brown, C., Hedberg, J., & Harper, B. (1994). Metacognition as a basis for learning support software. Performance
Improvement Quarterly, 7(2), 3-26.
Brown, K. G. (2001). Using computers to deliver training: Which employees learn and why? Personnel Psychology,
54, 271-296.
Butler, D. L. (1998). A strategic content learning approach to promoting self-regulated learning by students with
learning disabilities. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning : From teaching to
self-reflective practice. New York, NY: The Guilford Press.
Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of
Educational Research, 65, 245-281.
Chyung, S. Y. (2001). Systematic and systemic approaches to reducing attrition rates in online higher education. The
American Journal of Distance Education, 15(3), 36-49.
Eom, W. (1999). The effects of self-regulated learning strategy on academic achievement in a computer-networked
hypertext/hypermedia learning environment. Florida State University, Tallahassee.
Ertmer, P. A., & Newby, T. J. (1996). The expert learner: Strategic, self-regulated, and reflective. Instructional
Science, 24, 1-24.
Greene, B. A., Dillon, C., & Crynes, B. (2003). Distributive learning in introductory Chemical Engineering:
University students' learning, motivation, and attitudes using a CD-ROM. Unpublished manuscript.
Hagen, A. S., & Weinstein, C. E. (1995). Achievement goals, self-regulated learning, and the role of classroom
context. New Directions for Teaching and Learning(63), 43-55.
Hartley, K. (2001). Learning strategies and hypermedia instruction. Journal of Educational Multimedia and
Hypermedia, 10(3), 285-305.
Hattie, J., Biggs, J., & Purdie, N. (1996). Effects of learning skills interventions on student learning: A meta-
analysis. Review of Educational Research, 66(2), 99-136.
Hofer, B. K., Yu, S. L., & Pintrich, P. R. (1998). Teaching college students to be self-regulated learners. In D. H.
Schunk & B. J. Zimmerman (Eds.), Self-Regulated learning: From teaching to self-reflective practice. New
York, NY: The Guilford Press.
Hythecker, V. I., Rocklin, T. R., Dansereau, D. F., Lambiotte, J. G., Larson, C. O., & O'Donnell, A. M. (1985). A
computer-based learning strategy training module: Development and evaluation. Journal of Educational
Computing Research, 1(3), 275-283.
Kauffman, D. F. (2003). Self-regulated learning in Web-based environments: Instructional tools designed to
facilitate cognitive strategy use, metacognitive processing, and motivational beliefs. Unpublished
manuscript.
Kaufman, R. (1988). Preparing useful performance indicators. Training & Development Journal, 42, 80-83.
Kaufman, R. (2000). Mega planning: Practical tools for organizational success. Thousand Oaks, CA: Sage.
Keller, J. (1987). Development and use of the ARCS model of instructional design. Journal of Instructional
Development, 10(3), 2-10.
Keller, J. (1999). Motivation in cyber learning environments. International Journal of Educational Technology, 1(1),
7-30.
King, F. B., Harner, M., & Brown, S. W. (2000). Self-regulatory behavior influences in distance learning.
International Journal of Instructional Media, 27(2), 147-155.
Kitsantas, A., Zimmerman, B. J., & Cleary, T. (2000). The role of observation and emulation in the development of
athletic self-regulation. Journal of Educational Psychology, 92(4), 811-817.
Ley, K., & Young, D. B. (2001). Instructional principles for self-regulation. Educational Technology Research
& Development, 49(2), 93-103.
McMahon, M., & Oliver, R. (2001). Promoting self-regulated learning in an on-line environment. Paper presented at
the ED-Media 2001 World Conference on Educational Multimedia, Hypermedia & Telecommunications,
Tampere, Finland.
Orange, C. (1999). Using peer modeling to teach self-regulation. The Journal of Experimental Education, 68(1), 21-
39.
Osman, M. E., & Hannafin, M. J. (1992). Metacognition research and theory: Analysis and implications for
instructional design. Educational Technology Research & Development, 40(2), 83-99.
Parker, A. (2003). Identifying predictors of academic persistence in distance education. Journal of the United States
Distance Learning Association, January 2003.
Pintrich, P. R. (1995). Understanding self-regulated learning. New Directions for Teaching and Learning, 63, 3-12.
Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1991). A manual for the use of the Motivated
Strategies for Learning Questionnaire (MSLQ). Ann Arbor, MI: The University of Michigan.
Reeves, T. C. (1993). Pseudoscience in computer-based instruction: The case of learner control research. Journal of
Computer-Based Instruction, 20, 121-135.
Sankaran, S. R., & Bui, T. (2001). Impact of learning strategies and motivation on performance: A study in web-
based instruction. Journal of Instructional Psychology, 28(3), 191-198.
Schunk, D. H., & Zimmerman, B. J. (1998). Self-regulated learning: From teaching to self-reflective practice. New
York, NY: The Guilford Press.
Ulitsky, H. (2000). Language learner strategies with technology. Journal of Educational Computing Research, 22(3),
285-322.
Weinstein, C. E., & Palmer, D. R. (2002). User's manual: Learning and Study Strategies Inventory (LASSI) (2nd
ed.). H & H Publishing Company.
Williams, M. D. (1993). A comprehensive review of learner-control: The role of learner characteristics. Paper
presented at the Annual Conference of the Association for Educational Communications and Technology,
New Orleans, LA.
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An Overview. Educational
Psychologist, 25(1), 3-17.
Zimmerman, B. J. (1998). Developing self-fulfilling cycles of academic regulation: An analysis of exemplary
instructional models. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning: From teaching to
self-reflective practice. New York, NY: The Guilford Press.
Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P. R. Pintrich
& M. Zeidner (Eds.), Handbook of self-regulation. San Diego, CA: Academic Press.
Will Self-Regulated Learning Strategy Training Be Useful In a Web-Based
Distance Education Course?
Haihong Hu
Florida State University
Web-based distance instruction has become a popular approach for education. The purpose of this study is
to examine the effects of self-regulated learning (SRL) strategy training on students’ performance, motivation and
strategy use in a web-based course. This study may enable the researcher to determine whether SRL strategies are
teachable in a distance learning environment and to demonstrate whether this kind of training can be effective in
improving learning outcomes and completion rates.
and even student persistence may be enhanced with well-designed online instruction with an emphasis on goal-
related behaviors. Since students must take the responsibility for their own time management, skill building, and
eventual academic success, self-regulated learning strategies might be one of the keys to resolve the issues of high
attrition rate and low motivation in web-based instruction.
Over the last decade, learners’ self-regulation of their cognition, motivation, and behaviors to promote
academic achievement has been a topic receiving increasing attention in the field of education (Zimmerman, 1989,
1990, 1998). Driscoll (2000) refers to self-regulation as skills that learners use to “set their own goals and manage
their own learning and performance" (p. 304). Learners who reported more extensive use of self-regulated learning
strategies, such as goal setting and self-evaluation, demonstrated higher academic achievement than learners who
used the same strategies less often (Zimmerman & Martinez-Pons, 1986). It seemed that self-regulated learning
strategies did play an important role in students’ academic achievement (Zimmerman, 1990).
Zimmerman (2000) defines self-regulation as “the self-generated thoughts, feelings, and actions that are
planned and adapted cyclically to the attainment of personal goals”. He describes this concept as cyclical because
learners use the feedback from prior performance to make adjustments during their current efforts.
Pintrich (1995) emphasizes the regulation of three general aspects of learning in his interpretation of self-
regulated learning. First, learners self-regulate their behavior, including the control of various resources such as
time, the study environment, and the use of others such as peers and faculty members for help (Garcia &
Pintrich, 1994; Pintrich, Smith, Garcia, & McKeachie, 1993, as cited in Pintrich, 1995). Second, learners self-
regulate motivation and affect by controlling and modifying motivational beliefs such as efficacy and goal
orientation to adapt to the demands of a course. Third, learners self-regulate their use of various cognitive strategies
to achieve learning outcomes (Pintrich, 1995).
Published studies that examine the effects of self-regulated learning on achievement and motivation in
various learning domains include the work of Albert Bandura, Dale Schunk, Barry Zimmerman and others. The
literature supports the notion that learners’ self-regulation is a powerful predictor for their academic achievement
(Ley & Young, 1998; Pintrich & Groot, 1990). In addition, self-regulation of learning progress is found to have a
positive effect on learners’ motivation (Kitsantas, Reiser, & Doster, 2003; Lan, 1996; Schunk, 1996; Zimmerman &
Kitsantas, 1996) and perception of responsibility (Garcia & Pintrich, 1991; Heo, 1998).
New learning environments, such as web-based instruction, require proactive and active learning to
construct knowledge and skills. As Schunk and Zimmerman (1998) noted, "an area that lends itself well to
self-regulation is distance learning, where instruction originates at one site and is transmitted to students at distant
sites… Self-regulation seems critical due to the high degree of student independence deriving from the instructor’s
physical absence”(p. 231-232). One critical feature of active learners is the responsibility for learning. Therefore,
learner responsibility for learning and ownership must increase for learners to achieve academic success. It was
found that engagement in self-regulated learning could increase learner perception of responsibility (Garcia &
Pintrich, 1991; Heo, 1998). Therefore, one way to increase learner responsibility is to help learners develop self-
regulated learning strategies and engage them in self-regulated learning activities.
Self-regulated learning is appropriate to the college context. Most college students have more control over
their own time and schoolwork and need to decide how they actually carry out their studying (Pintrich, 1995), and the
traditional academic environment rarely encourages the use or development of self-regulatory skills (Orange, 1999).
At the same time, many college students have difficulty managing this freedom in terms of the amount of time they
spend on learning and the quality of cognitive effort they exert because “they had few opportunities to become self-
regulated in their elementary and secondary school years, and as a result, they have few if any self-regulatory skills
and strategies” (Orange, 1999, p. 36-37). It may be hard for busy college students, especially those with job and
family responsibilities, to find time to learn to use self-regulation strategies. Not surprisingly, they are less motivated
than students with fewer responsibilities. This is why it is suggested (Orange, 1999) that self-regulation strategies
should be taught at all levels of education, and that research on self-regulated learning might be especially relevant
to college students (Pintrich, 1995).
Fortunately, Pintrich (1995) pointed out that "self-regulated learning is teachable" (pp. 7-9). Teachers can
teach students to become self-regulated learners, and "students can learn to be self-regulated." Students learn self-
regulated learning strategies through experience and self-reflection. Pintrich claimed, "Self-regulated learning is
controllable. Students can control their behavior, motivation and affect, and cognition in order to improve their
academic learning and performance." He believed that "students should accept responsibility for their own learning
and realize that they have the potential to control their own learning" (Pintrich, 1995, pp. 7-9).
Self-regulation has been identified as one of the important mechanisms that may promote learner
achievement and motivation, which are major determinants of completion rates and satisfaction in web-based
distance learning courses. But few studies have empirically examined the role of self-regulated learning strategy
training in online courses. The underlying theoretical relationship between self-regulation and achievement and
motivation in the web-based environment has not been extensively explored. This scarcity of empirical studies
might be due to the fact that online instruction is a comparatively new phenomenon, or to the technical competencies
required to design, develop, and implement experimental treatments online.
Method
The primary purpose of this study is to examine the effects of self-regulated learning strategy training on
students’ learning performance, self-efficacy, self-satisfaction and strategy use in a web-based academic course.
The following specific research questions were explored in this study:
Question 1. Will self-regulated learning strategy training in a web-based distance learning course influence
learning performance?
Question 2. Will self-regulated learning strategy training in a web-based distance learning course influence
learner motivation (task value, self-efficacy and self-satisfaction)?
Question 3. Will self-regulated learning strategy training in a web-based distance learning course influence
learners’ strategy use?
Participants
This pilot study occurred at a Southeastern university in the Summer 2005 term with 12 volunteers from one
Information Studies online course. These volunteers had an average age of 24.58 and an average GPA of 2.89, were
registered for an average of 9 credit hours, spent an average of 4.38 hours weekly studying for this course,
worked for pay for an average of 20-30 hours per week, and used a computer for an average of 14-15 hours weekly. Six of
these volunteers were seniors, 5 juniors, and 1 a sophomore. Eight of them were female and 4 male. Five were
African-American, 1 Asian-American, 3 Caucasian, 2 Hispanic, and 1 from another ethnic background. Seven of
these volunteers had never taken an online course before, 3 had taken 1, one had taken 2, and one had taken 4. Ten of
them did not do any volunteer work, and 2 of them volunteered 1-10 hours per week. Seven of them had no family
responsibilities affecting their time for studying for this course. Three of them paid tuition for this course
themselves, 7 relied on financial assistance, and 2 had their tuition paid by parents or grandparents.
Eleven of the volunteers considered themselves competent with the PC and Mac, 11 with the Internet, 12 with email, 8 with
asynchronous and 7 with synchronous discussion.
Because the major purposes of the pilot study were to test the validity of the experimental materials and
procedures, and there was only a limited number of student volunteers, the researcher assigned all volunteers to the
experimental condition at the beginning of the intervention. Unfortunately, there was still a very high attrition rate
among participants during the study, and only 5 of them continued until the end.
Instructional Course
Beginning beyond the computer literacy level, the Technologies for Information Services course focuses on
the application of computer hardware, software, and information systems for the provision of information services.
It highlights features and offers up-to-date coverage of technical developments with examples of real-world software
applications. It also examines the principles by which computer systems and their networks support information
seekers.
Students’ grades were made up of 1) participation (10%), 2) weekly activities (50%), 3) project resource
list (10%) and 4) final project (30%). Weekly assignments were assessed according to the criteria that were included
in each assignment description. In addition, points were deducted for reasons including (but not limited to) the
following: sloppiness; excessive typographical errors or misspellings; egregious grammatical errors; failure to
adhere to assignment requirements; lack of attention to detail. Weekly activities submitted after the due date did not receive
full credit; a 20% late penalty was assessed. All grades were given either on a 4-point scale or on a 100-point scale.
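As a concrete illustration of how such a weighted grading scheme works, the short Python sketch below computes a course grade on a 100-point scale from the stated component weights and applies the 20% late penalty to late weekly activities. The scores, the helper names, and the assumption that the penalty applies to each late assignment are hypothetical and are not taken from the course records.

# Illustrative only: weights from the syllabus; the scores below are hypothetical.
WEIGHTS = {"participation": 0.10, "weekly_activities": 0.50,
           "project_resource_list": 0.10, "final_project": 0.30}
LATE_PENALTY = 0.20  # 20% deducted from a weekly activity submitted late

def weekly_average(scores, late_flags):
    """Average the weekly-activity scores (0-100), applying the late penalty."""
    adjusted = [s * (1 - LATE_PENALTY) if late else s
                for s, late in zip(scores, late_flags)]
    return sum(adjusted) / len(adjusted)

def course_grade(participation, weekly_scores, weekly_late, resource_list, final_project):
    components = {"participation": participation,
                  "weekly_activities": weekly_average(weekly_scores, weekly_late),
                  "project_resource_list": resource_list,
                  "final_project": final_project}
    return sum(WEIGHTS[k] * v for k, v in components.items())

# Example: one late weekly assignment out of four (all numbers invented).
print(course_grade(90, [85, 78, 92, 88], [False, True, False, False], 80, 86))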
Intervention Materials
This self-regulated learning strategy training includes two sections: a web-based tutorial on self-regulated
learning strategies and a strategy application practice using sets of online forms.
The purpose of this web-based tutorial is to teach self-regulated learning strategies. The content of this
tutorial focuses on what self-regulated learning strategies are, more specifically, what metacognitive, motivational,
cognitive and resource-management strategies are, as well as examples of them. Besides introducing different types
of self-regulated learning strategies, this tutorial also introduces when and how to use them.
In addition to the knowledge about self-regulated learning strategies, this tutorial also provides participants
with opportunities to practice the knowledge about strategies that they just learned.
After completing the practice on the knowledge about self-regulated learning strategies, learners were
required to actually apply the strategies in their studying of the academic content of the course. The main purpose of
this practice was for learners to apply the strategic planning and self-evaluation strategies, through the use of online
forms, to specific tasks within a study period (such as 2 weeks).
Procedures
The experiment took 7 weeks of one semester of the course to complete, due to the nature of the
distance education environment and the content of the training (learning strategies). The original instruction of the course
was used for the purposes of the experiment, and the experiment did not affect the content of the instruction in any
way.
This study consisted of 4 phases. In Phase 1, the researcher solicited participation and collected initial data,
including demographic information, task value, self-efficacy and level of strategy use through the use of online
forms. This phase lasted for 1 week.
In Phase 2, the participants went through the web-based SRL strategy tutorial. Participants’ attendance in
the tutorial was measured using ungraded exercises for each section of the tutorial. Participants’ knowledge of SRL
strategies was measured using a graded test included in the tutorial. This phase also lasted for 1 week.
In Phase 3, the participants took part in the strategy application practice. Participants practiced strategic
planning and self-evaluation strategies using the study plan and self-evaluation online forms. This phase lasted for
about 2 weeks, which is considered as a study period.
In Phase 4, the researcher conducted an Evaluation of Outcome. Post-experiment data, including task value,
self-efficacy, self-satisfaction and level of strategy use were measured. Data on performance of learning (assignment
scores) were also collected for analysis. This phase lasted for about 0.5 week.
Results
Results are discussed below by student learning performance, motivation and strategy use.
Learning Performance
The mean scores for cumulative assignment grades were 15.60 for the 5 participants in the experimental
group, and 12.35 for the 20 participants in the control group, who made up the remainder of the class but did not
take part in any activity of the study.
The Mann-Whitney Test conducted on the cumulative assignment grades yielded a non-significant result,
U= 37.00, p> .05. This test revealed that the difference in the cumulative assignment grades between the
experimental and control group was not statistically significant.
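For readers who want to reproduce this kind of between-group comparison, the following Python sketch runs a Mann-Whitney U test on two independent sets of cumulative assignment grades with scipy. The score vectors below are hypothetical placeholders, not the study’s data.

from scipy.stats import mannwhitneyu

# Hypothetical cumulative assignment grades; not the actual study data.
experimental = [17.0, 14.5, 16.0, 15.5, 15.0]          # n = 5
control = [12.0, 13.5, 11.0, 14.0, 12.5, 10.5, 13.0,
           12.0, 11.5, 14.5, 13.0, 12.5, 11.0, 13.5,
           12.0, 10.0, 14.0, 12.5, 13.0, 11.5]          # n = 20

u_stat, p_value = mannwhitneyu(experimental, control, alternative="two-sided")
print(f"U = {u_stat:.2f}, p = {p_value:.3f}")  # compare p against .05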
Student Motivation
The variable of student motivation is made up of measures of 4 separate items: students’ task value, self-
efficacy, goal-orientation, and self-satisfaction. Among these items, students’ task value, self-efficacy, goal-
orientation were measured in both pre- and post-surveys. Self-satisfaction was measured during both Phase 3 and
Phase 4 of the experiment. Only data from the 5 voluntary participants in the experimental condition were analyzed
to examine whether there is any difference in these variables between the pre- and post-experiment situation.
Task value. The mean scores for task value were 26.60 for the pre-experimental situation, and 26.20 for the
post-experimental situation. The Wilcoxon Signed Rank Test conducted on the task value yielded a non-significant
result, Z=-.368, p> .05. This test revealed that the difference in task value between the pre-and post-experimental
situations was not statistically significant.
Self-efficacy. The mean scores for self-efficacy were 36.20 for the pre-experimental situation, and 36.40
for the post-experimental situation. The Wilcoxon Signed Rank Test conducted on the self-efficacy yielded a non-
significant result, Z=-.447, p> .05. This test revealed that the difference in self-efficacy between the pre-and post-
experimental situations was not statistically significant.
Intrinsic goal orientation. The mean scores for intrinsic goal orientation were 14.60 for the pre-
experimental situation, and 15.80 for the post-experimental situation. The Wilcoxon Signed Rank Test conducted on
the intrinsic goal orientation yielded a significant result, Z=-2.12, p< .05. This test revealed that the difference in
intrinsic goal orientation between the pre-and post-experimental situations was statistically significant.
Extrinsic goal orientation. The mean scores for extrinsic goal orientation were 16.00 for the pre-
experimental situation, and 15.00 for the post-experimental situation. The Wilcoxon Signed Rank Test conducted on
the extrinsic goal orientation yielded a non-significant result, Z=-1.34, p> .05. This test revealed that the difference
in extrinsic goal orientation between the pre-and post-experimental situations was not statistically significant.
Self-satisfaction. The mean scores for self-satisfaction were 8.20 during Phase 3, and 9.20 during Phase 4.
The Wilcoxon Signed Rank Test conducted on the self-satisfaction yielded a non-significant result, Z=-1.63, p> .05.
This test revealed that the difference in self-satisfaction between Phase 3 and Phase 4 was not statistically
significant.
Strategy Use
The variable of strategy use is a cumulative score of students’ use of cognitive, metacognitive, and
resource-management strategies. These items were measured in both pre- and post-surveys. Only data from the 5
voluntary participants in the experimental condition were analyzed to examine whether there is any difference in this
variable between the pre- and post-experiment situation.
The mean scores for strategy use were 177.00 for the pre-experimental situation, and 172.00 for the post-
experimental situation. The Wilcoxon Signed Rank Test conducted on strategy use yielded a non-significant result,
Z=-.73, p> .05. This test revealed that the difference in strategy use between the pre-and post-experimental
situations was not statistically significant.
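The paired pre/post comparisons reported above can be reproduced with the Wilcoxon signed-rank test in scipy. The sketch below uses hypothetical pre- and post-experiment strategy-use totals for five participants, not the study’s data; note that scipy reports the signed-rank statistic W rather than the Z approximation reported here.

from scipy.stats import wilcoxon

# Hypothetical pre/post strategy-use totals for 5 participants (not the study data).
pre  = [180, 165, 172, 190, 178]
post = [175, 168, 165, 184, 170]

stat, p_value = wilcoxon(pre, post)  # paired, two-sided by default
print(f"W = {stat:.2f}, p = {p_value:.3f}")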
Discussion
This study examined the effects of self-regulated learning strategy training in a web-based distance learning
course on learning performance, learner motivation (task value, self-efficacy and self-satisfaction), and learners’
strategy use. Self-regulated learning strategy training did not significantly influence student performance, most items
(task value, self-efficacy, extrinsic goal orientation and self-satisfaction) of learner motivation, and strategy use.
However, students’ intrinsic goal orientation was significantly changed during the process of the training.
The first research question asked if self-regulated learning strategy training in a web-based distance
learning course would influence learning performance. Students who received self-regulated learning strategy
training did produce higher mean cumulative assignment grades than those who did not, but the difference was not
significant enough to distinguish the two conditions.
Performance is one of the outcomes of distance learning courses, and it is one of the major elements
leading to students’ self-efficacy and satisfaction, which in turn are among the primary determinants of completion rates.
Improving learning achievement creates the possibility of promoting learners’ self-efficacy and satisfaction, and
eventually reducing the dropout rate and increasing the cost-effectiveness of web-based distance learning courses.
Self-regulated learning strategy training has been found effective for promoting learning achievement in numerous
studies (e.g., Butler, 1998; Graham et al., 1998; Lan, 1996; Schunk, 1996; Weinstein et al., 2000; Zimmerman
& Kitsantas, 1996). However, different learners may respond differently to this kind of strategy training (Hofer, Yu,
& Pintrich, 1998). The fact that the addition of self-regulated learning strategy training did not significantly
improve learner performance could be attributed to the extreme imbalance in the number of participants in the two
conditions and to the lack of consideration of learners’ existing level of strategy use. It is possible that strategy training
is less effective for learners who are already highly strategic and self-regulated, and more effective for learners
who need strategies for self-management and self-discipline when taking web-based online courses.
The second research question asked whether self-regulated learning strategy training in a web-based
distance learning course would influence learner motivation (self-efficacy and self-satisfaction). Students who
received self-regulated learning strategy training reported higher mean scores in self-efficacy, self-satisfaction, and
intrinsic goal orientation in the post-experiment situation than in the pre-experiment situation, but only the difference in
intrinsic goal orientation was statistically significant. Students who received the training reported
lower mean scores in task value and extrinsic goal orientation in the post-experiment situation than in the pre-
experiment situation, but these differences were not significant.
Social cognitive approaches to self-regulated learning (Bandura, 1986; Schunk & Swartz, 1993;
Zimmerman & Martinez-Pons, 1992) have concentrated on self-efficacy as the critical source of students’
motivation. Self-efficacy operates as an antecedent as well as an outcome of self-regulated learning. On the other
hand, self-satisfaction is critical because people participate in things that produce satisfaction and positive affect,
and stay away from things that initiate dissatisfaction and negative affect (Bandura, 1991 as cited in Zimmerman,
2000). These two are crucial elements in learner motivation.
At the same time, several major causes to students’ dropout from distant programs, such as low self-
motivation (Osborn, 2001; Parker, 2003), lack of learning strategies (Chyung, 2001; Kember et al., 1991),
discomfort with individual learning (Fjortoft, 1995), negative attitude towards the course (Kember et al., 1991), low
learner self-efficacy on using the technology for distance education program (Chyung, 2001; Osborn, 2001), are
either directly or indirectly related to learner motivation.
Students’ self-efficacy is correlated with their use of self-regulated strategies (Zimmerman & Martinez-
Pons, 1990). It was found (Bouffard-Bouchard, Parent, & Larivee, 1991) that high self-efficacy students
demonstrated a more active control of their time and were more persistent on the task than those with low self-
efficacy. When students have the required cognitive skills to solve the problems, similar levels of self-efficacy are
likely to make similar effects on self-regulation and performance (Bouffard-Bouchard et al., 1991). Thus, enhancing
students’ self-efficacy may increase their use of self-regulated learning strategies and eventually promote their
persistence in web-based online courses.
In line with the social-cognitive model, it is assumed that self-efficacy can be changed
and regulated like other strategies (Schunk, 1994, as cited in Hofer et al., 1998). In a learning-to-learn course, students
grew in their mastery orientation to learning, self-efficacy, and value and interest for the course, and declined in test anxiety;
in addition, they improved in their self-reported strategy use (Hofer et al., 1997, as cited in Hofer et al., 1998). More
importantly, correlational analyses have shown that students’ motivational beliefs, such as mastery goals, efficacy,
and interest and value, were positively correlated with their use of cognitive and self-regulatory strategies (Hofer et al.,
1997, as cited in Hofer et al., 1998). Thus, learners’ self-efficacy has been shown to be changeable through
involvement in strategy training.
Students’ engagement in self-regulated learning behaviors also seems to bring about satisfaction. It was
found that girls who self-recorded (a form of self-monitoring) reported a higher degree of satisfaction than those who
did not self-record (Zimmerman & Kitsantas, 1999). Therefore, it is reasonable to assume that students will feel
more satisfied with their learning experience if they participate in web-based instruction and self-reflective
practice on self-regulated learning strategies.
This study was intended to improve learners’ self-efficacy and self-satisfaction by engaging them in learning
and implementing self-regulated learning strategies through web-based instruction and self-reflective practice.
The potential enhancement in learners’ motivation might lead to more active participation and an increase in
completion rates.
The third research question asked if self-regulated learning strategy training in a web-based distance learning course
would influence learners’ strategy use. Participants’ responses to the surveys revealed that the difference in strategy
use between the pre-and post-experimental situation was not statistically significant. These results may be
attributable to the fact that there were not enough participants in the study and a self-report measure was used to
collect data.
Strategy use has been found to be helpful for developing learner motivation. Sankaran and Bui (2001)
investigated how learning strategies and motivation influence performance in Web and face-to-face lecture settings
of a business information systems course. While motivation was significantly correlated with performance in both web
and face-to-face settings, the relationship was stronger in the Web setting. High motivation was associated with the use of
deep learning strategies, and low motivation with undirected strategies (Sankaran & Bui, 2001).
Greene et al. (2003) conducted a study to examine student performance and approaches to study in a CD-
ROM version of a chemical engineering course. Results illustrated that deep cognitive engagement and motivation,
defined in terms of goals and self-efficacy, were significant predictors of success based on two indices of course
performance.
This study hoped to improve learners’ strategy use by engaging them in learning and implementing self-
regulated learning strategies through web-based instruction and self-reflective practice. The hypothesized
improvement in students’ metacognitive awareness might bring about more frequent use of self-regulated learning
strategies, in turn higher learner motivation, and ultimately a rise in completion rates.
A wealth of studies in the field of distance education has compared the outcomes of
instructional media, learner characteristics, learner perceptions, and interaction (Simonson, Smaldino, Albright, &
Zvacek, 2000), but few have actually provided facilitation of learning strategies. By exploring the role of self-
regulated strategy training in an online course, this study enabled the researcher to examine whether self-regulated learning
strategies are helpful in the web-based distance learning environment. However, much remains to be done for
educators to prepare students with the knowledge and skills necessary to learn successfully in an independent online
course, to identify guidelines for designing learner-friendly web-based distance learning courses, and ultimately to
develop more effective facilitation approaches for achieving optimal achievement, satisfaction, and completion
rates. Future research that investigates potentially productive self-regulated learning strategy training in web-based
environments should be cautious about the limited number of volunteers available for study and the complete
reliance on self-report measures. With the rapid growth of e-learning, it becomes increasingly important to identify
more helpful ways to reduce the attrition rate and ultimately realize the desired cost-effectiveness of online
distance learning courses.
References
Atman, C. S. (1990). Psychological type elements and goal accomplishment style: Implications for distance
education. In Contemporary issues in American distance education. Oxford: Pergamon Press.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Upper Saddle River, NJ:
Prentice-Hall.
Bouffard-Bouchard, T., Parent, S., & Larivee, S. (1991). Influence of self-efficacy on self-regulation and
performance among junior and senior high-school age students. International Journal of Behavioral
Development, 14(2), 153-164.
Butler, D. L. (1998). A strategic content learning approach to promoting self-regulated learning by students with
learning disabilities. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning : From teaching
to self-reflective practice. New York, NY: The Guilford Press.
Chyung, S. Y. (2001). Systematic and systemic approaches to reducing attrition rates in online higher education. The
American Journal of Distance Education, 15(3), 36-49.
Driscoll, M. P. (2000). Psychology of learning for instruction. Boston, MA: Allyn & Bacon Publishers.
Eom, W. (1999). The effects of self-regulated learning strategy on academic achievement in a computer-networked
hypertext/hypermedia learning environment. Florida State University, Tallahassee.
Fjortoft, N. F. (1995). Predicting persistence in distance learning programs. Paper presented at the Mid-Western
Educational Research Meeting, Chicago, IL.
Garcia, T., & Pintrich, P. R. (1991). Student motivation and self-regulated learning: A LISREL model. Paper
presented at the American Educational Research Association.
Graham, S., Harris, K. R., & Troia, G. A. (1998). Writing and self-regulation: Cases from the self-regulated strategy
development model. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-Regulated learning: From teaching
to self-reflective practice. New York, NY: The Guilford Press.
Greene, B. A., Dillon, C., & Crynes, B. (2003). Distributive learning in introductory Chemical Engineering:
University students' learning, motivation, and attitudes using a CD-ROM. Unpublished manuscript.
Heo, H. (1998). The effect of self-regulated learning strategies on learner achievement and perceptions on personal
learning responsibility. Unpublished manuscript, Tallahassee, FL.
Hofer, B. K., Yu, S. L., & Pintrich, P. R. (1998). Teaching college students to be self-regulated learners. In D. H.
Schunk & B. J. Zimmerman (Eds.), Self-Regulated learning: From teaching to self-reflective practice. New
York, NY: The Guilford Press.
Kember, D., Murphy, D., Siaw, I., & Yuen, K. S. (1991). Towards a causal model of student progress in distance
education: Research in Hong Kong. The American Journal of Distance Education, 5(2), 3-15.
King, F. B., Harner, M., & Brown, S. W. (2000). Self-regulatory behavior influences in distance learning.
International Journal of Instructional Media, 27(2), 147-155.
Kitsantas, A., Reiser, R., & Doster, J. (2003). Goal setting, cues, and evaluative feedback during acquisition of a
procedural skill: Empowering students' learning during independent practice. Paper presented at the
American Educational Research Association, Chicago, IL.
Lan, W. Y. (1996). The effects of self-monitoring on students' course performance, use of learning strategies,
attitude, self-judgment ability, and knowledge representation. The Journal of Experimental Education,
64(2), 101-115.
Ley, K., & Young, D. B. (1998). Self-Regulation behaviors in underprepared (developmental) and regular admission
college students. Contemporary Educational Psychology, 23, 42-64.
Moore, M. G. (1990). Contemporary issues in American distance education. Oxford: Pergamon Press.
Orange, C. (1999). Using peer modeling to teach self-regulation. The Journal of Experimental Education,
68(1), 21-39.
Osborn, V. (2001). Identifying at-risk students in videoconferencing and web-based distance education. The
American Journal of Distance Education, 15(1), 41-54.
Parker, A. (2003). Identifying predictors of academic persistence in distance education. Journal of the United States
Distance Learning Association, January 2003.
Pintrich, P. R. (1995). Understanding self-regulated learning. New Directions for Teaching and Learning, 63, 3-12.
Pintrich, P. R., & Groot, E. V. D. (1990). Motivational and self-regulated learning components of classroom
academic performance. Journal of Educational Psychology, 82(1), 33-40.
Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1991). A manual for the use of the Motivated
Strategies for Learning Questionnaire (MSLQ). Ann Arbor, MI: The University of Michigan.
Sankaran, S. R., & Bui, T. (2001). Impact of learning strategies and motivation on performance: A study in web-
based instruction. Journal of Instructional Psychology, 28(3), 191-198.
Schunk, D. H. (1996). Goal and self-evaluative influences during children's cognitive skill learning. American
Educational Research Journal, 33, 359-382.
Schunk, D. H., & Swartz, C. W. (1993). Goals and progress feedback: Effects on self-efficacy and writing
achievement. Contemporary Educational Psychology(18), 337-354.
Schunk, D. H., & Zimmerman, B. J. (1998). Conclusions and future directions for academic interventions. In D. H.
Schunk & B. J. Zimmerman (Eds.), Self-Regulated learning: From teaching to self-reflective practice. New York,
NY: The Guilford Press.
Sikora, A. C., & Carroll, C. D. (2002). A profile of participation in distance education: 1999-2000 Postsecondary
education descriptive analysis reports (No. NCES 2003-154): National Center for Education Statistics, U.
S. Department of Education, Office of Educational Research and Improvement.
Ulitsky, H. (2000). Language learner strategies with technology. Journal of Educational Computing Research,
22(3), 285-322.
Weinstein, C. E., Husman, J., & Dierking, D. R. (2000). Self-regulation intervention with a focus on learning
strategies. In M. Boekaerts, P. R. Pintrich & M. Zeidner (Eds.), Handbook of self-regulation. San Diego,
CA: Academic Press.
Zielinski, D. (2000). The lie of online learning. Training, 37(2), 38-40.
Zimmerman, B. J. (1989). Models of self-regulated learning and academic achievement. In B. J. Zimmerman & D.
H. Schunk (Eds.), Self-regulated learning and academic achievement: theory, research, and practice (pp.
1-25). New York: Springer.
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An Overview. Educational
Psychologist, 25(1), 3-17.
Zimmerman, B. J. (1998). Developing self-fulfilling cycles of academic regulation: An analysis of exemplary
instructional models. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning: From teaching
to self-reflective practice. New York, NY: The Guilford Press.
Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P. R. Pintrich
& M. Zeidner (Eds.), Handbook of self-regulation. San Diego, CA: Academic Press.
Zimmerman, B. J., & Kitsantas, A. (1996). Self-regulated learning of a motoric skill: The role of goal setting and
self-monitoring. Journal of Applied Sport Psychology, 8, 60-75.
Zimmerman, B. J., & Kitsantas, A. (1999). Acquiring writing revision skill: Shifting form process to outcome self-
regulatory goals. Journal of Educational Psychology, 91(2), 241-250.
Zimmerman, B. J., & Martinez-Pons, M. (1986). Development of a structured interview for assessing student use of
self-regulated learning strategies. American Educational Research Journal, 23(4), 614-628.
Zimmerman, B. J., & Martinez-Pons, M. (1990). Student differences in self-regulated learning: Relating grade, sex,
and giftedness to self-efficacy and strategy use. Journal of Educational Psychology, 82(1), 51-59.
Zimmerman, B. J., & Martinez-Pons, M. (1992). Perceptions of efficacy and strategy use in the self-regulation of
learning. In D. H. Schunk & J. L. Meece (Eds.), Student perceptions in the classroom: Causes and
consequences (pp. 185-207). Hillsdale, NJ: Erlbaum.
Toward Improving Information Technology Acceptance
Through Contextual Training – A Research Framework
Ioan Gelu Ionas
University of Missouri - Columbia
Abstract
Ever since the introduction of information technology (IT), organizations have struggled with the issues of
accepting and using it in the workplace. Toward this end, organizations use important financial and human
resources on training to improve effectiveness in IT utilization. However, despite the efforts organizations make to
plan and implement training programs, many employees still show reluctance in accepting and using these
technologies. Existing research (Davis, 1989; Venkatesh, 2000; Venkatesh & Davis, 2000) shows various factors
that explain technology acceptance in organizations and provides ways of predicting it but does not help
organizations improve it. This research represents an attempt to anchor pre- and post-implementation training
activities in the context of the end-user’s everyday work life to improve technology acceptance in organizations.
Using the Technology Acceptance Model (TAM) as a measurement tool, this research proposes the use of
Activity Theory to analyze and describe the employee’s workplace and to design a training environment and
training activities to improve technology acceptance by overcoming end-user acquired preconceptions about the
planned IT implementation or upgrade in the workplace.
Activity Theory
Activity theory can be viewed as “a philosophy and cross-disciplinary framework for studying different
forms of human activity” (Kuutti, 1996). In essence, AT is a form of socio-cultural analysis (Nardi, 1996)
combined with a theory of mediated action (Kuutti, 1991; Wertsch, 1998) which holds that the relations within
an activity are not direct ones but are mediated by a variety of artifacts. Historically activity theory is rooted in the
works of the classical German philosophers Kant and Hegel, the dialectical materialism of Marx and the works of
Russian psychologists Vygotsky, Leont’ev, and Luria (Kuutti, 1996).
According to Leont’ev (1974), “activity is a nonadditive molar unit of life for the material, corporeal
subject”. Therefore, activities are the basic units of human life and thus the basis for the study of contextuality. In
reality, an individual can participate in several activities at any given time (Kuutti, 1991). Kuutti (1991)
describes an activity in terms of the following properties: 1) an activity has a motive (object) which tells us why the
activity exists; 2) it is a collective phenomenon; 3) it has a subject (individual or collective); 4) it exists in a material
environment which is in turn transformed by it; 5) it is a historically developing phenomenon; 6) the forces driving
the development of an activity are based on contradictions; 7) it is performed through conscious and purposeful
actions of its participants.
Engeström (1987), pursuing the idea of mediation in human activity systems, further develops Leont’ev’s
theory by replacing the main binary relationships with mediated ones. In this way, the central relationship between
the subject and the object of the activity becomes a relationship mediated by one or more tools. Still, this simple
structure is not yet adequate to support a systemic analysis of the systemic relationships of an individual with his/her
environment. In consequence, a third main component, the community, was added. Its relationships with the subject
and the object are also mediated, by rules in the first case and by the division of labor in the second case. The
resulting activity system is represented as shown in Figure 1:
Figure 1. The activity system: the subject acts on the object through mediating tools in a transformation process that produces the outcome (the intention of the activity system), while the community relates to the subject through rules and to the object through the division of labor.
Further analysis shows that an activity system is composed of four main subsystems: production, consumption,
exchange, and distribution. The top triangle, the production subsystem, formed between subject, tools, and object, is
interested in the process through which the subject acts on the object using the available tools to produce the
outcome. It is generally regarded as being the most important of the four subsystems of an activity system (Jonassen,
2000).
The consumption subsystem describes how the subject and the community of which the subject is a part act
together on the object. Because the subject can be part of many activity systems, the consumption subsystem
represents a fundamental contradiction, as the subject and the community have to divide their efforts between the
many activities in which they are involved.
The distribution subsystem shows how the activities are divided among subjects and community based on
social laws and expectations (division of labor), both horizontally, as a division of tasks, and vertically as a division
of power and status.
The exchange subsystem both regulates the activities in terms of personal needs and defines the rules that
constrain the activity and the community with which the subject interacts. It creates the framework for defining the
work culture and climate for all individuals and groups involved in an activity system.
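To make the structure described above concrete, the following Python sketch models an activity system and its four subsystems as a small data structure. The component names follow the description above, while the example values (an order-entry clerk using a new ERP module) are purely hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ActivitySystem:
    subject: str                 # individual or collective actor
    obj: str                     # the motive/object of the activity
    outcome: str                 # intended result of the transformation process
    tools: List[str] = field(default_factory=list)              # mediate subject-object
    community: List[str] = field(default_factory=list)
    rules: List[str] = field(default_factory=list)               # mediate subject-community
    division_of_labor: List[str] = field(default_factory=list)   # mediates community-object

    def subsystems(self):
        """The four subsystems described in the text (after Engeström, 1987)."""
        return {
            "production":   (self.subject, self.tools, self.obj),
            "consumption":  (self.subject, self.community, self.obj),
            "exchange":     (self.subject, self.rules, self.community),
            "distribution": (self.community, self.division_of_labor, self.obj),
        }

# Hypothetical example: an order-entry clerk using a new ERP module.
erp_activity = ActivitySystem(
    subject="order-entry clerk", obj="customer order",
    outcome="order fulfilled and invoiced",
    tools=["ERP order module", "product catalog"],
    community=["sales team", "warehouse staff"],
    rules=["data-entry standards", "approval workflow"],
    division_of_labor=["clerks enter orders", "managers approve exceptions"])
print(erp_activity.subsystems()["production"])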
The Contextual Training and Learning framework and Activity Theory provide the rationale for
contextualizing technical training and the means of doing it. The remaining question is about how technical training
can be anchored in the realm of information technology acceptance. For this purpose the present research proposes
the use of a third theoretical framework, the Technology Acceptance Model (Davis, 1989).
TAM, shown in Figure 2, proposes that the user’s perceptions of the technology’s usefulness (PU) and of its ease of
use (PEoU) are the main determinants of the user’s attitude toward using the system. Consistent with the theory of
reasoned action, the attitude toward using the system determines the behavioral intention to use, which,
according to the TAM model, in turn determines actual system use. In addition to these relationships,
TAM proposes a direct relationship between PU and the behavioral intention to use.
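A minimal numerical sketch of these hypothesized TAM paths is given below: ordinary least squares is used to estimate the attitude-from-PU/PEoU path and the intention-from-attitude/PU path on simulated survey scores. The data, coefficients, and variable names are entirely hypothetical and serve only to illustrate the direction of the relationships, not to reproduce any published TAM estimate.

import numpy as np

rng = np.random.default_rng(0)
n = 200
# Simulated standardized TAM scores (hypothetical data, for illustration only).
peou = rng.normal(size=n)
pu = 0.5 * peou + rng.normal(scale=0.8, size=n)            # TAM also posits PEoU -> PU
attitude = 0.6 * pu + 0.3 * peou + rng.normal(scale=0.7, size=n)
intention = 0.5 * attitude + 0.3 * pu + rng.normal(scale=0.7, size=n)

def ols(y, *predictors):
    """Least-squares path coefficients (intercept first)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

print("attitude ~ PU + PEoU:", ols(attitude, pu, peou))
print("intention ~ attitude + PU:", ols(intention, attitude, pu))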
Since its first appearance in the research literature in 1989, TAM has been tested and validated in various
contexts, with various technologies, on different types of subjects, and across cultures. From a technology point of view,
in the last 15 years studies on the following groups of technologies have been reported:
Use of spreadsheets (Chau, 1996; Mathieson, 1991);
Adoption and utilization of personal computers (Moore & Benbasat, 1991; Rose & Straub, 1998;
Thompson et al., 1991);
Use of word processors (Chau, 1996; Davis et al., 1992);
Self-reported and actual use of e-mail, fax and voice mail (Adams et al., 1992; W. W. Chin & Todd, 1995;
Gefen & Straub, 1997; Straub et al., 1997; Straub et al., 1995; Subramanian, 1994; Szajna, 1996;
Venkatesh & Davis, 1994);
Use of group decision support systems (GDSS) (Sambamurthy & Chin, 1994);
Intended use of Internet based applications (Venkatesh, 1999);
E-commerce and online shopping (Gefen et al., 2003b; Gefen & Straub, 2000; Olson & Boyer, 2003);
Intended adoption of a new enterprise resource planning system (Gefen, 2000);
Researchers in the field have also made attempts to extend the initial Technology Acceptance Model to account for a
variety of other intrinsic and extrinsic factors, such as:
Mood (Venkatesh & Speier, 1999);
Intrinsic and extrinsic motivation (Venkatesh & Speier, 1999; Venkatesh et al., 2002);
Cultural differences such as uncertainty avoidance and power distance (Zakour, 2004);
Psychological attachment (Malhotra & Galletta, 1999);
Performance and social expectations (Lee et al., 2003);
Time span (Chau, 1996);
Social awareness (Bjorn et al.);
Computer attitude and self-efficacy (Chau & Hu, 2001; Compeau & Higgins, 1995; Igbaria & Iivari, 1995;
Yi & Hwang, 2003);
Perceived user resources (Wayne W. Chin et al., 2001);
Trust (Dahlberg et al., 2003; Gefen et al., 2003a; Gefen et al., 2003b);
Task-technology fit constructs (Dishaw & Strong, 1999);
Gender differences (Gefen & Straub, 1997);
Age differences (Morris & Venkatesh, 2000).
Gefen & Straub (2000) shows that Davis’ TAM model is stable and has high predictive power. Therefore, the
survey-based research instrument used in TAM research to measure acceptance has also been proven stable and
adaptable to various contexts.
Despite all these efforts, TAM is a descriptive/predictive model that does not provide a framework organizations
can use to overcome, or at least reduce, information technology acceptance issues during IT
implementation or upgrade. Therefore, in the context of this research, which proposes the use of contextualized
training to address IT acceptance issues, the TAM model will inform the Activity Theory framework, while the
TAM survey tool will be used to measure and predict technology acceptance among the IT end users as a means to
evaluate the effectiveness of the resulting training activities.
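As a rough illustration of how the TAM survey tool could yield acceptance scores, the sketch below averages hypothetical 7-point Likert responses into perceived usefulness (PU) and perceived ease of use (PEoU) scale scores. The item counts, example items, and responses are invented for illustration and do not reproduce the actual Davis (1989) instrument.

# Hypothetical 7-point Likert responses for one respondent (not the actual instrument).
responses = {
    "pu": [6, 5, 6, 7, 5, 6],     # e.g., "Using the system would improve my job performance"
    "peou": [4, 5, 3, 4, 4, 5],   # e.g., "I would find the system easy to use"
}

def scale_score(items):
    """Mean of the Likert items, a common way multi-item scales are scored."""
    return sum(items) / len(items)

pu_score = scale_score(responses["pu"])
peou_score = scale_score(responses["peou"])
print(f"PU = {pu_score:.2f}, PEoU = {peou_score:.2f}")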
The Cognitive Process Dimension: Remember (Recognize, Recall); Understand (Interpret, Exemplify, Classify, Summarize, Infer, Compare, Explain); Apply (Execute, Implement); Analyze (Differentiate, Organize, Attribute); Evaluate (Check, Critique); Create (Generate, Plan, Produce).
The Knowledge Dimension: Declarative/Factual Knowledge (Factual – What?); Structural/Conceptual Knowledge (Conceptual – What?; Structural – Why?); Procedural Knowledge (Procedural – How?); Metacognitive Knowledge (Reflective – Synergy).
Returning to the research on training done in the TAM area, one can see that, at most, current training
practices deliver declarative/factual and procedural knowledge to the trainees. Based on the presented taxonomy, by
leaving out structural/conceptual knowledge the trainees are missing essential cognitive abilities related to
understanding, analyzing, and evaluating the technology and its impact.
If declarative/factual and procedural knowledge is mostly about the use of the information technology,
structural/conceptual knowledge is fully contextualized and provides the trainees with the ability to understand
the place of the new technology, and their role as IT users, in the organizational business processes based on a larger
organizational picture.
Figure 4. The proposed theoretical model: declarative/factual, structural/conceptual, procedural, and metacognitive knowledge are linked to perceived usefulness and perceived ease of use, which in turn shape the intention to use and actual use.
Considering this assertion, this research will attempt to develop methods to contextualize training by
redesigning the current training activities to incorporate structural/conceptual knowledge. In the context offered by
the TAM model, this approach would provide support for the Perceived Usefulness determinant of the end-users'
attitudes. The resulting theoretical model is shown in Figure 4.
In Figure 4, the dotted lines show existing research, while the continuous ones represent the proposed
relationships and effects. The line thickness indicates the strength of the relationship. Considering the proposed
model, it is expected that structural/conceptual knowledge would influence primarily PU, but would also have a
smaller impact on PEoU.
The fourth knowledge dimension, metacognition, is an elusive concept that is very difficult to measure. Also, due
to various restrictions, such as time limitations, it would be difficult to help the trainees build metacognitive abilities
during training. Therefore, the influence of metacognition on attitudes and behaviors will not be studied.
Research Plan
Current negotiations with the training department of the target organization provide some bounding
conditions and support for this research:
The organization currently delivers short training sessions (up to two days), many of which span three to
four hours;
Up to 15 individuals are trained in one session;
If the number of trainees is larger than 15, separate training sessions are offered;
The organization decides what topic the training will have and when and where the training will be
delivered;
Diverse training topics are available, ranging from the use of Microsoft Outlook to manage e-mail
messages to the use of a specific module of the organization-wide ERP (Enterprise Resource Planning)
software;
Many training sessions are repeated at various intervals to cover recently hired employees.
On one hand, the shortness of the training sessions will make it almost impossible to address at least one
of the research model components, metacognition. On the other hand, short, repetitive training sessions on the
same subject will provide the opportunity to use a quasi-experimental approach, in which some of the training
sessions will not receive any treatment, while for others the treatment will be present. The same research process
will be applied to both groups.
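One way such a quasi-experimental comparison could later be analyzed is sketched below: pre- and post-training TAM scores are collected for treated and untreated sessions and the pre-to-post gains are compared across conditions. The data, group sizes, and choice of test are illustrative assumptions, not part of the proposed procedure.

from scipy.stats import mannwhitneyu

# Hypothetical perceived-usefulness scores (pre, post) per trainee, by condition.
treated   = [(4.1, 5.2), (3.8, 4.9), (4.5, 5.0), (3.9, 4.6), (4.2, 5.1)]
untreated = [(4.0, 4.2), (4.3, 4.1), (3.7, 3.9), (4.1, 4.3), (3.9, 4.0)]

gain_treated = [post - pre for pre, post in treated]
gain_untreated = [post - pre for pre, post in untreated]

u_stat, p_value = mannwhitneyu(gain_treated, gain_untreated, alternative="two-sided")
print(f"U = {u_stat:.2f}, p = {p_value:.3f}")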
On a larger scale, this situation also provides support for the use of a specific research methodology -
design based research (Jonassen et al., 2006). The assumption of design-based research is that instructional design
should fundamentally be a research process rather than a procedural process. For the purpose of this study, the
design based research approach is used toward reaching a potential normative training design process that is thought
to be attainable as an outcome of multiple iterations.
The small number of trainees that usually join a training session makes possible the use of a wider variety
of training interventions, such as hands-on training, one-to-one conversations etc., compared to what would be
feasible for a larger group of people.
In addition, based on some initial observations of a couple of training sessions and on an interview with
the training department manager, the training currently delivered can be clearly classified as procedural. Moreover,
the observations show that the majority of the questions asked by the trainees related to
structural/conceptual knowledge (“Who will input this information if I’m not the one doing it?”, “What reports are
managers running based on the information I’m entering into the system?”). Therefore, without any attempt to
generalize to other organizations, the proposed research model fits well in the existing context.
In an attempt to find an answer to the research questions, the following timeline has been developed (Figure 5).
Note: The Technology Acceptance Model instrument will be used for assessment in pre- and post-tests.
Three data collection methods are being considered: observations, interviews, and focus groups.
Observations (Patton, 2002, p. 21) will be conducted during the training session(s) and will focus on the added
training intervention and trainees’ questions. All training sessions will be videotaped.
Figure 5. Research timeline. The research proceeds through five steps, each with its own objective(s) and outcomes: (1) a snapshot of the workplace using AT, to find existing systemic tensions; (2) training design using AT, to find the potential effects of the new technology and what is addressed by existing training, producing a contextualized training program based on the AT analysis results; (3) delivery of the contextualized training; (4) assessment of training effectiveness; and (5) a second snapshot using AT, one to two months later, to find systemic tensions after the IT implementation.
Semi-structured interviews (Patton, 2002, p. 343) will be conducted face-to-face in both the initial and final stages
of the research. The interview will be videotaped or audio taped, depending on the location. The interview will
approach two categories of topics. One category is aimed at positioning the subject in his or her activity system,
relative to a known set of parameters resulting from existing research, such as motivation, performance, expectations,
etc. The second category is aimed at understanding more specific, contextual factors.
Focus groups (Patton, 2002, p. 385) will be used in both training design activity and as part of the training
assessment process. The activities of the focus groups will be videotaped. Several focus groups will be organized to
design the training.
A second focus group will be conducted to assess training effectiveness. Because assessment is
dependent on the training intervention, it will be developed together with the training design activities. Therefore,
any instruments used in the training assessment focus group will also be developed at that time and are not
available at the time of this writing.
Due to the nature of the research and to the variety of theories and frameworks involved in its development,
the unit of analysis will vary depending on the research activity. During some of the research steps, the use of the
Activity Theory suggests the employee’s activity system as the main unit of analysis, while for the assessment of the
training outcomes for example the individual will be the focus of the research activities.
The sample and sample size are also dependent on the research activity. Overall, the total number of employees
in the target organization represents the population considered for this research. Sample selection is bounded by
certain limitations the researchers face at the target organization. Considering the bounding conditions discussed in
the research context subchapter, a combination of purposeful typical-case and convenience sampling methodologies
(Patton, 2002, pp. 236, 241) is considered.
The participants in the training sessions and in the study will be determined by the training schedule
designed by the organization at the time of the research. From the employees in the training session, the
researchers will purposefully select no more than a couple of individuals for the initial part of the research, the
analysis of the employee’s activity system, so that the subjects will be representative of the larger population.
researcher in the situation of developing more comprehensive (and in the end more vague) instruments.
Also, since the focus of the training and its context are unknown, data analysis (steps 1 & 2) and the
development of the training intervention will have to take place in a very limited time slot to keep to a minimum
the perturbations to the normal activities of the training department. In this context, a pilot study might be
conducted to develop the initial analysis instruments (e.g., basic coding schemes).
The trainers’ ability to apply the treatment effectively to the trainees also has to be considered. A possible
solution to this problem is to have the trainers participate in the training development process to help them better
understand the intended outcomes. In addition, their commitment to the idea of this research and to the training
interventions needs to be secured.
Many other issues can also be considered, such as access to the training sessions and the subjects,
especially for those employees who work with sensitive information, the researcher’s interviewing abilities, or any
potential language barriers in this particular case.
In the end, one important issue needs to be addressed – the extent of the research, both in time and
resources. Therefore, a pilot study will be run first which, if it shows potential validity of the proposed model, will be
the foundation for further resource allocation.
References
Adams, D. A., Nelson, R. R., & Todd, P. A. (1992). Perceived usefulness, ease of use, and usage of information
technology - a replication. MIS Quarterly, 16(2), 227-247.
Armitage, C. J., & Conner, M. (2001). Efficacy of the theory of planned behavior: A meta-analytic review. British
Journal of Social Psychology, 40(4), 471-499.
Berns, R. G., & Erickson, P. M. (2001). Contextual teaching and learning: Preparing students for the new economy.
The Highlight Zone. Research @ Work, 5, 2-8.
Bjorn, P., Fitzgerald, B., & Scupola, A. The role of social awareness in technology acceptance of groupware in
virtual learning.
Chau, P. Y. K. (1996). An empirical assessment of a modified technology acceptance model. Journal of
Management Information Systems, 13(2), 185.
Chau, P. Y. K., & Hu, P. J. H. (2001). Information technology acceptance by individual professionals: A model
comparison approach. Decision Sciences, 32(4), 699-719.
Chin, W. W., Marcolin, B. L., Mathieson, K., & Peacock, E. (2001). Adoption, diffusion, and infusion of it -
extending the technology acceptance model: The influence of perceived user resources. Data Base, 32(3),
86.
Chin, W. W., & Todd, P. A. (1995). On the use, usefulness, and ease of use of structural equation modeling in mis
research - a note of caution. MIS Quarterly, 19(2), 237-246.
Compeau, D. R., & Higgins, C. A. (1995). Computer self-efficacy: Development of a measure and initial test. MIS
Quarterly, 189-211.
Dahlberg, T., Mallat, N., & Öörni, A. (2003). Trust enhanced technology acceptance model - consumer acceptance
of mobile payment solutions. Tentative evidence.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology.
MIS Quarterly, 13(3), 319-340.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1992). Extrinsic and intrinsic motivation to use computers in the
workplace. Journal of Applied Social Psychology, 22(14), 1111-1132.
Dishaw, M. T., & Strong, D. M. (1999). Extending the technology acceptance model with task-technology fit
constructs. Information & Management, 36(1), 9-21.
Engeström, Y. (1987). Learning by expanding. Helsinki: Orienta-konsultit.
Engeström, Y. (1999). Perspectives on activity theory. In R. Miettinen & R.-L. Punamaki (Eds.). New York:
Cambridge University Press.
Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention and behavior: An introduction to theory and research.
Reading, MA: Addison-Wesley.
Gefen, D. (2000). Lessons learnt from the successful adoption of an erp: The central role of trust. In S. Zanakis, G.
Doukidis & C. Szopounidis (Eds.), Recent developments and applications in decision making. Norwood,
MA: Kluwer Academic.
Gefen, D., Karahanna, E., & Straub, D. W. (2003a). Inexperience and experience with online stores: The importance
of tam and trust. IEEE Transactions on Engineering Management, 50(3), 307-321.
Gefen, D., Karahanna, E., & Straub, D. W. (2003b). Trust and tam in online shopping: An integrated model. MIS
Quarterly, 27(1), 51-90.
Gefen, D., & Straub, D. (2000). The relative importance of perceived ease of use in is adoption: A study of e-
commerce adoption. Journal of the Association for Information Systems, 1(8), 1-30.
Gefen, D., & Straub, D. W. (1997). Gender differences in the perception and use of e-mail: An extension to the
technology acceptance model. MIS Quarterly, 21(4), 389-400.
Griffith, P. L., & Laframboise, K. (1997). The structures and patterns of case method talk: What our students taught
us. Action in Teacher Education, 18(4), 10-22.
Igbaria, M., & Iivari, J. (1995). Effects of self-efficacy on computer usage. Omega-International Journal of
Management Science, 23(6), 587-605.
Imel, S. (2000). Contextual learning in adult education. Retrieved December 8, 2004, from
www.cete.org/acve/docgen.asp?tbl=pab&ID=102
Jonassen, D. H. (2000). Revisiting activity theory as a framework for designing student-centered learning
environments. In D. H. Jonassen & S. M. Land (Eds.), Theoretical foundations of learning environments.
Mahwah, NJ: Lawrence Erlbaum Associates, Publishers.
Jonassen, D. H., Cernusca, D., & Ionas, I. G. (2006). Constructivism and instructional design: The emergence of the
learning sciences and design research. In R. A. Reiser & J. A. Dempsey (Eds.), Trends and issues in
instructional design and technology. (2nd ed.). Upper Saddle River, New Jersey: Merrill/Prentice Hall.
Kuutti, K. (1991). Activity theory and its applications to information systems research and development. In H. E.
Nissen (Ed.), Proceedings of the IFIP TC8/WG 8.2 working conference on the information systems research
arena of the 90's: Challenges, perceptions, and alternate approaches (pp. 529-549). New York: Elsevier.
Kuutti, K. (1996). Activity theory as a potential framework for human-computer interaction research. In B. A. Nardi
(Ed.), Context and consciousness: Activity theory and human-computer interaction. (pp. 17-44).
Cambridge, MA: MIT Press.
La Pelle, N. (2004). Simplifying qualitative data analysis using general purpose software tools. Field Methods,
16(1), 85-108.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York, NY: Cambridge
University.
Lee, J.-S., Hichang, C., Geri, G., Davidson, B., & Ingraffea, T. (2003). Technology acceptance and social
networking in distance learning. Educational Technology & Society, 6(2), 50-61.
Leont'ev, A. N. (1974). The problem of activity in psychology. Soviet Psychology, 13(2), 4-33.
Leont'ev, A. N. (1989). The problem of activity in psychology. Soviet Psychologist, 27(1), 22-39.
Malhotra, Y., & Galletta, D. F. (1999). Extending the technology acceptance model to account for social influence:
Theoretical bases and empirical validation. Proceedings of Thirty-Second Annual Hawaii International
Conference on System Sciences, 1.
Mathieson, K. (1991). Predicting user intentions: Comparing the technology acceptance model with the theory of
planned behavior. Information Systems Research, 2(3), 173-191.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis. Thousand Oaks, CA: Sage Publications.
Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perceptions of adopting an
information technology innovation. Information Systems Research, 2(3), 192-222.
Morris, M. G., & Venkatesh, V. (2000). Age differences in technology adoption decision: Implications for a
changing workforce. Personnel Psychology, 53(2), 375-403.
Nardi, B. A. (Ed.). (1996). Studying context: A comparison of activity theory, situated action models, and
distributed cognition. Cambridge, MA: MIT Press.
Olson, J. R., & Boyer, K. K. (2003). Factors influencing the utilization of internet purchasing in small organizations.
Journal of Operations Management, 21(2), 225-245.
Owens, T., Dunham, D., & Wang, C. (1999). Toward a theory of contextual teaching and learning. Portland, OR:
Northwest Regional Educational Library.
Parnell, D. (2000). Contextual teaching works. Waco, TX: Center for Occupational Research and Development.
Patton, M. Q. (2002). Qualitative research & evaluation methods. (3rd ed.). Thousand Oaks CA, London, New
Delhi: Sage Publications.
Rogers, E. (1993). Diffusion of innovation. New York: The Free Press.
Rose, G., & Straub, D. W. (1998). Predicting general it use: Applying tam to the arabic world. Journal of Global
Information Management, 6(3), 39.
Sambamurthy, V., & Chin, W. W. (1994). The effects of group attitudes towards alternative gdss decisions on the
decision-making performance of computer-supported groups. Decision Sciences, 25(2), 863-874.
Sandlin, J. A. (2000). The politics of consumer education materials used in adult literacy classrooms. Adult
Education Quarterly, 50(4), 289-307.
Smith, A. J. (2000). The washington state consortium for contextual teaching and learning booklet. Seattle, WA:
Center for the Study and Teaching of At-Risk Students.
Straub, D., Keil, M., & Brenner, W. (1997). Testing the technology acceptance model across cultures: A three
country study. Information & Management, 33(1), 1-11.
Straub, D., Limayem, M., & Karahannaevaristo, E. (1995). Measuring system usage - implications for is theory
testing. Management Science, 41(8), 1328-1342.
Subramanian, G. H. (1994). A replication of perceived usefulness and perceived ease of use measurement. Decision
Sciences, 25(5-6), 863-874.
Szajna, B. (1996). Empirical evaluation of the revised technology acceptance model. Management Science, 42(1),
85-92.
Taylor, S., & Todd, P. A. (1995). Understanding information technology usage - a test of competing models.
Information Systems Research, 6(2), 144-176.
Thompson, R. L., Higgins, C. A., & Howell, J. M. (1991). Personal computing: Toward a conceptual model of
utilization. MIS Quarterly, 15(1), 125-142.
Tornatzky, L. G., & Klein, K. J. (1982). Innovation characteristics and innovation adaptation-implementation: A
meta-analysis of findings. IEEE Transactions on Engineering Management, 29(1), 28-45.
Venkatesh, V. (1999). Creation of favorable user perceptions: Exploring the role of intrinsic motivation. MIS
Quarterly, 23(2), 239-260.
Venkatesh, V. (2000). Determinants of perceived ease of use: Integrating control, intrinsic motivation, and emotion
into the technology acceptance model. Information Systems Research, 11(4), 342-365.
Venkatesh, V., & Davis, F. D. (1994). Modeling the determinants of perceived ease of use. Paper presented at the
Proceedings of the Fifteenth International Conference of Information Systems, Vancouver, British
Columbia.
Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four
longitudinal field studies. Management Science, 46(2), 186-204.
Venkatesh, V., & Speier, C. (1999). Computer technology training in the workplace: A longitudinal investigation of
the effect of mood. Organizational Behavior and Human Decision Processes, 79, 1-28.
Venkatesh, V., Speier, C., & Morris, M. G. (2002). User acceptance enablers in individual decision making about
technology: Toward an integrated model. Decision Sciences, 33(2), 297-316.
Wertsch, J. V. (1998). Mind as action. New York: Oxford University Press.
Yi, M. Y., & Hwang, Y. (2003). Predicting the use of web-based information systems: Self-efficacy, enjoyment,
learning goal orientation, and the technology acceptance model. International Journal of Human-Computer
Studies, 59(4), 431-451.
Zakour, A. B. (2004). Cultural differences and information technology acceptance. Paper presented at the 7th
Annual Conference of the Southern Association for Information Systems.
A Research Framework to Study the Effects of Contextualized
Training on Information Technology Acceptance
Ioan Gelu Ionas
University of Missouri Columbia
Abstract
Ever since the introduction of information technology (IT), organizations have struggled with the issues of
accepting and using it in the workplace. To address this, organizations devote substantial financial and human
resources to training to improve effectiveness in IT utilization. However, despite the efforts organizations make to
plan and implement training programs, many employees are still reluctant to accept and use these
technologies. Existing research (Davis, 1989; Venkatesh, 2000; Venkatesh & Davis, 2000) shows various factors
that explain technology acceptance in organizations and provides ways of predicting it but does not help
organizations improve it. This research represents an attempt to anchor pre- and post-implementation training
activities in the context of the end-user’s everyday work life to improve technology acceptance in organizations.
Using the Technology Acceptance Model (TAM) as a measurement tool, this research proposes the use of Activity
Theory to analyze and describe the employee’s workplace and to design a training environment and training
activities to improve technology acceptance by overcoming end users' acquired preconceptions about the planned IT
implementation or upgrade in the workplace.
Activity Theory
Activity theory can be viewed as "a philosophy and cross-disciplinary framework for studying different
forms of human activity" (Kuutti, 1996). In essence, AT is a form of socio-cultural analysis (Nardi, 1996)
combined with a theory of mediated action (Kuutti, 1991; Wertsch, 1998), which holds that the relations within
an activity are not direct ones but are mediated by a variety of artifacts. Historically, activity theory is rooted in the
works of the classical German philosophers Kant and Hegel, the dialectical materialism of Marx, and the works of
the Russian psychologists Vygotsky, Leont'ev, and Luria (Kuutti, 1996).
According to Leont’ev (1974), “activity is a nonadditive molar unit of life for the material, corporeal
subject”. Therefore, activities are the basic units of human life and thus the basis for the study of contextuality. In
reality, an individual can participate in several activities at any given time (Kuutti, 1991). Kuutti (1991)
describes an activity in terms of the following properties: 1) an activity has a motive (object) which tells us why the
activity exists; 2) it is a collective phenomenon; 3) it has a subject (individual or collective); 4) it exists in a material
environment which is in turn transformed by it; 5) it is a historically developing phenomenon; 6) the forces driving
the development of an activity are based on contradictions; 7) it is performed through conscious and purposeful
actions of its participants.
Engeström (1987), pursuing the idea of mediation in human activity systems, further develops Leont'ev's
theory by replacing the main binary relationships with mediated ones. In this way, the central relationship between
the subject and the object of the activity becomes a relationship mediated by one or more tools. Still, this simple
structure is not yet adequate to support a systemic analysis of the relationships of an individual with his/her
environment. In consequence, a third main component, the community, was added. Its relationships with the subject
and the object are also mediated, by rules in the first case and by the division of labor in the second case. The
resulting activity system is represented as shown in Figure 1:
[Figure 1. The structure of an activity system (after Engeström): subject, tools, object, and community linked through the transformation process, with the production, consumption, exchange, and distribution subsystems.]
• Division of labor – refers to both the horizontal division of tasks between the members of the
community and to the vertical division of power and status inside an organization;
• Outcome – represents the intention of the activity system.
Further analysis shows that an activity system is composed of four main subsystems: production, consumption,
exchange, and distribution. The top triangle, the production subsystem, formed between subject, tools, and object, is
concerned with the process through which the subject acts on the object using the available tools to produce the
outcome. It is generally regarded as being the most important of the four subsystems of an activity system (Jonassen,
2000).
The consumption subsystem describes how the subject and the community of which the subject is part act
together on the object. Because the subject can be part of many activity systems, the consumption subsystem
represents a fundamental contradiction as the subject and the community have to divide their efforts between the
many activities they are involved in.
The distribution subsystem shows how the activities are divided among subjects and community based on
social laws and expectations (division of labor), both horizontally, as a division of tasks, and vertically as a division
of power and status.
The exchange subsystem both regulates the activities in terms of personal needs and defines the rules that
constrain the activity and the community with which the subject interacts. It creates the framework for defining the
work culture and climate for all individuals and groups involved in an activity system.
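To make this description concrete, the following minimal sketch (in Python, and not part of the original study) shows one way the elements of an activity system identified during the workplace analysis could be recorded; the example activity, tools, rules, and division of labor named below are hypothetical placeholders.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ActivitySystem:
        subject: str       # individual or group carrying out the activity
        obj: str           # object (motive) that gives the activity its purpose
        outcome: str       # intended outcome of the activity system
        tools: List[str] = field(default_factory=list)               # artifacts mediating subject and object
        community: List[str] = field(default_factory=list)           # others who share the same object
        rules: List[str] = field(default_factory=list)               # mediate the subject-community relation
        division_of_labor: List[str] = field(default_factory=list)   # mediates the community-object relation

    # Hypothetical example: an end user's work activity before a planned ERP upgrade
    invoice_activity = ActivitySystem(
        subject="accounts-payable clerk",
        obj="process supplier invoices",
        outcome="accurate, timely payments",
        tools=["ERP purchasing module", "e-mail"],
        community=["purchasing department", "suppliers", "managers"],
        rules=["approval workflow", "audit policy"],
        division_of_labor=["clerks enter invoices", "managers approve payments"],
    )
    print(invoice_activity.tools)

Such a record could later serve as raw material when looking for the contradictions (tensions) between subsystems that the research plan sets out to identify.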
The Contextual Training and Learning framework and Activity Theory provide the rationale for
contextualizing technical training and the means of doing it. The remaining question is how technical training
can be anchored in the realm of information technology acceptance. For this purpose, the present research proposes
the use of a third theoretical framework, the Technology Acceptance Model (Davis, 1989).
TAM, shown in Figure 2, proposes that the user's perceptions of the technology's usefulness (PU) and ease of use
(PEoU) are the main determinants of the user's attitude toward using the system. Consistent with the theory of
reasoned action, attitudes toward using the system determine the behavioral intention to use, which, according to
the TAM model, in turn determines actual system use. In addition to the relationships described
above, TAM proposes a direct relationship between PU and the behavioral intention to use.
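For readers less familiar with TAM, these relationships can be summarized as a simple path specification; the coefficients below are hypothetical path weights used only for illustration, not estimates from this or any cited study:

\begin{aligned}
A  &= \beta_1\,PU + \beta_2\,PEoU && \text{(attitude toward using)}\\
BI &= \beta_3\,A + \beta_4\,PU    && \text{(behavioral intention to use)}\\
U  &= \beta_5\,BI                 && \text{(actual system use)}
\end{aligned}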
Since its first appearance in the research literature in 1989, TAM has been tested and validated in various
contexts, on various technologies, with different types of subjects, and across cultures. From a technology point of view,
in the last 15 years studies on the following groups of technologies have been reported:
• Use of spreadsheets (Chau, 1996; Mathieson, 1991);
• Adoption and utilization of personal computers (Moore & Benbasat, 1991; Rose & Straub, 1998;
Thompson et al., 1991);
• Use of word processors (Chau, 1996; Davis et al., 1992);
• Self-reported and actual use of e-mail, fax and voice mail (Adams et al., 1992; W. W. Chin & Todd,
1995; Gefen & Straub, 1997; Straub et al., 1997; Straub et al., 1995; Subramanian, 1994; Szajna,
1996; Venkatesh & Davis, 1994);
• Use of group decision support systems (GDSS) (Sambamurthy & Chin, 1994);
• Intended use of Internet based applications (Venkatesh, 1999);
• E-commerce and online shopping (Gefen et al., 2003b; Gefen & Straub, 2000; Olson & Boyer, 2003);
• Intended adoption of a new enterprise resource planning system (Gefen, 2000).
Researchers in the field have also attempted to extend the initial Technology Acceptance Model to account for a
variety of other intrinsic and extrinsic factors such as:
• Mood (Venkatesh & Speier, 1999);
• Intrinsic and extrinsic motivation (Venkatesh & Speier, 1999; Venkatesh et al., 2002);
• Cultural differences such as uncertainty avoidance and power distance (Zakour, 2004);
• Psychological attachment (Malhotra & Galletta, 1999);
• Performance and social expectations (Lee et al., 2003);
• Time span (Chau, 1996);
• Social awareness (Bjorn et al.);
• Computer attitude and self-efficacy (Chau & Hu, 2001; Compeau & Higgins, 1995; Igbaria & Iivari,
1995; Yi & Hwang, 2003);
• Perceived user resources (Wayne W. Chin et al., 2001);
• Trust (Dahlberg et al., 2003; Gefen et al., 2003a; Gefen et al., 2003b);
• Task-technology fit constructs (Dishaw & Strong, 1999);
• Gender differences (Gefen & Straub, 1997);
• Age differences (Morris & Venkatesh, 2000).
Gefen and Straub (2000) show that Davis' TAM is stable and has high predictive power. The
survey-based research instrument used in TAM research to measure acceptance has likewise proven stable and
adaptable to various contexts.
Despite all these efforts, TAM is a descriptive/predictive model which does not provide a framework that
can be used by organizations to overcome or at least reduce information technology acceptance issues during IT
implementation or upgrade. Therefore, in the context of this research, which proposes the use of contextualized
training to address IT acceptance issues, the TAM model will inform the Activity Theory framework, while the
TAM survey tool will be used to measure and predict the technology acceptance of IT end users as a means to
evaluate the effectiveness of the resulting training activities.
The Cognitive Process Dimension
• Remember: Recognize, Recall
• Understand: Interpret, Exemplify, Classify, Summarize, Infer, Compare, Explain
• Apply: Execute, Implement
• Analyze: Differentiate, Organize, Attribute
• Evaluate: Check, Critique
• Create: Generate, Plan, Produce

The Knowledge Dimension
• Declarative/Factual Knowledge: factual ("What?")
• Structural/Conceptual Knowledge: conceptual ("What?") and structural ("Why?")
• Procedural Knowledge: procedural ("How?")
• Metacognitive Knowledge: reflective
Returning to the research on training done in the TAM area, one can see that, at most, current training
practices deliver declarative/factual and procedural knowledge to the trainees. Based on the presented taxonomy, by
leaving out structural/conceptual knowledge, the trainees miss essential cognitive abilities related to
understanding, analyzing, and evaluating the technology and its impact.
Whereas declarative/factual and procedural knowledge is mostly about the use of the information technology,
structural/conceptual knowledge is fully contextualized and provides the trainees with the ability to understand
the place of the new technology, and their own role as IT users, in the organizational business processes, based on a
larger organizational picture.
[Figure 4. Proposed model: the knowledge dimensions (including procedural and metacognitive knowledge) linked to Perceived Usefulness, Perceived Ease of Use, using intention, and actual use.]
Considering this assertion, this research will attempt to develop methods to contextualize training by
redesigning the current training activities to incorporate structural/conceptual knowledge. In the context offered by
the TAM model, this approach would provide support for the Perceived Usefulness determinant of the end-users'
attitudes. The resulting theoretical model is shown in Figure 4.
In Figure 4 the dotted lines show existing research, while the continuous ones represent the proposed
relationships and effects. The line thickness indicates the strength of the relationship. Considering the proposed
model, it is expected that the structural/conceptual knowledge would influence primarily PU, but would also have a
smaller impact on PEoU.
The fourth knowledge dimension, metacognition, is an elusive concept that is very difficult to measure. Also, due
to various restrictions, such as time limitations, it would be difficult to help the trainees build metacognitive abilities
during training. Therefore, the influence of metacognition on attitudes and behaviors will not be studied.
Research Plan
Current negotiations with the training department of the target organization provide some bounding conditions and
support for this research:
• The organization is currently delivering short training sessions (up to two days), many of which span only three to four hours;
• Only up to 15 individuals are being trained in one session;
• If the number of trainees is larger than 15, separate training sessions will be offered;
• The organization decides what topic the training will have and when & where this training will be
delivered;
• Diverse training topics are available, ranging from the use of Microsoft Outlook to manage e-mail
messages to the use of specific modules of the organization-wide ERP (Enterprise Resource
Planning) software;
• Many training sessions are repeated at various intervals to cover the employees that were recently
hired.
On the one hand, the shortness of the training sessions will make it almost impossible to address at least one
of the research model components, metacognition. On the other hand, short, repetitive training sessions on the
same subject will provide the opportunity to use a quasi-experimental approach, in which some of the training
sessions will not receive any treatment while for others the treatment will be present. The same research process
will be applied to both groups.
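As a sketch of how the planned comparison between treated and untreated sessions could eventually be summarized (in Python, with purely hypothetical scores, since the actual instruments and data are not yet available):

    import numpy as np

    # Hypothetical composite Perceived Usefulness scores on a 7-point scale,
    # one value per trainee, measured before and after a training session.
    scores = {
        "contextualized": {"pre": np.array([3.2, 3.5, 2.9, 3.8]),
                           "post": np.array([4.6, 4.9, 4.1, 5.0])},
        "standard":       {"pre": np.array([3.3, 3.1, 3.6, 3.0]),
                           "post": np.array([3.6, 3.4, 3.9, 3.2])},
    }

    for group, s in scores.items():
        gain = s["post"] - s["pre"]   # per-trainee pre/post change
        print(f"{group}: mean PU gain = {gain.mean():.2f}")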
On a larger scale, this situation also provides support for the use of a specific research methodology,
design-based research (Jonassen et al., 2006). The assumption of design-based research is that instructional design
should fundamentally be a research process rather than a procedural process. For the purpose of this study, the
design-based research approach is used toward reaching a potential normative training design process that is thought
to be attainable as an outcome of multiple iterations.
The small number of trainees that usually join a training session makes possible a wider variety
of training interventions, such as hands-on training and one-to-one conversations, compared to what would be
feasible for a larger group of people.
In addition, based on some initial observations of a couple of training sessions and on the interview with
the training department manager, the currently delivered training can be clearly classified as procedural. Moreover,
the observations show that the majority of the questions asked by the trainees were related to
structural/conceptual knowledge (“who will input this information if I’m not the one doing it?”, “What reports are
managers running based on the information I’m entering into the system?”). Therefore, without any attempt to
generalize to other organizations, the proposed research model fits well in the existing context.
In an attempt to find an answer to the research questions, the following timeline has been developed (Figure 5).
Note: The Technology Acceptance Model instrument will be used for assessment in pre- and post-tests.
[Figure 5. Research timeline, objectives and outcomes by phase: (1) find existing systemic tensions; (2) find the potential effects of the new technology and what is addressed by existing training; (3) develop the contextualized training program based on the Activity Theory analysis results; (4) deliver the contextualized training; (5) assess training effectiveness; (6) find the systemic tensions after IT implementation (one interval in the figure is marked 1-2 months).]
Three data collection methods are being considered: observations, interviews, and focus groups.
Observations (Patton, 2002, p. 21) will be conducted during the training session(s) and will focus on the added
training intervention and trainees’ questions. All training sessions will be videotaped.
Semi-structured interviews (Patton, 2002, p. 343) will be conducted face-to-face in both the initial and final
stages of the research. The interview will be videotaped or audio taped, depending on the location. The interview
will approach two categories of topics. One category is aimed at positioning the subject in his or her activity system,
relative to a known set of parameters resulting from existing research, such as motivation, performance, expectations,
etc. The second category is aimed at understanding more specific, contextual factors.
Focus groups (Patton, 2002, p. 385) will be used both in the training design activity and as part of the training
assessment process. The activities of the focus groups will be videotaped. Several focus groups will be organized to
design the training.
A second set of focus groups will be used to assess training effectiveness. Because assessment is
dependent on the training intervention, it will be developed together with the training design activities. Therefore,
any instruments used in the training assessment focus group will be developed at the same time and are not
available at the time of this writing.
Due to the nature of the research and to the variety of theories and frameworks involved in its development,
the unit of analysis will vary depending on the research activity. During some of the research steps, the use of
Activity Theory suggests the employee's activity system as the main unit of analysis, while for the assessment of the
training outcomes, for example, the individual will be the focus of the research activities.
The sample and sample size are also dependent on the research activity. Overall, the total number of employees
in the target organization represents the population considered for this research. Sample selection is bounded by
certain limitations the researchers face at the target organization. Considering the bounding conditions discussed in
the research context subchapter, a combination of purposeful typical case and convenience sampling methodologies
(Patton, 2002, pp. 236, 241) is considered.
The participants in the training sessions and in the study will be determined by the training schedule
designed by the organization at the time of the research. From the employees in the training session, the
researchers will purposefully select no more than a couple of individuals for the initial part of the research, the
analysis of the employee's activity system, so that the subjects will be representative of the larger population.
Research Limitations and Concerns
Several concerns and limitations need to be considered, such as the elements that are unknown to the
researcher(s) when developing the research plan (e.g., training focus or specific context), issues related to access, and
the extent of the research. As previously explained, the researcher does not know, at the time of the research, the specific IT
or section of IT on which the training will be conducted. This situation forces the
researcher to develop more comprehensive (and, in the end, vaguer) instruments.
Also, since the focus of the training and its context are unknown, data analysis (steps 1 and 2) and the
development of the training intervention will have to take place in a very limited time slot to keep to a minimum
the disruption of the normal activities of the training department. In this context, it might be feasible to conduct a
pilot study to develop the initial analysis instruments (e.g., basic coding schemes).
The trainers' ability to apply the treatment effectively to the trainees must also be considered. A possible
solution to this problem is to have the trainers participate in the training development process to help them better
understand the intended outcomes. In addition, their commitment to the idea of this research and training
interventions needs to be secured.
Many other issues can also be considered, such as access to the training sessions and the subjects
(especially for those employees who work with sensitive information), the researcher's interviewing abilities, or any
potential language barriers in this particular case.
Finally, one important issue needs to be addressed: the extent of the research, both in time and in
resources. Therefore, a pilot study will be run first; if it shows the potential validity of the proposed model, it will be
the foundation for further resource allocation.
References
Adams, D. A., Nelson, R. R., & Todd, P. A. (1992). Perceived usefulness, ease of use, and usage of information
technology - a replication. MIS Quarterly, 16(2), 227-247.
Armitage, C. J., & Conner, M. (2001). Efficacy of the theory of planned behavior: A meta-analytic review. British
Journal of Social Psychology, 40(4), 471-499.
Berns, R. G., & Erickson, P. M. (2001). Contextual teaching and learning: Preparing students for the new economy.
The Highlight Zone. Research @ Work, 5, 2-8.
Bjorn, P., Fitzgerald, B., & Scupola, A. The role of social awareness in technology acceptance of groupware in
virtual learning.
Chau, P. Y. K. (1996). An empirical assessment of a modified technology acceptance model. Journal of
Management Information Systems, 13(2), 185.
Chau, P. Y. K., & Hu, P. J. H. (2001). Information technology acceptance by individual professionals: A model
comparison approach. Decision Sciences, 32(4), 699-719.
Chin, W. W., Marcolin, B. L., Mathieson, K., & Peacock, E. (2001). Adoption, diffusion, and infusion of it -
extending the technology acceptance model: The influence of perceived user resources. Data Base, 32(3),
86.
Chin, W. W., & Todd, P. A. (1995). On the use, usefulness, and ease of use of structural equation modeling in mis
research - a note of caution. MIS Quarterly, 19(2), 237-246.
Compeau, D. R., & Higgins, C. A. (1995). Computer self-efficacy: Development of a measure and initial test. MIS
Quarterly, 189-211.
Dahlberg, T., Mallat, N., & Öörni, A. (2003). Trust enhanced technology acceptance model - consumer acceptance
of mobile payment solutions. Tentative evidence.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology.
MIS Quarterly, 13(3), 319-340.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1992). Extrinsic and intrinsic motivation to use computers in the
workplace. Journal of Applied Social Psychology, 22(14), 1111-1132.
Dishaw, M. T., & Strong, D. M. (1999). Extending the technology acceptance model with task-technology fit
constructs. Information & Management, 36(1), 9-21.
Engeström, Y. (1987). Learning by expanding. Helsinki: Orienta-konsultit.
Engeström, Y. (1999). Perspectives on activity theory. In R. Miettinen & R.-L. Punamaki (Eds.). New York:
Cambridge University Press.
Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention and behavior: An introduction to theory and research.
Reading, MA: Addison-Wesley.
Gefen, D. (2000). Lessons learnt from the successful adoption of an erp: The central role of trust. In S. Zanakis, G.
Doukidis & C. Szopounidis (Eds.), Recent developments and applications in decision making. Norwood,
MA: Kluwer Academic.
Gefen, D., Karahanna, E., & Straub, D. W. (2003a). Inexperience and experience with online stores: The importance
of tam and trust. IEEE Transactions on Engineering Management, 50(3), 307-321.
Gefen, D., Karahanna, E., & Straub, D. W. (2003b). Trust and tam in online shopping: An integrated model. MIS
Quarterly, 27(1), 51-90.
Gefen, D., & Straub, D. (2000). The relative importance of perceived ease of use in is adoption: A study of e-
commerce adoption. Journal of the Association for Information Systems, 1(8), 1-30.
Gefen, D., & Straub, D. W. (1997). Gender differences in the perception and use of e-mail: An extension to the
technology acceptance model. MIS Quarterly, 21(4), 389-400.
Griffith, P. L., & Laframboise, K. (1997). The structures and patterns of case method talk: What our students taught
us. Action in Teacher Education, 18(4), 10-22.
Igbaria, M., & Iivari, J. (1995). Effects of self-efficacy on computer usage. Omega-International Journal of
Management Science, 23(6), 587-605.
Imel, S. (2000). Contextual learning in adult education. Retrieved December 8, 2004, from
www.cete.org/acve/docgen.asp?tbl=pab&ID=102
Jonassen, D. H. (2000). Revisiting activity theory as a framework for designing student-centered learning
environments. In D. H. Jonassen & S. M. Land (Eds.), Theoretical foundations of learning environments.
Mahwah, NJ, London: Lawrence Erlbaum Associates, Publishers.
Jonassen, D. H., Cernusca, D., & Ionas, I. G. (2006). Constructivism and instructional design: The emergence of the
learning sciences and design research. In R. A. Reiser & J. A. Dempsey (Eds.), Trends and issues in
instructional design and technology. (2nd ed.). Upper Saddle River, New Jersey: Merrill/Prentice Hall.
Kuutti, K. (1991). Activity theory and its applications to information systems research and development. In H. E.
Nissen (Ed.), Proceedings of the IFIP TC8/WG 8.2 working conference on the information systems research
arena of the 90's: Challenges, perceptions, and alternate approaches (pp. 529-549). New York: Elsevier.
Kuutti, K. (1996). Activity theory as a potential framework for human-computer interaction research. In B. A. Nardi
(Ed.), Context and consciousness: Activity theory and human-computer interaction. (pp. 17-44).
Cambridge, MA: MIT Press.
La Pelle, N. (2004). Simplifying qualitative data analysis using general purpose software tools. Field Methods,
16(1), 85-108.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York, NY: Cambridge
University.
Lee, J.-S., Hichang, C., Geri, G., Davidson, B., & Ingraffea, T. (2003). Technology acceptance and social
networking in distance learning. Educational Technology & Society, 6(2), 50-61.
Leont'ev, A. N. (1974). The problem of activity in psychology. Soviet Psychology, 13(2), 4-33.
Leont'ev, A. N. (1989). The problem of activity in psychology. Soviet Psychologist, 27(1), 22-39.
Malhotra, Y., & Galletta, D. F. (1999). Extending the technology acceptance model to account for social influence:
Theoretical bases and empirical validation. Proceedings of Thirty-Second Annual Hawaii International
Conference on System Sciences, 1.
Mathieson, K. (1991). Predicting user intentions: Comparing the technology acceptance model with the theory of
planned behavior. Information Systems Research, 2(3), 173-191.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis. Thousand Oaks, CA: Sage Publications.
Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perceptions of adopting an
information technology innovation. Information Systems Research, 2(3), 192-222.
Morris, M. G., & Venkatesh, V. (2000). Age differences in technology adoption decision: Implications for a
changing workforce. Personnel Psychology, 53(2), 375-403.
Nardi, B. A. (Ed.). (1996). Studying context: A comparison of activity theory, situated action models, and
distributed cognition. Cambridge, MA: MIT Press.
Olson, J. R., & Boyer, K. K. (2003). Factors influencing the utilization of internet purchasing in small organizations.
Journal of Operations Management, 21(2), 225-245.
Owens, T., Dunham, D., & Wang, C. (1999). Toward a theory of contextual teaching and learning. Portland, OR:
Northwest Regional Educational Laboratory.
Parnell, D. (2000). Contextual teaching works. Waco, TX: Center for Occupational Research and Development.
Patton, M. Q. (2002). Qualitative research & evaluation methods. (3rd ed.). Thousand Oaks CA, London, New
Delhi: Sage Publications.
Rogers, E. (1993). Diffusion of innovation. New York: The Free Press.
Rose, G., & Straub, D. W. (1998). Predicting general IT use: Applying TAM to the Arabic world. Journal of Global
Information Management, 6(3), 39.
Sambamurthy, V., & Chin, W. W. (1994). The effects of group attitudes towards alternative gdss decisions on the
decision-making performance of computer-supported groups. Decision Sciences, 25(2), 863-874.
Sandlin, J. A. (2000). The politics of consumer education materials used in adult literacy classrooms. Adult
Education Quarterly, 50(4), 289-307.
Smith, A. J. (2000). The washington state consortium for contextual teaching and learning booklet. Seattle, WA:
Center for the Study and Teaching of At-Risk Students.
Straub, D., Keil, M., & Brenner, W. (1997). Testing the technology acceptance model across cultures: A three
country study. Information & Management, 33(1), 1-11.
Straub, D., Limayem, M., & Karahanna-Evaristo, E. (1995). Measuring system usage: Implications for IS theory
testing. Management Science, 41(8), 1328-1342.
Subramanian, G. H. (1994). A replication of perceived usefulness and perceived ease of use measurement. Decision
Sciences, 25(5-6), 863-874.
Szajna, B. (1996). Empirical evaluation of the revised technology acceptance model. Management Science, 42(1),
85-92.
Taylor, S., & Todd, P. A. (1995). Understanding information technology usage - a test of competing models.
Information Systems Research, 6(2), 144-176.
Thompson, R. L., Higgins, C. A., & Howell, J. M. (1991). Personal computing: Toward a conceptual model of
utilization. MIS Quarterly, 15(1), 125-142.
Tornatzky, L. G., & Klein, K. J. (1982). Innovation characteristics and innovation adoption-implementation: A
meta-analysis of findings. IEEE Transactions on Engineering Management, 29(1), 28-45.
Venkatesh, V. (1999). Creation of favorable user perceptions: Exploring the role of intrinsic motivation. MIS
Quarterly, 23(2), 239-260.
Venkatesh, V. (2000). Determinants of perceived ease of use: Integrating control, intrinsic motivation, and emotion
into the technology acceptance model. Information Systems Research, 11(4), 342-365.
Venkatesh, V., & Davis, F. D. (1994). Modeling the determinants of perceived ease of use. Paper presented at the
Proceedings of the Fifteenth International Conference of Information Systems, Vancouver, British
Columbia.
Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four
longitudinal field studies. Management Science, 46(2), 186-204.
Venkatesh, V., & Speier, C. (1999). Computer technology training in the workplace: A longitudinal investigation of
the effect of mood. Organizational Behavior and Human Decision Processes, 79, 1-28.
Venkatesh, V., Speier, C., & Morris, M. G. (2002). User acceptance enablers in individual decision making about
technology: Toward an integrated model. Decision Sciences, 33(2), 297-316.
Wertsch, J. V. (1998). Mind as action. New York: Oxford University Press.
Yi, M. Y., & Hwang, Y. (2003). Predicting the use of web-based information systems: Self-efficacy, enjoyment,
learning goal orientation, and the technology acceptance model. International Journal of Human-Computer
Studies, 59(4), 431-451.
Zakour, A. B. (2004). Cultural differences and information technology acceptance. Paper presented at the 7th
Annual Conference of the Southern Association for Information Systems.
Journal Content Analysis of ETR&D and the Journal of Educational
Technology of Korea.
Il-Hyun Jo
Chuncheon National University of Education
Korea
Abstract
The study analyzed the topics and types of selected articles from the 20-year-old Journal of
Educational Technology (JET), the flagship journal of the Korean Society of Educational Technology. The
results were further compared with those of the Educational Technology Research & Development (ETR&D) journal. Results indicate
that the JET has grown in terms of quantity and quality since its birth 20 years ago, and the JET seems to have become
independent, like a young adult. Some issues and concerns were also identified: the articles of the JET are more
inclined toward Experimental design in terms of research type, and toward Design in terms of research topic. Balance across
Types and Topics needs to be kept for long-term, healthier growth. The emergence of e-Learning and Korea's high technology
offered opportunities to the field. However, the monopoly of e-Learning, which consumes most of the limited
space of the JET, raises a red flag. More balanced and quality-oriented endeavors are suggested for the next 20
years.
Introduction
Background
The state of a field's refereed journals is frequently indicative of the status of the research in that field. The field
of educational technology, whenever it has encountered a critical moment for decisions, has looked back and learned
from the past by analyzing major journals (Dick & Dick, 1989; Klein, 1997). The Korean Society of Educational
Technology (KSET), the first professional organization of educational technologists in Korea, has been playing a
major role in learning, practicing, and sharing knowledge in this young field. This year KSET celebrates its 20th
anniversary. Now is the time to look back at the Society's history in order to prepare for the future.
The purpose of the study is to address the history of the research in educational technology in Korea by
comparing the Journal of Educational Technology (JET), the flagship journal of KSET, with the Educational
Technology Research & Development (ETR&D) journal. The results of the study will serve as a guide for
educational technologists from Korea and the USA in navigating toward future research and practice.
Method
Sample Articles
For the comparative purpose of the study, 228 articles from the two journals were selected as the study sample.
All of the selected articles were published in the years 1985-1986, 1995-1996, and 2003-2004. Because the
JET does not have Book Reviews and International sections, only articles from the Research Section and
Development Section of ETR&D were included. In total, 228 articles from both journals were included
in this study (see Table 2).
<Table 2> Number of Target Articles

          Early Stage       Middle Stage      Recent Stage      Total
          (1985-1986)       (1995-1996)       (2003-2004)
JET       18 (vol. 1-2)     42 (vol. 11-12)   50 (vol. 19-20)   110
ETR&D     34 (vol. 33-34)   42 (vol. 43-44)   42 (vol. 51-52)   118
(2) Type
A 9-category approach, a revised version of Dick and Dick's 6-category classification, was utilized for the
analysis of the Type of the content. The nine categories and their descriptors are: 1) Literature review: a summary of literature,
sometimes a critique, and sometimes a statement of the state of the art; 2) Methodological article: a new model or
procedure for carrying out a technical activity; 3) Theoretical article: one which primarily draws upon and
contributes to the theoretical literature in the field; 4) Empirical research/Experimental; 5) Empirical
research/Qualitative; 6) Empirical research/Survey; 7) Descriptive study: a presentation of information about a
particular program or event, with little or no use of data; 8) Evaluation study: a representation of information to
describe the effectiveness of a particular program or method usually in an applied setting; and 9) Professional
article: a description of topics dealing with the profession of instructional technology, such as determination of
competencies or descriptions of internship programs (Dick & Dick, 1989, p. 82).
(3) Topic
For Topic, this study employs AECT's 5-category classification, which includes 1) Design (ISD, message
design, instructional strategies/methods, learner characteristics); 2) Development (media utilization,
diffusion of innovation, implementation/institutionalization, policies/regulations); 3) Management (project
management, resource management, delivery system management, information management); 4) Evaluation
(problem analysis, criterion-referenced measurement, formative evaluation, summative evaluation); and 5) Others
(introduction of ideas, learning environment, learning theories) (Seels & Richey, 1994).
(4) Citation
Today's academic communities can be regarded as knowledge-sharing networks. In such a network, each article
serves as a node linked to other articles by cross-citation. A brief cross-citation analysis was conducted to
find out how the two journals were interrelated and to see how ETR&D helped the infant journal grow into an
independent and productive youth. For this purpose, the citations, both within and between the two comparative
journals and countries, were examined.
3) Coding Process
The researcher and his assistant served as raters for coding and indexing categorical data. The researcher
earned his PhD from one of the major Instructional Systems programs in the USA and has been working as an
instructional designer and researcher for more than 15 years. The research assistant has a master’s degree in
educational technology and is pursuing a doctoral degree. To enhance inter-rater agreement, the researchers
studied the coding criteria and descriptors carefully, conducting two raters' workshops with actual
samples that were used in Yang and Chung's (2005) work. The overall Cohen's kappa of the final
coding reached 0.75. The coded data were entered into SPSS 10.1 (Korean version) for the relevant statistical analyses.
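For reference, Cohen's kappa corrects the observed inter-rater agreement for the agreement expected by chance:

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the proportion of articles on which the two raters agreed and p_e is the agreement expected from the raters' marginal category frequencies. As a purely illustrative example (the actual agreement counts behind the reported value are not given here), an observed agreement of 0.85 with a chance agreement of 0.40 gives kappa = (0.85 - 0.40) / (1 - 0.40) = 0.75.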
Results
Author Categories
(1) First Author’s Basic Information
In ETR&D, 106 unique first authors contributed 118 articles, whereas 98 unique first authors contributed
110 articles in the JET. More male authors were seen in ETR&D (male 59.3%, female 33.1%, unidentified 7.6%), while
the JET was represented more by female authors (female 55.5%, male 44.5%). The level of collaboration was measured by
the number of co-authors. The average number of co-authors for the JET was 1.34, which is smaller than ETR&D's 2.27.
However, as the community has grown, the number of collaborative works has increased from 1.00 to 1.66 (see Table 3).
In its early stage, most pieces published in the JET dealt with introductory and philosophical issues. As the
JET matured, the contributors had opportunities to apply the knowledge and skills they had learned to instructional design
projects and to write up articles. Design and development projects usually involve more than one educational
technologist. Korea, a country with a high-end computing and Internet environment, requires educational
technologists to assume active roles in high-tech instructional solutions, providing ample opportunities for co-
working in intra-disciplinary, intra-institute projects. In this instructional design environment in Korea, more co-
authored pieces were produced than in years past.
[Figure: first authors' affiliation types by stage (academic, research, K-12, private, public) for ETR&D and the JET]
Frequently seen affiliations of the first authors are shown in <Table 4> below. In Korea, the
largest (in terms of student enrollment) and oldest academic institution is Ewha Womans University, followed by
Hanyang University. Therefore, especially in the early stages, these two institutions played major roles. As KSET matures
and the number of universities with educational technology programs increases, readers of the JET will enjoy a
greater variety of articles from many institutions with diverse academic and cultural traditions.
<Table 4> Frequently Seen Affiliations of First Authors

JET
  Early Stage: 4 pieces (Hanyang U.); 3 pieces (Ewha Womans U.); 2 pieces (Korea U., Korea Educational Development Institute (KEDI))
  Middle Stage: 5 pieces (KEDI); 4 pieces (Seoul Nat'l U., Ewha Womans U.); 3 pieces (Hanyang U., Samsung HRD Center); 2 pieces (Korea Telecom, Kyunghee U.)
  Current Stage: 6 pieces (Seoul Nat'l U.); 3 pieces (Andong Nat'l U., Chonnam Nat'l U., Hanyang U., Mokwon U., Korea Nat'l Open U., Kyeongin Nat'l U. of Edu., Kyunghee U., Kwangju Taebong Elem. School)

ETR&D
  Early Stage: 3 pieces (Ohio State); 2 pieces (FSU, Harvard, PSU, U. of Colorado, U. of Denver, U. of Minnesota, USC, U. of Western Ontario (Canada))
  Middle Stage: 3 pieces (FSU, UGA); 2 pieces (ASU, IU, PSU, U. of Memphis, U. of Minnesota, McGill U., James Cook U. (Canada))
  Current Stage: 4 pieces (Open U. of the Netherlands); 2 pieces (Iowa State U., PSU, U. of Missouri, U. of Twente (the Netherlands))
[Figure 2] Type of Research in Both Journals
[Figure 3] Topic of Research in Both Journals
It can be easily recognized that the portion of "Utilization", "Management", and "Evaluation" research
has consistently been small in the JET. A reason for the scarcity of studies in these practical research areas may be
found in the background of the contributors. As discussed earlier, most of the contributors to both journals are
working at academic institutions. In Korea, it is rare for professors and professional researchers from research institutes
to get involved in year-long design-development projects. They do mostly analysis and design, and
sometimes evaluation, but not implementation, which is done by field managers. The field managers usually do not
have time to write journal papers that require methodological rigor. While analyzing the content of the articles of the
JET, another interesting trend was found in the recent stage. Out of the fifty articles in the recent stage, 42 articles
(84%) were directly related to the Internet. Words like "Web-Based", "Internet", "Cyber", and "ICT" could be found in
almost every title of the recent articles.
<Table 5> Citations in Both Journals

                                   Early    Middle   Current   Average
Average # of citations             19       35.6     41.7      32.1
Inter-journal    JET (self)        0        0.8      2.8       1.2
                 %                 0.0%     2.2%     6.7%      3.7%
                 ETR&D             0.6      1.5      1         1.0
                 %                 3.2%     4.2%     2.4%      3.2%
Inter-national   Domestic          2.9      3.3      12.2      6.1
                 %                 15.3%    9.3%     29.3%     19.1%
                 Foreign           16.1     32.4     29.5      26.0
                 %                 84.7%    91.0%    70.7%     81.0%
References
Berelson, B. (1952). Content analysis in communication research. New York: Free Press.
Dick, W., & Dick, D. (1989). Analytical and empirical comparisons of the Journal of Instructional Development and
Educational Communication and Technology Journal. Educational Technology Research and Development,
37(1), 81-87.
Driscoll, M. P., & Dick, W. D. (1999). New research paradigms in instructional technology: An inquiry. Educational
Technology Research and Development, 47(2), 7-18.
Klein, J. (1997). ETR&D—Development: An analysis of content and a survey of future direction. Educational
Technology Research and Development, 45(3), 57-62.
Lee, S.(2005). An analysis of trends in WBI researchers published in the major Korean and American journals of
educational technology. Paper presented at the Korean Society of Educational Technology 20th Anniversary
International Conference
Lee, Y.(1985). Celebrating The Publication of the Journal of Educational Technology. Journal of Educational
Technology, 1(1). 1-2.
Seels, B.B., & Richey, R. C.(1994). Instructional Technology: The definition and domains of the field. Washington,
DC: AECT.
Yang, Y., & Chung, H.(2005). “Journal of Educational Technology” 20 years: Analysis on research domains,
research methods, and learning theories applied. Paper presented at the Korean Society of Educational
Technology 20th Anniversary International Conference
Exploring the Nature of Training Distance Education Faculty
Haijun Kang
Pennsylvania State University
Introduction
In the 1990s, along with the increasing number of distance education courses came a critical request for
more research on the faculty who teach at a distance (Beaudoin, 1990; Dillon & Walsh, 1992). In the 2000s, the
situation has improved, and most distance education institutions have been giving more attention to their faculty
teaching at a distance. Evidence includes the increasing number of training programs for distance education faculty
(NCES, 1998) and the increasing number of studies on faculty professional development conducted
in the past decade. Despite the fact that this research has touched every aspect of faculty professional
development, including participation motivation, faculty recruitment, workload, etc., very few studies have
successfully applied a systems approach. The present study attempts to introduce a systems approach to
look radically at one important component of faculty professional development, the training of distance education
faculty, from an innovative perspective.
Statement of Problem
As Olcott and Wright (1995) state, "Without well-trained and equitably rewarded distance education
faculty, there would be no programs" (p. 11). This perception has been widely accepted by practitioners in the past
decade. The National Center for Education Statistics' (NCES, 1998) report found that 60 percent of higher education
institutions offering distance education courses have designed training programs for their distance education faculty.
To name a few, Pennsylvania State University's World Campus has been providing faculty training programs
since 1995; the University of Florida's College of Agricultural and Life Sciences (CALS) developed a comprehensive
faculty training program called “Distance Education Faculty Training Program"; and, Illinois Online Network
(ION), an online faculty development initiative, has provided training programs to over 2000 individual faculty
members throughout the state since 1997.
However, the literature has shown that a large number of distance education faculty do not buy into their
institutions' efforts. Gunawardena (1990) indicates that "it is crucial to provide faculty training … it is equally
important to familiarize them [faculty] with the total communication process…" (p. 42); 94.9% of the subjects in Ndahi's
(1999) research emphasized the need for and importance of training; Schifter (2002, 2004) noted that one of the
inhibitors preventing faculty from teaching at a distance is insufficient training provided by the institution. Many
other studies have ended up with similar findings (Wilson, 1998; Betts, 1998; Clay, 1999; Olcott &
Wright, 1995; Wolcott, 2003).
The situation today, hence, seems to be that while distance education institutions are making efforts to design
and offer training for their faculty, faculty are still complaining about lacking training support. How
has the situation fallen into such a dilemma? Why isn't the quantity of these training programs matched by their quality? I
think there are two types of questions that both researchers and practitioners in the field need to ask themselves: (1)
Does the literature reflect actual practice? If so, how can we persuade our governments and education institutions to
continuously invest money and labor in designing and delivering training to distance education faculty if our
"customers" do not buy into it? (2) But what if what the literature has expressed is inconsistent with reality?
Shouldn't we, the researchers, reflect on the research that has been done and published? Shouldn't we
assess the assumptions behind that research, the methodologies widely applied, and even the seminal
theories/works that have been widely cited?
The discrepancy between practice and the literature triggered my interest in exploring this phenomenon.
From this study, I wanted to gain a better understanding of the nature of training distance education faculty by
exploring the phenomenon through a systems approach. My belief was that a better understanding of the nature of
training would help explain why there is a discrepancy between practice and the literature. Introducing a
systems approach, Lewis' training and development framework, this study tried to answer this question: What is the
nature of training distance education faculty? This overarching question has two sub-questions: what are distance
education faculty's understandings of training, and what are their experiences of attending training?
Theoretical Framework
In this study, a systems approach was introduced. As Moore and Kearsley (2005) state in the second edition of
their book Distance Education: A Systems View, "adopting a systems approach is the secret of successful
practice" (p. 8). A distance education system is like a body, and "building up one part without any attention to the
others is also likely to result in damage to the whole body" (Moore & Kearsley, 2005, p. 8). Training is an
indispensable part of the distance education system that supports distance education faculty and should not be built up
separately from the other parts of the whole system. Olcott and Wright's (1995) institutional faculty support framework
is a good model for better understanding how to increase faculty participation in distance education. Similar to their
research approach, I introduced a corporate training framework, Lewis' (1997) training and
development framework, into this study to explore the notion of training distance education faculty and to address
the question of how to take full advantage of training to satisfy the people involved in it.
Lewis stated his training and development framework in the editorial of the first issue of the International
Journal of Training and Development in 1997. According to Lewis, "The training system may be seen as comprising
organization, strategy, policy and practice" (1997, p. 3). Inputs to the training system are the major factors
influencing the training outputs. Inputs can be seen to include the commitments and base level of skills at both the micro
and macro levels, while training outputs are the effects of training on individual, organizational, and national
performance. External stimuli also play an important role in training, such as the product/service market and the labor
market. Especially in the era of the information/knowledge-based society, various environmental and organizational
changes have been seen as the principal determinants of training. Today's training is not physical-skill-based training
but human-competency-based training. It aims at improving human capital in order to meet the modern learning
organization's need for sustainable development. Figure 1 provides the basic ideas of Lewis' conceptual
framework.
[Figure 1. Lewis' training and development framework: determinants and effects of training]
Training Definition
Training, according to the Longman Dictionary of Contemporary English (1988), is the act of
training or being trained. The Merriam-Webster Online Dictionary has two entries for training: (1) training is
"the act, process, or method of one that trains and the skill, knowledge, or experience acquired by one that trains"; (2)
training is "the state of being trained". Combining these two resources, we can see that training has these features:
(1) it is a two-way interaction; (2) there should be at least two subjects; (3) one's acts, processes, and methods
have an impact on the training outcomes; (4) one's skill, knowledge, or experience has an impact on the training
outcomes. One thing we should note here is that the Longman Dictionary emphasizes the equally important positions of
the subjects involved in the two-way interaction, which means the act of training is as important as the act of
being trained. Merriam-Webster puts more weight on the act of training and the competency of the trainer, which implies
that the act of being trained is more likely a passive act. This slight difference reflects people's different
understandings or preferences when approaching the concept of "training", and people's perception of the concept and
their preferences will inevitably influence their daily life and work as well. For the purpose of this study, it is
necessary to have an operational definition of "training".
The operational definition is: training is the act, process, or method that has trainer and trainee equally
involved; training requires skill, knowledge, or experience from both trainer and trainee. In this study, (1) trainers
include people who initiate, design, deliver, and evaluate training programs, and trainees refer to the people who attend
the training program with the expectation of getting something out of it, either to solve a problem or to achieve
a goal that is hard to reach without attending the training; (2) training is two-way communication, not one-way
indoctrination, and it is expected that both trainer and trainee are actively involved in the act and process of
training (this means that there might be situations where trainees train trainers). So, in this study, the terms
"trainer" and "trainee" are not exclusively absolute; they are just two terms used to name different people who play
different roles in the process of training.
Literature Review
Two sets of literature were located: distance education faculty professional development and training of
distance education faculty. Reviewing the literature on distance education faculty professional development helped
me set up a base for this study because training is an indispensable component of faculty professional development;
and reviewing the literature on training of distance education faculty would tell me what is going on in the practical field
and what has been researched.
To summarize, faculty professional development in general, and training in particular, have caught both
practitioners' and researchers' attention for a while. The literature shows that the majority of training programs fall into
three categories: skills, teaching pedagogy, and training needs analysis, with a primary emphasis on technology
skills. Another thing I noticed from the literature review is that most studies exploring the issue of
training distance education faculty focused only on training's impact on distance education faculty. The impression
given by the literature is that distance education faculty are the only beneficiaries and that training outcomes for
governments and distance education institutions go unexamined. There is a wealth of literature on training structure and content.
Noticing this possible missing part of the literature, I used this study to explore the nature of training distance
education faculty to see whether the literature has failed to represent the field, or training in practice has actually
missed something important, or both the literature and training in practice are in good shape and there is no need to
further pursue this topic.
Findings
The results of this study were organized around the two sub-questions: what are distance education faculty's
understandings of training, and what are their experiences of attending training?
conclusion.
When asked whether they had had good or bad training experiences, their
responses were both positive and negative. They were positive about the training outputs: improvement of
technology and teaching methodology competencies. As one participant said enthusiastically, "But, but clearly, the
training was absolutely, positively, essentially critical. If I had not had the Angel training I've had, I can tell you
right now, I would not be using Angel to this date." What they were not very happy with was mainly the way
the trainers delivered the training. One participant described one bad experience: "It was very long and it was
very drown out. …. I almost fell asleep… and the presenter was very boring". All research participants said that they
need "hands-on" practice and more time to interact with both trainers and other trainees to achieve better knowledge
and skill retention.
Limitations of the Study
This study's contribution to the field is limited by the availability of resources, my understanding of the
field, and, most important, my biased perception rooted in my cultural-historical background. As Baptiste (2004a)
said, not all phenomenologists "construe the lived experience in the same way", and my effort in this study was
to explore the phenomenon "given my interests, expertise, time, resources, and power". Further research is
encouraged to test the validity and generalizability of the systems approach that I tried to propose in this study and/or
to introduce new approaches into the field to improve the situation.
With regard to the research design, this study explored only trainees' understanding of the nature of
training, and therefore the results are based on one side's opinions. I personally believe that more interesting points
would come out if further studies also included people at the state level who initiate and evaluate
distance education/training programs and people at the distance education institutional level who design and deliver
distance education/training programs.
References
Bagnasco, A., Chirico, M., Parodi, G., and Scapolla, A. (2003). A model for an open and flexible e-training platform
to encourage companies’ learning culture and meet employees’ learning needs. Educational Technology &
Society 6 (1): 55-63
Baptiste, I. (2004a). An Investigators’ Road Trip: Customizing the Research Process, Unpublished manuscript.
Beaudoin, M. (1990). The instructor's changing role in distance education. American Journal of Distance Education.
4(2): 21-29
Betts, K. (1998). An institutional overview: Factors influencing faculty participation in distance education in
postsecondary education in the United States: An institutional study [on-line]. Online Journal of Distance
Learning Administration. 1(3)
Billett, S. (2002). Toward a workplace pedagogy: guidance, participation, and engagement. Adult Education
Quarterly. 53(1): 27-43
Clark, T. (1993). Attitudes of higher education faculty toward distance education: a national survey. American
Journal of Distance Education.7(2): 19-33
Clay, M. (1999). Development of training and support programs for distance education instructors. Online Journal of
Distance Learning Administration, 2(3).
Dillon, C. & Walsh, S. (1992). Faculty: the neglected resource in distance education. American Journal of Distance
Education. 6(3): 5-21
Gouthro, P. (2002). Education for sale: at what cost? Lifelong learning and the marketplace. International Journal of
Lifelong Education. 21(4): 334-346
Gunawardena, C. (1990). Integrating telecommunication systems to reach distant learners. American Journal of
Distance Education. 4(3): 38-46
Irani, T. & Telg, R. (2002). Building it so they will come: assessing universities’ distance education faculty training
and development programs. Journal of Distance Education. 17(1): 36-46
Lewis, P. (1997). A framework for research into training and development. International Journal of Training and
Development. 1(1): 2-8
Lynch, W., & Corry, M. (1998). Faculty recruitment, training, and compensation for distance education. In Proceedings of SITE 98: Ninth Society for Information Technology & Teacher Education International Conference, March 10-14, Washington, DC. ERIC.
Moore, M. and Kearsley, G. (2005). Distance education: a systems view (2nd edition). Belmont, CA: Thomson
Wadsworth.
National Center for Education Statistics (NCES). (1998). Distance education in higher education institutions. (NCES
Publication No. 98062). Washington, D.C.: Laurie Lewis, Debbie Alexander, and Elizabeth Farris (Westat, Inc.)
Ndahi, H. (1999). Utilization of distance learning technology among industrial and technical teacher education
faculty. Journal of Industrial Teacher Education, 36(4), 21-37.
Olcott, D. & Wight, S. (1995). An institutional support framework for increasing faculty participation in
postsecondary distance education. American Journal of Distance Education. 9(3): 5-17
Rossman, G. B., & Rallis, S. F. (1998). Learning in the field: An introduction to qualitative research. Thousand
Oaks, CA: Sage Publications.
Schifter, C. (2000). Faculty Motivators and Inhibitors for Participation in Distance Education. Educational
Technology; 40 (2): 43-46.
Schifter, C. (2004). Faculty participation in asynchronous learning networks: A case study of motivating and
inhibiting factors. Journal of Asynchronous Learning Networks, 4(1).
Sturrock, J. (1982). Tutor training. Distance Education. 4(1): 108-112
Wilson, C. (1998). Concerns of Instructors Delivering Distance Education via the WWW. Online Journal of
Distance Learning Administration, I(3)
Wolcott, L. (2003). Dynamics of Instructor Participation in Distance education: Motivations, Incentives, and
Rewards. In Michael Grahame Moore, William G. Anderson, (eds.), Handbook of Distance Education (pp549-565).
Mahwah, N.J.: Lawrence Erlbaum Associates.
Effects of Animation on Multi-Level Learning Outcomes: A Meta-Analytic
Assessment and Interpretation
Fengfeng Ke
Huifen Lin
Yu-Hui Ching
Francis Dwyer
Pennsylvania State University
Abstract
This meta-analytic research assesses the ability of animation to facilitate multi-level learning (factual
knowledge, comprehension, and application) by summarizing the findings of 34 published experimental studies involving 13,515 subjects. All of these studies compared animation with static graphics in an instructional setting. Two hundred eighty-one effect sizes were generated. Results indicated that, in general, animation has a small, positive effect in facilitating multi-level learning. It was also found that animation is not equally effective for different levels of learning. The study provides the foundation for significant hypothesis generation related to
future development and the instructional use of animations.
Introduction
Due to the wide scale and increasing availability of computers as an instructional delivery system, many
computer-based instructional products have become popular. Among these CBI products, animation has generated considerable enthusiasm and has been widely used in the instruction of various subjects, such as physics, mathematics, mechanics, biology, and computer algorithms.
According to Mayer and Moreno (2002) and Rieber (1991), animation should, in principle, be effective in
illustrating spatial-temporal changes. Theoretical assumptions for instructional applications of animation are that
animation is better than static graphics at communicating information which involves change over time or
directional characteristics, thereby making learning content more concrete, reducing the processing demands in
short-term memory, and increasing the potential for successful encoding into long-term memory (Rieber & Kini,
1991).
Assumptions aside, empirical research on the efficacy of animation over static graphics indicates that
animation may or may not promote learning, depending on how it is designed and used (e.g. Baek & Layne, 1988;
ChanLin, 2001; Spotts & Dwyer, 1996; Rieber, 1990, 1991; Szabo & Poohkay, 1996). Hence an important question
on using animation in instruction is, “When will animation facilitate learning?”
Specifically, this question has been examined by a few animation researchers (e.g. ChanLin, 1998, 2000;
Hays, 1996; Hegarty and Sims, 1994; Koroghlanian & Klein, 2004; Rieber, 1990; Yang, et al., 2003) who suggest that animation efficacy may vary across different levels of learning objectives and with students' differing spatial abilities. But their findings are inconclusive. For instance, Beheshti, Breuleux, & Renaud (1994) claimed that animation promoted procedural learning but not descriptive knowledge, while ChanLin (1998) observed that animation promoted both descriptive and procedural learning. Yang, Andre, and Breenbowe (2003) found that higher spatial ability learners benefit more from animation, while Hays (1996) reached a contradictory conclusion.
Therefore, a meta-analytic synthesis of these empirical findings is warranted to present a lucid observation on
animation’s efficacy in facilitating different levels of learning.
Research Purpose
In general, this meta-analysis seeks to answer the following questions: 1) Does animation compared to
static graphics improve different levels of learning achievement and other outcomes related to attitude or learning
efficiency? If so, to what extent? 2) Does spatial ability moderate the effects of animation on student achievement
and other outcomes? If so, in which direction and to what extent?
conference proceedings, and the reference lists of several reviews. To be included in this review, each study also had to meet the following inclusion criteria:
• Each study must involve an experimental comparison of animation with static graphics in an instructional
setting. Randomization for subject assignment and statistical data for calculating effect sizes must be
present.
• The publishing time for each study should be between 1985 and 2004.
Coding Procedures
All eligible study reports were read and coded on three separate occasions by only one of the authors, with
a 1-month interval, to ensure the coding was accurate and consistent. A detailed coding sheet was designed to
facilitate the extraction of information from the studies. The authors initiated the development of the codebook by
preliminarily reviewing a sample of twenty studies, then doing nomological coding to identify salient study features
present in the literature (Abrami, et al., 1988). A comprehensive codebook was constructed, which include the
feature categories of “research design,” “subjects (e.g. number, prior-knowledge, age),” “subject content,”
“independent variable(s),” “covariate(s),” “learning outcome (factual knowledge, comprehension, application,
attitudes, learning efficiency),” “measurement instruments (e.g. reliability, test items),” “kinds of animation,” and
“results.”
Inter-rater checking and agreement were employed during the study inclusion and coding processes. Disagreements were resolved through discussion and, when necessary, recoding. One hundred fifty-six manuscripts were reviewed, and thirty-four were selected as complying with the criteria established for inclusion in this meta-analysis.
Outcome Measures
Learning achievement as measured by post-treatment test performance was the primary cognitive learning
outcome considered in our meta-analytic review. Based on the descriptions in the reviewed studies, we classified
these achievement tests into three cognitive achievement categories: factual knowledge, comprehension, and
application (Bloom, 1956).
Among the thirty-four studies included in the meta-analysis, nine evaluated the effects of animation by comparing the animation and static graphics groups not only on the accuracy of test responses but also on the time
needed for test or task performance. Effect sizes for both accuracy and speed measures were calculated.
Unlike most of the studies, three research projects also evaluated the learning effectiveness of animation through attitude surveys. In addition, four studies measured the time participants spent interacting with the learning materials (animation versus static graphics). In our meta-analysis, attitudes and material-interaction time were included as two indexes of affective learning outcomes (attitudes and engagement).
Where multiple effect sizes were calculated for individual studies, they were averaged to ensure that each
unit of analysis contributed just one effect size to the review (Rosenthal, 1991).
Analysis
The index of standardized mean difference effect size that we used was the unbiased estimator d (Glass, et
al. 1981). This index is calculated as the difference between the means of the treatment and control groups divided
by the pooled standard deviation of the sample and corrected for small sample bias. Calculations of effect sizes and
mean weighted effect size were performed using the procedures suggested by Glass et al. (1981) and Hedges (1994). The mean effect size was calculated as a weighted average, with each effect size being weighted by the inverse of its
variance. The procedure gave proportionally greater weight to effect sizes based on larger samples (Shadish &
Haddock, 1994). In the interpretation, an effect size of .20 was defined as small, an effect size of .50 as medium,
and an effect size of .80 or above as large (Cohen, 1977).
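To make this procedure concrete, the following minimal Python sketch (illustrative only; the function names and the 1.96 multiplier for a 95% interval are our own, not taken from the original study) computes the bias-corrected standardized mean difference, its approximate sampling variance, and the inverse-variance weighted mean effect size with its confidence interval, following the general formulas of Hedges (1994) and Shadish and Haddock (1994).

```python
import math

def hedges_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference between treatment and control,
    corrected for small-sample bias (the unbiased estimator d)."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd
    correction = 1 - 3 / (4 * (n_t + n_c - 2) - 1)  # small-sample bias correction
    return correction * d

def d_variance(d, n_t, n_c):
    """Approximate sampling variance of a standardized mean difference."""
    return (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))

def weighted_mean_effect(effects, variances):
    """Inverse-variance weighted mean effect size (d+) and its 95% CI."""
    weights = [1.0 / v for v in variances]
    d_plus = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return d_plus, (d_plus - 1.96 * se, d_plus + 1.96 * se)
```

For example, weighted_mean_effect([0.20, 0.50], [0.04, 0.09]) pools two effect sizes, giving proportionally greater weight to the more precise (smaller-variance) estimate, as described above.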
Results
Animation on Multi-Level Learning
The thirty-four studies included in our meta-analysis reported small or trivial positive effects of animation in promoting learning. The overall mean weighted effect size was d+ = 0.313, with a 95% confidence interval (CI) of 0.277 to 0.349. Based on Cohen's (1977) U3 measure, this can be interpreted to mean that, on average, 62% of the students in the animation group scored above the average student in the static graphics group, compared with the 50% that would be expected if animation had no effect. Detailed information on the measurements and individual effect sizes of the 34 studies is presented in Table 1.
Study | Participant Ed. Level | Content | Measurement | ES'
Lai, 2000 | College | Programming concepts | Comprehension test (instant) | 0.09
Lai, 2000 | College | Programming concepts | Comprehension test (delayed) | 0.26
Lai, 2000 | College | Programming concepts | Comprehension test | 0.87
Wright & Milroy, 1999 | Adult | Reading on historic events | Instant quiz on comprehension | -0.17
Wright & Milroy, 1999 | Adult | Reading on historic events | Delayed quiz | -0.28
Byrne, Catrambone, & Stasko, 1999 | College | Algorithm | Basic questions | -0.07
Byrne, Catrambone, & Stasko, 1999 | College | Algorithm | Challenging questions | 0.34
Byrne, Catrambone, & Stasko, 1999 | College | Algorithm | Concept comprehension | -0.31
Hegarty, Kriz, & Cate | Undergraduate | Flushing cistern mechanical system | Causal chain comprehension description (open-ended questions) | 0.42
Williamson & Abraham, 1995 | Undergraduate | Chemistry | Conceptual understanding (Unit 5) | 0.56
Chanlin, 2001 | 8th & 9th Grade | Physics | Descriptive learning | -2.99
Chanlin, 2001 | 8th & 9th Grade | Physics | Descriptive learning | 3.14
Thompson & Riding, 1990 | 11-14 years old | Math | Comprehension test | 0.38
Wilcox, 1997 | 7 & 8 years old | Math | Conceptual Subset | -0.02
Ausman et al., 2006 | Undergraduate | Human Heart | Comprehension test | 0.09
Speed
Study | Participant Ed. Level | Content | Measurement | ES'
Rieber, 1989 | 4th, 5th, & 6th Grade | Physics (Newton's Law) | Processing time | 1.64
Lai, 2000 | College | Programming concepts | Task time | 0.00
Lai, 2000 | College | Programming concepts | Task time | -0.07
Study investigating the effects of animation on Application
Accuracy
Study | Participant Ed. Level | Content | Measurement | ES'
Yang, Andre, & Breenbowe, 2003 | College | Electrochemistry | Transfer | 0.28
Rieber, 1991 | 4th Grade | Physics (Newton's Law) | Immediate intentional | 0.53
Rieber, 1991 | 4th Grade | Physics (Newton's Law) | Immediate incidental | 1.92
Rieber, 1991 | 4th Grade | Physics (Newton's Law) | Delayed intentional | 0.81
Rieber, 1991 | 4th Grade | Physics (Newton's Law) | Delayed incidental | 1.97
Baek & Layne, 1988 | High School | Math | Rule-learning test | 0.35
Rieber, Boyce, & Assad, 1990 | College | Physics (Newton's Law) | Rule-learning test | 0.02
Rieber, 1990 | 4th & 5th Grade | Physics (Newton's Law) | Rule-learning test, Objective 1 (difficulty level) | 0.53
Rieber, 1990 | 4th & 5th Grade | Physics (Newton's Law) | Rule-learning test, Objective 2 | 0.45
Rieber, 1990 | 4th & 5th Grade | Physics (Newton's Law) | Rule-learning test, Objective 3 | 0.20
Rieber, 1990 | 4th & 5th Grade | Physics (Newton's Law) | Rule-learning test, Objective 4 | 0.21
Rieber, 1990 | 4th & 5th Grade | Physics (Newton's Law) | Rule-learning test, Objective 5 | -0.01
Rieber, 1990 | 4th & 5th Grade | Physics (Newton's Law) | Rule-learning test, Objective 6 | 0.36
Szabo & Poohkay, 1996 | College | Math | Problem-solving test | 0.76
Rieber, 1989 | 4th, 5th, & 6th Grade | Physics (Newton's Law) | Application (near transfer) | 0.09
Rieber, 1989 | 4th, 5th, & 6th Grade | Physics (Newton's Law) | Application (far transfer) | 0.12
Rieber, 1991 | 4th Grade | Physics (Newton's Law) | Incidental learning | 0.06
Rieber, 1991 | 4th Grade | Physics (Newton's Law) | Intentional learning | 1.18
Byrne, Catrambone, & Stasko, 1999 | College | Algorithm | Procedure | 0.11
Rieber & Boyce, 1991 | College | Computer education | Verbal near transfer | 0.02
Rieber & Boyce, 1991 | College | Computer education | Verbal/visual near transfer | 0.25
Rieber & Boyce, 1991 | College | Computer education | Verbal far transfer | -0.06
Rieber & Boyce, 1991 | College | Computer education | Verbal/visual far transfer | 0.05
Hegarty, Kriz, & Cate | Undergraduate | Flushing cistern mechanical system | Trouble-shooting | 0.15
Rieber, Boyce, & Assad, 1990 | Undergraduate | Physics (Newton's Law) | Rule-learning test | 0.02
Poohkay & Szabo, 1995 | Undergraduate | Math | Drawing test | 0.81
Rieber, 1989 | 4th & 5th Grade | Physics (Newton's Law) | Rule-learning test | -0.01
Lewalter, 2003 | Undergraduate | Physics | Comprehension + Problem Solving | 0.60
Wong, 1994 | Undergraduate | Statistics | Transfer Question | -0.25
Chanlin, 2001 | 8th & 9th Grade | Physics | Procedure Learning | -0.71
Lee, 1996 | Undergraduate/Graduate | Operation of Bicycle Tire Pump | Problem-solving test | 0.76
Wilcox, 1997 | 7 & 8 years old | Math | Rule Subset | -0.05
Wilcox, 1997 | 7 & 8 years old | Math | Performance Subset | 0.43
Speed
Study | Participant Ed. Level | Content | Measurement | ES'
Baek & Layne, 1988 | High School | Math | Rule-learning test response latency | -1.19
Rieber, Boyce, & Assad, 1990 | College | Physics (Newton's Law) | Rule-learning test response latency | -0.46
Park & Gittelman, 1992 | College | Electronic circuits | Number of trials in trouble-shooting practice | 0.92
Park & Gittelman, 1992 | College | Electronic circuits | Time needed in practice response | 0.24
Park & Gittelman, 1992 | College | Electronic circuits | Number of trials in trouble-shooting test | 0.45
Park & Gittelman, 1992 | College | Electronic circuits | Time needed in test | 0.09
Park, 1998 | College | Electronic circuits | Number of trials in performance test | 0.22
Park, 1998 | College | Electronic circuits | Time spent on performance test | 0.07
Park, 1998 | College | Electronic circuits | Number of trials in transfer test | 0.43
Park, 1998 | College | Electronic circuits | Time spent on transfer test | 0.13
Rieber & Boyce, 1991 | College | Computer education | Test time in verbal near transfer | 0.13
Rieber & Boyce, 1991 | College | Computer education | Test time in verbal/visual near transfer | 0.27
Rieber & Boyce, 1991 | College | Computer education | Test time in verbal far transfer | 0.06
Rieber & Boyce, 1991 | College | Computer education | Test time in verbal/visual far transfer | -0.10
Wong, 1994 | Undergraduate | Statistics | Time on quiz | -0.03
Study investigating the effects of animation on Affective Learning Outcome
Engagement
Study | Participant Ed. Level | Content | Measurement | ES'
Koroghlanian & Klein, 2004 | High School | Biology | Time in instruction | 0.57
Koroghlanian & Klein, 2004 | High School | Biology | Time in program | 0.64
Spotts & Dwyer, 1996 | College | Human Heart | Study time | 2.14
Spotts & Dwyer, 1996 | College | Human Heart | Total time | 2.80
Wright & Milroy, 1999 | Adults | Reading on historic events | Study time | -0.14
Wong, 1994 | Undergraduate | Statistics | Time on tutorial | 0.73
Attitudes
Study | Participant Ed. Level | Content | Measurement | ES'
Koroghlanian & Klein, 2004 | High School | Biology | Attitudes survey | 0.20
Lai, 2000 | College | Programming concepts | Attitudes survey | 0.17
Poohkay & Szabo, 1995 | Undergraduate | Math | Attitudes survey | 0.04
Table 2 reports the weighted mean effect sizes for animation over static graphics on different cognitive and
affective learning outcomes, as well as their 95% confidence intervals.
Outcome | k | d+ | 95% CI
Factual Knowledge | 11 | 0.36 | +0.28 to 0.44
Comprehension (Accuracy) | 13 | 0.21 | +0.13 to 0.28
Comprehension (Speed) | 3 | 0.26 | +0.10 to 0.42
Application (Accuracy) | 19 | 0.34 | +0.28 to 0.41
Application (Speed) | 6 | 0.10 | +0.00 to 0.20
Engagement | 4 | 0.83 | +0.62 to 1.03
Attitudes | 3 | 0.13 | -0.09 to 0.35
Total | 34 | 0.31 | +0.28 to 0.35
Note. d+ = weighted mean effect size; CI = confidence interval. No effect sizes were calculated for Speed in Factual Knowledge because no timing data were provided in the relevant studies.
As Table 2 shows, most mean weighted effects of animation on different types of learning outcomes are
positive but smaller than .5, reflecting a small and positive association between animation and gains in learning.
Because almost all of the confidence intervals (except the one for attitudes) do not contain zero, it can be concluded with reasonable confidence that the true effect sizes are not zero. Moreover, the mean weighted effect size of animation on learning persistence (engagement) is d+ = 0.83, with a 95% confidence interval (CI) of 0.62 to 1.03, reflecting a large effect of animation over static graphics in promoting learning persistence. Following Cohen (1988), this result means that 80% of the students who received instruction with animation demonstrated more learning persistence than the average student who received instruction with static graphics.
Table 2 also shows that animation played a more important role in helping students develop factual knowledge (d+ = 0.36) and accuracy in application tests or performance (d+ = 0.34). That is to say, around 64% of the students in the animation group scored higher on the tests corresponding to factual knowledge and application than the average student in the static graphics group. In comparison, animation had smaller effects over static graphics in promoting comprehension (accuracy d+ = 0.21, speed d+ = 0.26) or positive learning attitudes (d+ = 0.13).
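The percentage interpretations above follow directly from the normal distribution. A minimal sketch of the Cohen's U3 conversion, assuming normally distributed scores (illustrative code, not taken from the original study):

```python
from statistics import NormalDist

def u3(d):
    """Cohen's U3: proportion of the animation group expected to score
    above the mean of the static graphics group, given effect size d."""
    return NormalDist().cdf(d)

print(round(u3(0.313), 2))  # about 0.62 for the overall weighted mean effect
print(round(u3(0.36), 2))   # about 0.64 for factual knowledge
print(round(u3(0.83), 2))   # about 0.80 for engagement (learning persistence)
```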
Hays' (1996) finding was that low spatial ability learners benefited more from animation than high spatial ability learners did in both short-term and long-term comprehension, which was contrary to the findings of many animation studies. However, it should be noted that all of the 95% confidence intervals (CI) of the effect sizes have wide ranges, indicating that the actual effect sizes could be very different from the observed ones.
References
Abrami, P. C., Cohen, P.A., & d'Apollonia, S. (1988). Implementation problems in meta-analysis. Review of
Educational Research, 58, 151–179.
Ausman, B., Lin, H., Kidwai, K., Munyofu, M., Swain, J., & Dwyer, F. (2006). Effect of prerequisite knowledge on the effectiveness of animated instruction. International Journal of Instructional Media, 35(3). (in press).
Baek, Y. K. & Layne, B. H. (1988). Color, graphics, and animation in a computer-assisted learning tutorial lesson.
Journal of Computer-Based Instruction, 15(4), 131-135.
Blankenship, J. & Dansereau, D. F. (2000). The effects of animated node-link displays on information recall. The
Journal of Experimental Education, 68(4), 293-308.
Bloom, B.S. (1956). Taxonomy of Educational Objectives: Handbook of Cognitive Domain. New York: McKay.
Byrne, M. D., Catrambone, R., & Stasko, J. T. (1999). Evaluating animations as student aids in learning computer
algorithms. Computers & Education, 33, 253-278.
ChanLin, L. J. (1998). Animation to teach students of different knowledge levels. Journal of Instructional
Psychology, 25(3), 166-176.
ChanLin, L. J. (2000). Attributes of animation for learning scientific knowledge. Journal of Instructional
Psychology, 27(4), 228-238.
ChanLin, L. (2001). Formats and prior knowledge on learning in a computer-based lesson. Journal of Computer
Assisted Learning, 17, 409-419.
Cohen, J. (1977). Statistical power analysis for the behavioral sciences, New York: Academic Press.
Craig, S. D., Gholson, B., Driscoll, D. M. (2002). Animated pedagogical agents in multimedia educational
environments: Effects of agent properties, picture features, and redundancy. Journal of Educational
Psychology, 94(2), 428-434.
Glass, G. V., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. Beverly Hills, CA: Sage.
Hays, T. A. (1996). Spatial abilities and the effects of computer animation on short-term and long-term
comprehension. Journal of Educational Computing Research, 14, 139-155.
Hedges, L.V. (1994). Meta-analysis. Journal of Educational Statistics 17, 279–296.
Hegarty, M., Kriz, S., & Cate, C. (2003). The roles of mental animations and external animations in understanding
mechanical systems. Cognition and Instruction, 21(4), 325-360.
Huk, T., Steinke, M., & Floto, C. (2003). The educational value of cues in computer animations and its dependence
on individual learner abilities. Proceedings of Ed-Media 2003, 2658-2661.
Huk, T., Steinke, M., & Floto, C. (2003). The influence of visual spatial ability on the attitude of users towards
high-quality 3D-animations in hypermedia learning environments. Proceedings of E-Learn 2003, 1038-
1041.
Iheanacho, C. C. (1997). Effects of two multimedia computer-assisted language learning programs on vocabulary
acquisition of intermediate level ESL students. Dissertation Abstracts International, DAI-A 62/08, 2671.
Koroghlanian, C. & Klein, J. D. (2004). The effect of audio and animation in multimedia instruction. Journal of
Educational Multimedia and Hypermedia, 13(1), 23-46.
Lai, S. (2000a). Increasing associative learning of abstract concepts through audiovisual redundancy. Journal of
Educational Computing Research, 23, 275-289.
Lai, S. (2000b). Influence of audio-visual presentations on learning abstract concepts. International Journal of
Instructional Media, 27(2), 199-206.
Large, A. & Beheshti, J. (1997). Effects of animation in enhancing descriptive and procedural texts in a multimedia
learning environment. Journal of the American Society for Information Science, 47(6), 437-448.
Lee, S. (1996). The effects of computer animation and cognitive style on the understanding and retention of
scientific explanation. Dissertation Abstracts International, DAI-A 57/10, 4248.
Lin, C. & Dwyer, F. (2004). Effect of varied animated enhancement strategies in facilitating achievement of
different educational objectives. International Journal of Instructional Media, 31(2), 185-197.
Mayer, R. E. & Moreno, R. (2002). Aids to computer-based multimedia learning. Learning and Instruction, 12, 107-119.
Park, O. (1998). Visual displays and contextual presentations in computer-based instruction. Educational Technology, Research, and Development, 46(3), 37-50.
Mayer, R. E. & Sims, V. K. (1994). For whom is a picture worth a thousand words? Extensions of a dual-coding
theory of multimedia learning. Journal of Educational Psychology, 86(3), 389-401.
Park, O. & Gittelman, S. (1992). Selective use of animation and feedback in computer-based instruction. Education
Technology Research and Development, 40(4), 27-38.
Poohkay, B. & Szabo, M. (1995, Feb). Effects of animation & visuals on learning high school mathematics. Paper presented at the Annual Meeting of the Association for Educational Communications and Technology, Anaheim, CA.
Rieber, L. P. (1989, Feb). The effects of computer animated lesson presentation and cognitive practice on adult learning in physical science. Paper presented at the Annual Meeting of the Association for Educational Communications and Technology, Dallas, TX.
Rieber, L. P. (1989, Feb). The effects of computer animated lesson presentations and cognitive practice activities on young children's learning in physical science. Paper presented at the Annual Meeting of the Association for Educational Communications and Technology, Dallas, TX.
Rieber, L. P. (1989). The effects of computer animated elaboration strategies and practice. Journal of Educational
Computing Research, 5(4), 431-444.
Rieber, L. P. (1990). Using computer animated graphics in science instruction with children. Journal of Educational
Psychology, 82(1), 135-140.
Rieber, L. P. (1991). Animation, incidental learning, and continuing motivation. Journal of Educational Psychology, 83(3), 318-328.
Rieber, L. P. (1991). Effects of visual grouping strategies of computer-animated presentations on selective attention
in science. Educational Technology, Research, and Development, 39, 5-15.
Rieber, L. P. & Boyce, M. J. (1991). The effects of computer-based interactive visuals as orienting and practicing
activities on retrieval tasks in science. International Journal of Instructional Media, 18(1), 1-17.
Rieber, L. P., Boyce, M. J., & Assad, C. (1990). The effects of computer animation on adult learning and retrieval
tasks. Journal of Computer-Based Instruction, 17(2), 46-52.
Rosenthal, R. (1991). Meta-analytic procedures for social research, CA: Sage.
Sperling, R. A., Seyedmonir, M., Aleksic, M., & Meadows, G. (2003). Animations as learning tools in authentic
science materials. International Journal of Instructional Media, 30(2), 213-221
Spotts, J. & Dwyer, F. (1996). The effect of computer-generated animation on student achievement of different
types of educational objectives. International Journal of Instructional Media, 23(4), 365-375.
Shadish, W. R., & Haddock, C. K. (1994). Combining estimates of effect size. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 261–281). New York: Russell Sage.
Szabo, M. & Poohkay, B. (1996). An experimental study of animation, mathematics achievement, and attitude
toward computer-assisted instruction. Journal of Research on Computing in Education, 28(3), 0888-6504.
Thompson, S. V. & Riding, R. J. (1990). The effect of animated diagrams on the understanding of a mathematical
demonstration in 11- to 14-year-old pupils. British Journal of Educational Psychology, 60(1), 93-98.
Wender, K. F. & Muehlboeck, J. S. (2003). Animated diagrams in teaching statistics. Behavior Research Methods,
Instruments, & Computers, 35(2), 255-258.
Wilcox, D. M. (1997). The use of animation with instruction and feedback in fractions software for children.
Dissertation Abstracts International, DAI-A 58/08, 3099.
Williamson, V. M. & Abraham, M.R. (1995). The effects of computer animation on the particulate mental models of
college chemistry students. Journal of Research in Science Teaching, 32, 521-534.
Wong, A. Y. K. (1994). The use of animation in computer assisted instruction. Dissertation Abstracts International,
DAI-A 55/12, 3794.
Wright, P., Milroy, R., & Lickorish, A. (1999). Static and animated graphics in learning from interactive texts.
European Journal of Psychology of Education, XIV(2), 203-224.
Yang, E. & Andre, T. (2003). Spatial ability and the impact of visualization/animation on learning electrochemistry.
International Journal of Science Education, 25(3), 329-349.
Effects of Integrated Motivational and Volitional Tactics on Study Habits,
Attitudes, and Performance
John M. Keller
Florida State University
Markus Deimann
Erfurt University
Germany
Zhu Liu
Florida State University
Abstract
A continuing challenge is how to stimulate and sustain learner motivation and persistence in
undergraduate general education courses. Most controlled research studies do not generalize to this setting because
they typically implement treatments of 30 to 50 minutes so that they can be completed in a single class period
(Azevedo & Cromley, 2004). The challenges to motivation that occur during a semester-length course or a
significant portion of the course are much different from a “single sitting” research study in which there is hardly
time to overcome the novelty effects of an intervention before the experiment is finished. Also, in a longer study, the
motivational factors that are present at the beginning of a learning experience cannot be expected to persist over a
long period of time unless other things are done to help sustain learner motivation and persistence. The present
study took place over a four-week module in a large undergraduate course and incorporated a variety of tactics
designed, in accordance with supporting theories, to assist students in maintaining their motivation and self-
regulatory habits during this period of time. To provide a means for the rational selection and creation of
motivational and volitional tactics, the ARCS model (J. M. Keller, 1987, 2004) of motivational design was expanded
to incorporate the volitional theories of Gollwitzer (1999) and Kuhl (1987). The effectiveness of this approach was
tested by distributing the strategies as “motivational messages” (Visser & Keller, 1990) in the form of “Study Tips”
via email to the participants in this study. The primary finding was that students who opened the study tips emails
increased their study time, maintained confidence, and improved their test scores compared to those who did not
open them. This has positive implications for sending motivational and volitional study tips directly to students
while they are in the process of studying a course.
Introduction
Background
Historically, motivation was considered to have two levels. The first is “will,” which refers to a person’s
desires, wants, or purposes together with a belief about whether it is within one’s power to satisfy the desire, or
achieve the goal (James, 1890; Pintrich & Schunk, 2002). The second level is the act of using the will, or
“volition,” which refers to a process for converting intentions into actions. In some cases, the mere saliency of a
desire is sufficient to lead more or less automatically to action, but often, as William James (1890) pointed out, it is
necessary to have a conscious effort supported by determination or extrinsic requirements to convert intentions into
action.
Much of motivation research has focused on understanding what people’s goals are and why they choose
them. For example, the original conceptualization of “will” as being a combination of desires and beliefs about
being able to achieve them is reflected in expectancy-value theory which postulates that behavior potential is a
function, assumed to be multiplicative, of the perceived importance of a given goal in relation to other goals (value)
and one’s subjective probability of being able to achieve the goal (expectancy). While this theory has had a powerful
influence in motivational theory, its sole contribution is in explaining how people choose a particular goal or set of
goals. It does not fully explain volition, or what impels people to action and keeps them working persistently to
achieve a goal. Consequently, a distinction between “selection motivation” and “realization motivation” has to be
made (Kuhl, 1985). Modern conceptions of volition such as action control (Kuhl, 1987), implementation intentions
(Gollwitzer, 1999), as well as work on self-regulation (Zimmerman, 1998a) are based upon this distinction. All of
these pertain to the problem of maintaining goal-oriented behavior and overcoming discouragement and attrition,
problems that have been experienced especially in self-directed learning environments including e-learning, and
even classroom courses that put a high level of scheduling control into the students’ hands or in which there are
large numbers of students who are taking the course to meet a requirement.
Kuhl (1985) defines volition as a mediating factor that “energizes the maintenance and enactment of
intended actions" (Kuhl, 1985, p. 90) and therefore goes beyond motivation. In other words, strong motivation is a
necessary yet not a sufficient condition. Wolters (1998) commented about how students can express strong desires to
accomplish a goal but have a very difficult time in managing competing goals and distractions that interfere with
their academic work. Similarly, Pintrich and Garcia (1994) pointed out that the influence of volition becomes even
more important for college students “who, when you talk to them, are very motivated and concerned about doing
well, but often have a very difficult time enacting their intentions, given all the internal and external distractions
they confront in college life” (p. 126f). These observations are, of course, readily apparent to anyone, teachers or
counselors, who try to facilitate change in people. The interesting point is that this phenomenon has been coming
under greater and greater scrutiny in psychological research. Kuhl’s action control theory was developed to bridge
the intention-behavior gap and to help people overcome maladaptive behaviors in their life. Even though his theory
is only recently being applied to learning environments and has not yet been applied in multimedia settings, it has
served as a foundation for the work of Zimmerman (1998b) and Corno (2001) who study volitional behaviors in
self-regulated learning.
In the theory of action control, Kuhl (1987) specifically addresses the question of what factors influence a
person’s continued and persistent efforts to accomplish a goal. Kuhl’s theory postulates six action control strategies
which can be employed as soon as an action tendency achieves the status of a current intention (by committing to
the action). In other words, commitment to achieving a given goal is a prerequisite to employing the set of action
control strategies, which are:
1. Selective attention: also called the “protective function of volition” (Kuhl, 1984, p. 125): it shields the current
intention by inhibiting the processing of information about competing action tendencies.
2. Encoding control: facilitates the protective function of volition by selectively encoding those features of
incoming stimulus that are related to the current intention and ignoring irrelevant features.
3. Emotion control: managing emotional states to allow those that support the current intention and to suppress those, such as sadness or attraction to a competing intention, that might undermine it.
4. Motivation control: maintaining and reestablishing saliency of the current intention, especially when the
strength of the original tendency was not strong (“I must do this even though I don’t really want to.”)
5. Environment control: Creating an environment that is free of uncontrollable distractions and making social
commitments, such as telling people what you plan to do, that help you protect the current intention.
6. Parsimonious information processing: Knowing when to stop, judging how much information is enough, and making decisions that maintain active behaviors in support of the current intentions.
Kuhl assumes that processes of action control underlie virtually any kind of activity, but especially those in
which the person faces difficulties and hindrances. The effectiveness of employing action control strategies has been
confirmed in many studies in a variety of behavior change settings (Kuhl, 1987) as well as in educational settings
(Corno, 2001; Kuhl, 1984; Zimmerman, 1998a). However, action control theory does not provide detailed
examination of intention commitment, or implementation intentions. For this, Gollwitzer’s work (Gollwitzer, 1999)
on volition is helpful.
The first step in moving from desire to action, that is, from the identification and acceptance of a personal
goal to a set of actions to accomplish the goal is that of intention formation. On the one hand, the concept of “good
intentions” is used as a rationalization when things go wrong, or an excuse for not taking action as in the expression,
“the road to hell is paved with good intentions.” But, on the other hand, intentions can be a powerful influence on
goal accomplishment. In a laboratory study with preschool children who were asked to work on a repetitive, boring
task that was interrupted with a tempting distraction (a clown head encouraging children to select and play with toys
instead of working on their assigned task), Patterson and Mischel (1976) tested the effects of
task-facilitating intentions versus temptation-inhibiting intentions. The children were told that a clown box might
tempt them to stop working. The task-facilitating group was told to keep their attention on the task if this happened,
and the temptation-inhibiting group was told to direct their attention away from the clown box. This study and
subsequent research (Gollwitzer & Schaal, 2001) show that temptation-inhibiting intentions have the superior effect
no matter whether motivation to perform the task is high or low.
Adding volition to the motivational design process may be of particular benefit to students in large
undergraduate lecture courses in which many of the students are enrolled to fulfill a general education requirement
rather than being in their major area of interest. Problems in these courses include such things as procrastination,
ineffective study habits, lack of perceived relevance of the content to their lives, low personal priority for the course
requirements, and not knowing how to build resistance against distractions that occur during their available time for
study. The work of Zimmerman (1998a), Corno (1993), and others on self-regulation has had some success in
improving volitional behaviors, but the problems persist, especially when one moves outside the controlled study
environment to an actual classroom.
Another major issue in research on self-regulated learning pertains to the availability of volitional
strategies. Previous research findings indicate that learners do not possess adequate strategies to deal with external or internal interferences (Bannert, 2004). Therefore, providing learners with volitional strategies can help in establishing
volitional competence. Moreover, much of the previous research in the areas of motivation and volition deals with
isolated aspects of attitudes and behavior instead of being grounded in a more holistic theory of motivation and
volition. Also, the interventions tend to be presented at the beginning of the treatment (e.g. Azevedo & Cromley,
2004). The present study, in contrast to this research, expands the ARCS model of motivational design to include
volitional design, and also distributes strategies in two different ways. One approach was to bundle all of the
strategies, called ‘Study Tips’, into one booklet and send it as an email attachment at the beginning of the treatment
period. The second approach was to distribute the strategies throughout the four-week treatment period via email at
those times when the strategies would be most likely to be immediately useful. We also included a placebo group
which received messages with information and humor that was related to the topic of the course but tangential to its
formal content and tests. The purpose of having a placebo group was to control for potential reactive effects that
might result from the novelty of sending numerous and diverse emails to the class, regardless of their content. It is
common in studies of motivation to fail to control for novelty effects, but in this study all three treatment groups
received the placebo messages to determine whether the designed motivational and volitional messages in the
distributed and bundled treatments had an effect independently of the novelty influences.
In summary, the purpose of this study was to test the effectiveness of a combined set of motivational and
volitional strategies on the motivation and persistence of a group of undergraduate students in a general education
course that satisfies one of their curricular requirements. It was expected that the blending of motivational and
volitional strategies and distributing them at the most appropriate times would result in higher levels of
improvements in study habits, attitudes toward the course, and learning performance than when bundled and
distributed all at once in a booklet, but that both of these treatments would be superior to the placebo group.
Method
Participants
Participants in this study were 90 of the 115 students in an undergraduate archaeology course who
indicated their willingness to participate by filling out a pre-treatment questionnaire of study habits, volitional
habits, and course-specific motivational attitudes. Twenty-five of the original participants were eliminated because
they failed to return 3 or more of the 10 weekly logbooks.
Research Design
In the first set of analyses, there was one independent variable, message type, with three levels: bundled
messages, distributed messages, and placebo. For the second set of analyses, there was one independent variable,
study tip use, with two levels: opened study tips versus unopened study tips. Repeated measures analyses were
conducted in both sets of analyses because pre- and post-measures were taken on each of the dependent variables
consisting of study habits as measured by study time, three components of motivational attitudes toward the course
(interest, relevance, and confidence) as measured by the appropriate scales in the Course Interest Survey (Keller &
Subhiyah, 1993), and achievement as measured by test grades.
result was that fewer than half of the participants did so. Therefore, the researchers decided to add an ad hoc
independent variable which was study tip use with two levels consisting of those who looked at the study tips and
those who did not. Since the means of the two groups were almost identical (M(bundled) = 1.68; M(distributed) = 1.67) with
respect to how many opened the study tips (1 = yes; 2 = no) the distinction between bundled and distributed was not
used in the analyses of this independent variable.
The first dependent variable was Study Time. Based on the self-reported data in the participant logbooks
which were submitted by weekly email, the study time prior to the first test was compared to the study times from
the first to the second test. Participants reported time spent studying the text and time spent on a special project
assigned to the class. These were summed to compute total study time.
The second dependent variable was measured by using the attention, relevance, and confidence subscales
from the Course Interest Survey. The satisfaction scale was not used because it was not pertinent to this particular
study. The CIS is a situation-specific survey which has satisfactory reliability estimates as measured by Cronbach's alpha (r(attention) = .84, r(relevance) = .84, r(confidence) = .81). Each of these subscales was used as a separate measure.
The third dependent measure was test grade on Test 1 compared with Test 2. These tests were those used by the
instructor in the normal process of teaching and assessing. The researchers did not modify the tests and were not
present when they were administered.
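For readers unfamiliar with the formula, Cronbach's alpha for a subscale can be computed from a respondents-by-items score matrix as in the following illustrative sketch (the function and the example data are hypothetical, not the actual CIS scoring code):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) array of item scores."""
    items = np.asarray(scores, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses from five students to three confidence items (1-5 scale)
example = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]]
print(round(cronbach_alpha(example), 2))
```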
All of these analyses were conducted with repeated measures using the general linear model to control for
differences in the pre-treatment scores and to determine whether there were significant shifts within and between
groups. Even though there were multiple dependent variables, MANOVA was not considered given that this was an
exploratory study and the number of participants would not support it. Also, for these same reasons, an alpha level of .10 was chosen in place of the customary .05. The findings of this study will provide a basis for future,
more tightly controlled studies.
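One way to approximate the repeated measures general linear model described above is an ANCOVA-style model that predicts each post-treatment score from the pre-treatment score and group membership. The sketch below is illustrative only; the column names and data are hypothetical, and the original analyses were presumably run in a standard statistical package rather than Python:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical data frame: one row per participant, with pre/post study time
# and whether the participant opened the study tips (1 = yes, 2 = no).
df = pd.DataFrame({
    "opened_tips": [1, 1, 1, 2, 2, 2, 1, 2],
    "study_pre":   [10.0, 12.5, 9.0, 11.0, 13.0, 10.5, 8.0, 12.0],
    "study_post":  [14.0, 15.5, 12.0, 10.0, 11.5, 9.5, 13.0, 10.5],
})

# General linear model: post score as a function of the pre score (covariate)
# and group; the group term tests for a between-group shift after
# controlling for pre-treatment differences.
model = smf.ols("study_post ~ study_pre + C(opened_tips)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```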
Materials
The materials used in this study for collecting data consisted of weekly logbooks that were sent to the
participants by email and which were returned via email by the participants to the researchers. The researchers set up
a second course website using Blackboard, which is the system used by this university. It was identical to the
instructor’s primary website except that she did not have access to it. Thus, the participants were assured of
confidentiality in their responses. The lead researcher had access to the instructor’s website in order to get copies of
grades.
Study tips were created in accordance with the motivational and volitional strategies that were selected for
use with these participants. These decisions were based upon audience information obtained from interviews with
the course instructor and her graduate student, as well as the researcher’s knowledge of relevant research and direct
experience with similar audiences. A total of six strategies were produced. Each of these consisted of two or more
pages of information and graphics. All of them were put together into one package for the bundled group and kept
separate for the distributed group. The only other difference between the two groups was that in the emails that
contained these strategies there were slightly different comments due to the bundled versus distributed situations.
The titles, motivational and volitional focus, and brief explanatory comments are contained in Table 1.
Procedure
On the first day of class, the researchers and two additional persons attended to pass out and collect a
survey of study habits and attitudes and course-specific motivation. Participation was voluntary. If students filled out
and returned the questionnaires, it indicated their willingness to participate. These measures were not used in the
present study. Also, this was the only time the researchers had face-to-face contact with the class.
Beginning immediately after Week 1, logbooks were sent to students each week. The contents always
included questions about time spent studying. Some logbooks contained other questions pertaining to motivation and
other attitudes.
The logbook that was distributed at the end of the third week class asked for study times and also asked
about motivational attitudes (interest, relevance, and confidence). These served as the pre-measures for this study.
The first test was given during the following (fourth) week of class.
The logbook that was distributed at the end of the seventh week class once again asked for study times and
also asked about motivational attitudes (interest, relevance, and confidence). These served as the post-measures for
this study. The second test was given during the following (eighth) week of class.
One week after the second test, all students in the class, including the placebo group and non-participants,
were informed about the study tips and how to access them on the website. This was to control for the potential
ethical problem of one group receiving a favored treatment and to assist interested students in preparing for the final
two tests and term project.
Results
There were two sets of analyses. The first was based on the first independent variable which was message
type. Based on the repeated measures analyses, there were no significant differences among the three message type
groups with respect to study time, interest, relevance, or test scores. With respect to confidence, there were no
significant differences between groups, but the mean confidence level decreased significantly, F(1,76)= 6.80,
p=.011.
In the study tips usage groups, there were several significant differences between the participants who
opened the study tips attachments and those who did not. First, with regard to study time, there was a significant
interaction effect, F(1, 25) = 8.04, p = .009, such that study time increased for those who opened the study tips and decreased for those who did not open them.
Figure 1. Estimated marginal means of study time (measures 1 and 2) for participants who opened versus did not open the study tips.
There were no differences between the two groups in interest or relevance, but there was a difference in
confidence. There was a significant interaction, F(1,38)= 3.43, p=.072. Those who opened the study tips scored
lower on the pre-measure than those who did not open them, but their confidence increased slightly on the post-
measure while the scores of those who did not open the study tips decreased dramatically (Figure 2).
Figure 2. Confidence differences between study tip groups (estimated marginal means).
There was also a significant difference in test scores, F(1,38)= 9.00, p=.005, in that both groups scored
higher on Test 2 than Test 1. The interaction was not significant even though the magnitude of improvement in the
“opened study tips” group was greater than the “did not open” group (Figure 3).
Figure 3. Estimated marginal means of test scores (Test 1 and Test 2) for participants who opened versus did not open the study tips.
Discussion
Results indicate that the combined set of motivational and volitional strategies contributed to improving
students’ study habits, attitudes toward the course, and learning performance. This indication is supported by the
results that students in the treatment group who opened the study tips spent more time studying, had increased confidence, and performed better on the tests than students who did not open the study tips. Although confidence dropped overall in the three message-type groups, this is probably because students were overconfident at first, and it is not surprising that confidence would drop after taking the first test and discovering that their grades were not as high as they had, perhaps, hoped. According to the instructor, some students choose to take this course for one of their general education requirements because they expect it to be an easy course, and perhaps they think it will be exciting like the action adventure movie "Raiders of the Lost Ark," which has a strong archaeological theme. But the students find that it is not easy and that it is filled with highly technical detail. The first measure of confidence was taken before the first test, when students had just started the course and were, apparently, overconfident. The second measure was taken right before the second test, when confidence would be lower due to the students' experience of the first test results. It is worth noting that the students who chose to open the study tips maintained and slightly increased their confidence. This further confirmed that the combined set of motivational and volitional strategies can have a positive impact on maintaining students' motivation.
In contrast to the expectations of this research, there were few differences among message-type treatment
groups concerning study habits, interest, confidence, relevance, and grades. One reason might be limited participation: relatively few students opened the study tips containing the combined sets of motivational and volitional strategies. The limited participation could be due to several reasons: 1) students may have been confused by the variety of emails they received, from instructors and other senders; 2) some students will not open an attachment unless it seems important or crucial to them; and 3) some students may have been afraid to open the attachments because of concerns about viruses. Future research could consider exerting more control over the situation, ensuring that students actually receive and examine the sets of study tips. Future research could also improve the implementation of the treatments (sometimes the time gaps between messages were too long). In addition, future research should explore better ways to deliver study tips and messages than simply sending them through email. Even so, the results of this study
support the feasibility and effectiveness of incorporating combinations of motivational and volitional messages into
packages of information that are distributed in the form of “motivational messages.”
References
Azevedo, R., & Cromley, J. G. (2004). Does training on self-regulated learning facilitate students' learning with
hypermedia? Journal of Educational Psychology, 96(3), 523-535.
Bannert, M. (2004). Designing Metacognitive Support for Hypermedia Learning. In H. M. Niegemann, R. Brünken
& D. Leutner (Eds.), Instructional Design for Multimedia Learning. Proceedings of the EARLI SIG 6
Biannual Workshop 2002 in Erfurt. Münster: Waxmann.
Corno, L. (2001). Volitional Aspects of Self-Regulated Learning. In B. J. Zimmerman & D. H. Schunk (Eds.), Self-
Regulated Learning and Academic Achievement. Theoretical Perspectives (Second Edition) (pp. 191-226).
Mahawah, N.J.: Erlbaum.
Corno, L., & Kanfer, R. (1993). The role of volition in learning and performance. Review of Research in Education,
19, 301-341.
Gollwitzer, P. M. (1999). Implementation Intentions. Strong Effects of Simple Plans. American Psychologist, 54(7),
493-503.
Gollwitzer, P. M., & Schaal, B. (2001). How Goals and Plans Affect Action. In J. M. Collis & S. Messick (Eds.),
Intelligence and Personality. Bridging the Gap in Theory and Measurement (pp. 139-161). Mahwah, N.J.:
Erlbaum.
James, W. (1890). The principles of psychology (Vol. 2). New York: Henry Holt.
Keller, J. M. (1987). Development and use of the ARCS model of motivational design. Journal of Instructional
Development, 10(3), 2 - 10.
Keller, J. M. (2004). A predictive model of motivation, volition, and multimedia learning. In Proceedings of the
International Symposium & Conference, Educational Media in Schools (pp. 9-19): Osaka, Japan: Kansai
University.
Keller, J. M., & Subhiyah, R. (1993). Course interest survey. Tallahassee, FL: Instructional Systems Program,
Florida State University.
Kuhl, J. (1984). Volitional aspects of achievement motivation and learned helplessness: Toward a comprehensive
theory of action control. In B. A. Maher & W. B. Maher (Eds.), Progress in Experimental Personality
Research (pp. 101-171). Orlando: Academic Press.
Kuhl, J. (1985). Volitional mediators of cognitive-behavior-consistency; self-regulatory processes and action versus
state orientation. In J. Kuhl & J. Beckmann (Eds.), Action control. From cognition to behavior (pp. 101-
128). Berlin: Springer.
Kuhl, J. (1987). Action control: The maintenance of motivational states. In F. Halisch & J. Kuhl (Eds.), Motivation,
Intention and Volition (pp. 279-291). Berlin: Springer.
Patterson, C. J., & Mischel, W. (1976). Effects of temptation-inhibiting and task-facilitating plans on self-control.
Journal of Personality and Social Psychology, 33, 209-217.
Pintrich, P. R., & Garcia, T. (1994). Self-regulated learning in college students: Knowledge, strategies, and
motivation. In C. E. Weinstein (Ed.), Student motivation, cognition, and learning. Hillsdale, NJ: Erlbaum.
Pintrich, P. R., & Schunk, D. H. (2002). Motivation in Education. Theory, Research, and Applications (Vol. 2).
Upper Saddle River, N.J.: Merrill Prentice Hall.
Visser, J., & Keller, J. M. (1990). The clinical use of motivational messages: an inquiry into the validity of the
ARCS model of motivational design. Instructional Science, 19, 467-500.
Wolters, C. (1998). Self-regulated learning and college students' regulation of motivation. Journal of Educational
Psychology, 90, 224-235.
Zimmerman, B. J. (1998a). Academic Studying and the Development of Personal Skill: A Self-Regulatory
Perspective. Educational Psychologist, 33(2/3), 73-86.
Zimmerman, B. J. (1998b). Developing Self-Fulfilling Cycles of Academic Regulation: An Analysis of Exemplary
Instructional Models. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-Regulated Learning. From
Teaching to Self-Reflective Practice (pp. 1-19). New York: Guilford Press.
The Effects of Critical Reflection Thinking On Learner's Metacognition
Aaron Kim
Florida State University
Abstract
The purpose of this study is to examine the effects of critical thinking prompts on learners’ metacognition.
In this study, a critical thinking prompt is defined as a learner support that stimulates reflection upon
comprehension, reading process and strategies, purpose of reading, progress of learning, and other aspects of
reading comprehension. Critical thinking prompts used in this study are embedded questions in the text for critical
thinking. Metacognition of the students is defined as the use and awareness of reading strategies.
reflect on their understanding and learning process? Or more generally, what kind of metacognitive support for
reflection can we use?
One promising way of facilitating reflection on a learner’s understanding and learning processes is
providing embedded prompts or cues that help learners engage in critical thinking. Critical thinking is generally defined as reasoning and judgment that involves processes of interpretation, analysis, evaluation, explanation, and inference about a given matter (Facione, 1990; Kuiper, 2003). Critical thinking that leads to reflection is one of the
strategies for deeper understanding and meaningful learning along with planning, monitoring, and self-regulation
(Pintrich & Schrauben, 1992).
In fact, in some studies critical thinking prompts were found to promote learner reflection.
For example, one study investigated the patterns and levels of reflection of college students who received different
design features and critical thinking learner supports over two semesters (Whipp, 2003). Based on analysis of e-mail
discussion threads, student surveys, and portfolio papers, the researcher found that the students who received a number
of critical thinking learner supports during the second semester wrote e-mails and assignment papers at higher
levels of reflection than the group that did not have the learner supports. The researcher then identified four
important supports for higher levels of reflection: tailored questioning, general questioning, use of critical reading,
and threads of online discussion at a higher level of reflection (Whipp, 2003). Tailored questioning and general
questioning served here as the critical thinking prompts, giving the students more time for reflection on various
aspects of their learning.
Thus, in summary, critical thinking allows learners more opportunities to reflect upon their understanding,
to monitor the cognitive activities, and to choose and apply proper skills, which are clearly metacognitive.
Therefore, providing critical thinking prompts to the learners may stimulate the reflection, and in turn lead learners
to go through metacognitive activities if the learners reflect on their quality of understanding, learning process, and
other related aspects. These metacognitive activities are what earlier studies refer to as metacognitive experiences.
Through these activities, learners can increase their metacognitive awareness and develop metacognition
over time (Flavell, 1987).
Even though many researchers have reported that critical thinking is closely related to reflection and metacognition
and have speculated that critical thinking supports promote metacognition, there is only a limited amount of empirical
research on the association between critical thinking and metacognition, especially experimental research. In other
words, most past studies did not experimentally examine the association between critical thinking and
metacognition. Therefore, it may be valuable to conduct an experimental study focusing on how critical thinking
affects metacognition.
Thus, the purpose of this study is to examine the effects of critical thinking prompts on learners'
metacognition. In this study, a critical thinking prompt is defined as a learner support that stimulates reflection upon
comprehension, reading process and strategies, purpose of reading, progress of learning, and other aspects of reading
comprehension. Critical thinking prompts are embedded questions for critical thinking. Metacognition of the
students in this study is defined as the use and awareness of reading strategies. It was measured by a self-report
instrument, the Metacognitive Awareness of Reading Strategy Inventory, developed and validated by Mokhtari and
Reichard (2002). The inventory was administered after the treatment to assess the students’ actual use of reading
strategies and awareness of them.
In this study, it was expected that critical thinking learner supports would stimulate the students’
metacognitive activities and consequently promote metacognition. More specifically, it was expected that
metacognition activated by the students during the instruction would be greater among the students who studied the
instructional material with the critical thinking prompts embedded in it. Since the critical thinking prompts are
tailored questions and general questions regarding the important points of the given materials, they are expected to
provide more opportunities for reflection. As stated earlier, reflection upon various aspects of reading
processes and comprehension constitutes metacognitive activities or experiences that promote awareness of
metacognition and develop metacognition (Flavell, 1987; Lin, 2001; Lin et al., 1999; Schraw, 1998). Since
metacognition is defined as the use and awareness of reading strategies in this study, critical thinking prompts were
expected to help the students use more reading strategies and be aware of strategies that they used during the
instruction.
Method
Participants
Participants were 47 college freshmen at a southeastern university. Thirty-nine of the participants were female, and
all of them were 19 or 20 years old. Participants' general ability for academic tasks was considered above the
average of the general population. The course the participants were enrolled in was an introductory American history
course, which was required. The participants did not know they were involved in an experimental process.
Independent Variables
The independent variable used for this study was the learner support, with two levels: learner support
absent or present. The learner support employed was the critical thinking prompt. The treatment group received the
instructional material that included critical thinking prompts embedded within the text, while the control group
studied material that had only text. The prompts were placed within the text where the relevant contents were
described and discussed. Each prompt was put in a rectangular box and set in bold font in order to distinguish the
prompt from the content text. The control group had only the stories, without any learner support.
Dependent Measures
There is one dependent variable in this study, which is metacognition. As defined earlier, metacognition is
the strategies that the students used during the instruction and awareness of those strategies. As a result of the
learner supports, the students were expected to use more strategies and to be aware of them. For example, students
may underline or circle the key information in the text in response to a critical thinking prompt, which is a question
about the main points of the article. Students also may go back and forth in the text to articulate differences among
ideas in it because of a given prompt. Thus, the current study measured the kinds of strategies that the students
used during the instruction, and their awareness of those strategies, with a self-report survey. The survey, called the
Metacognitive Awareness of Reading Strategy Inventory, was developed and validated by Mokhtari and Reichard
(2002). The inventory was administered after the treatment to assess the students' actual use of reading strategies
and their awareness of them. It originally consists of 30 items categorized into three strategy groups: global reading
strategies (13 items), problem-solving strategies (8 items), and support reading strategies (9 items). However, some
of the items are not relevant to the current research. For example, the original item "I use tables, figures, and
pictures in text to increase my understanding" is not relevant in this study because the material did not have any
tables, figures, or pictures. Those irrelevant items were deleted from the survey, and the final survey had 28 items.
In order to ensure the accuracy of the responses and to focus the responses on the instructional material studied
during the last three classes, the tense of each item was changed to the past tense. In addition, the directions for the
survey specified the instructional material the students studied as the focus of the survey. The internal consistency
reliability for the measurement was determined to be 0.79.
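As an aside for readers who wish to check such a reliability figure, the internal consistency of the modified 28-item
inventory can be computed directly from the item-level responses. The short Python sketch below shows one way to
do this; the response matrix is randomly generated and the function and variable names are invented for this sketch,
so it illustrates the calculation rather than reproducing the study's data.

    import numpy as np

    def cronbach_alpha(responses: np.ndarray) -> float:
        """Cronbach's alpha for a (respondents x items) matrix of Likert responses."""
        k = responses.shape[1]                          # number of items (28 after deletions)
        item_variances = responses.var(axis=0, ddof=1)
        total_variance = responses.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

    # Illustrative call with random 1-5 responses for 47 respondents and 28 items;
    # random, uncorrelated items will of course yield a much lower alpha than .79.
    rng = np.random.default_rng(0)
    fake_responses = rng.integers(1, 6, size=(47, 28)).astype(float)
    print(round(cronbach_alpha(fake_responses), 2))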
Procedures
The students participated in the experiment during three class meetings. Each class meeting was about
90 minutes long, with two days between meetings. Participants studied the instructional material for about
50-55 minutes each time and then joined the regular class lecture. After the last class, the reading strategy inventory
was handed out as homework, due by the next class meeting two days later. The students completed the survey and
the scoring rubric and turned them in. The directions for the survey specifically asked what they did when they read
the short stories handed out during the last three classes, so that the students would focus on the instructional
materials used for the experiment. The instructor for the course randomly distributed to the students two different
material packages that had been numbered serially. Odd-numbered material was for the control group and even-
numbered material was for the treatment group.
Results
The dependent variable in this study was the learner’s metacognition, which was defined as the use and the
awareness of various reading strategies. It was measured by a 28-item self-report questionnaire, the Metacognitive
Awareness of Reading Strategy Inventory, administered at the end of the last class of the three classes, in which the
experiment was conducted. Table 1 presents the means and standard deviations for both the control and treatment
groups on the questionnaire. Reliability of the instrument was .91 (standardized item alpha). Preliminary analysis of
the data did not indicate any serious violation of normality or equal variance. To analyze the data, an independent-
samples t-test comparing the group means was employed. With alpha set at .05, and with 21 participants in the
control group and 25 in the treatment group, the probability of detecting a small difference between means was .65.
Results of the t-test indicated no significant main effect for critical thinking prompts on metacognition, t(44) = -1.8,
p = .08.
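To make the reported analysis concrete, the following Python sketch runs an independent-samples t-test on two
fabricated score vectors of the same sizes as the study's groups and then estimates statistical power for an assumed
effect size; the data, the effect size, and the variable names are illustrative assumptions, not the study's values.

    import numpy as np
    from scipy import stats
    from statsmodels.stats.power import TTestIndPower

    rng = np.random.default_rng(1)
    control = rng.normal(95, 15, size=21)      # fabricated inventory totals, control group (n = 21)
    treatment = rng.normal(102, 15, size=25)   # fabricated inventory totals, treatment group (n = 25)

    t_value, p_value = stats.ttest_ind(control, treatment)
    print(f"t({len(control) + len(treatment) - 2}) = {t_value:.2f}, p = {p_value:.3f}")

    # Post hoc power at alpha = .05 for these group sizes and an assumed standardized effect size
    power = TTestIndPower().solve_power(effect_size=0.5, nobs1=21, ratio=25 / 21, alpha=0.05)
    print(f"power = {power:.2f}")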
Discussion
The purpose of this study was to examine the effects of critical thinking prompts on learners' metacognition.
There was no significant difference in the mean questionnaire scores between the control and treatment
groups. The reason may be that the material was too short and too easy to read and understand. It described stories
about actual people or interesting events that some people experienced, which may have made it interesting to the
readers and easy to follow. For example, one story was about a woman who grew up on a small Caribbean island and
what she experienced coming to the U.S. The other articles used were also composed of brief arguments and
thoughts. Thus, the overall difficulty of reading those stories was not high enough to require students to use various
strategies for reading and following the focus and the theme of the stories.
Another possible reason for the results may be that some items are not directly relevant to the actual
strategies that the students used or would use if necessary. In particular, five items were identified as irrelevant based
on their scores on the questionnaire and the nature of the instructional materials. They are: "I take notes while reading
to help me understand what I read."; "When text becomes difficult, I read aloud to help me understand what I read.";
"I discuss what I read with others to check my understanding."; "I underline or circle information in the text to help
me remember it."; and "I use reference materials such as dictionaries to help me understand what I read." In other
words, given the nature of the tasks, the students did not need to use some of the strategies. For example, the short
stories were not technical documents or textbooks, which would require concentration and a higher rate of recall of
specific information. The stories used in the material were short pieces aimed at conveying ideas about diversity and
the themes of the articles, as stated earlier. Thus, the students did not have to "take notes" or "circle or underline
information in the text" to mark key information as they would while preparing for an exam. Moreover, the students
did not need to "discuss with others" or "read aloud" because they were not allowed to do so as a matter of classroom
rules or courtesy.
As a result, the scores for those five items were the lowest of all in both the control and treatment groups.
Additionally, it is suspected that the nonsignificant result stems from various sources such as a lack of treatment
power, the design of the instructional materials, the students' prior level of metacognition, and so forth.
Despite the nonsignificant results, a review of the scatterplots and the statistical test indicated a noticeable
difference between the two groups. Most observations for the control group were distributed more widely and lower
than those for the treatment group. Without one particular observation, the difference is even clearer. The group that
had critical thinking prompts reported higher total questionnaire scores, and higher maximum and minimum scores
with similar variance, than the control group that had only text. The outlying observation scored 17 points higher
than the maximum score of the control group, while the difference between the lowest and the second lowest score
is only 7 points. Its standardized residual is 2.35, which is slightly below the criterion for an outlier. Without it, there
is a significant difference between the groups, t(43) = -2.29, p < .05, which could mean that there is a main effect for
critical thinking prompts on metacognition, as hypothesized.
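A reader could reproduce the kind of outlier check and follow-up comparison described here with a few lines of code.
The sketch below flags any case whose standardized score exceeds the 2.35 value mentioned in the text and reruns
the t-test without it; the score arrays are fabricated, and 2.35 is used only to echo the paper, not as a general rule.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    control = rng.normal(92, 14, size=21)      # fabricated control-group totals
    treatment = rng.normal(100, 14, size=25)   # fabricated treatment-group totals

    # Standardize the treatment scores and drop any case beyond the cutoff noted in the text
    z = (treatment - treatment.mean()) / treatment.std(ddof=1)
    keep = np.abs(z) < 2.35
    t_value, p_value = stats.ttest_ind(control, treatment[keep])
    print(f"t = {t_value:.2f}, p = {p_value:.3f} (cases dropped: {int(np.sum(~keep))})")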
In summary, the overall results seem to indicate that critical thinking prompts have the expected effects
on students' metacognition, as found in earlier studies (Whipp, 2003). Therefore, the implication of this study for
many different educational and training settings is the role of the critical thinking prompt as a simple and easy-to-
apply learner support in various formats. Instructional designers and teachers can embed this type of learner support
within learning materials as an easy first step toward promoting and developing learners' reflection, metacognition,
and possibly achievement. Especially given the growing need for self-study and lifelong learning, it is important to
utilize various learner supports in learning materials.
References
Brown, A. L., Bransford, J. D., Ferrara, R. A., & Campione, J. C. (1983). Learning, remembering and
understanding. In I. P. H. Mussen (Ed.), Handbook of psychology (Vol. 3, pp. 77-166). New York: Wiley.
Flavell, J. H. (1979). Metacognition and Cognitive Monitoring: A New Area of Cognitive Development Inquiry.
American Psychologist, 34, 906-911.
Flavell, J. H. (1987). Speculation about the nature and development of metacognition. In F. E. Weinert & R. H.
Kluwe (Eds.), Metacognition, motivation, and understanding (pp. 21-29). Hillsdale, NJ: Lawrence Erlbaum
Associates.
Garner, R. (1987). Metacognition and Reading Comprehension. Norwood, NJ: Ablex Publishing Co.
Lin, X. (2001). Designing Metacognitive Activities. Educational Technology Research and Development, 49(2),
23-40.
Lin, X., Hmelo, C., Kinzer, C., & Secules, T. (1999). Designing Technology to Support Reflection. Educational
Technology Research and Development, 47(3), 43-62.
Mokhtari, K., & Reichard, C. A. (2002). Assessing students' metacognitive awareness of reading strategies. Journal
of Educational Psychology, 94(2), 249-259.
Pintrich, P. R., & Schrauben, B. (1992). Students' motivational beliefs and their cognition engagement in classroom
academic tasks. In D. Schunk & J. L. Meece (Eds.), Student Perceptions in the Classroom (pp. 149-184).
Hillsdale, NJ: Lawrence Erlbaum Associates.
Schoenfeld, A. (1989). Teaching Mathematical Thinking and Problem Solving. In L. B. Resnick & L. E. Klopfer
(Eds.), Toward a Thinking Curriculum: Current Cognitive Research. Alexandria, VA: Association for
Supervision and Curriculum Development.
Schraw, G. (1998). Promoting general metacognitive awareness. Instructional Science, 26, 113-125.
Sternberg, R. J. (1985). Beyond IQ: A Triarchic Theory of Human Intelligence. New York: Cambridge University
Press.
Whipp, J. L. (2003). Scaffolding critical reflection in online discussions - Helping prospective teachers think deeply
about field experiences in urban schools. Journal of Teacher Education, 54(4), 321-333.
Using Motivational and Volitional Messages to Promote Undergraduate
Students’ Motivation, Study Habits and Achievement
ChanMin Kim
John M. Keller
Huei-Yu Chen
Florida State University
Abstract
This study investigated what kind of supportive information can be effective in improving situations
where there are severe motivational challenges. Motivational and volitional messages were constructed based on an
integrated model of four theories and methods, namely Keller's ARCS model (Keller, 2004), Kuhl's (1987) action
control theory, the Rubicon model of motivation and volition (Gollwitzer, 1999), and Visser & Keller's (1990)
strategy of motivational messages, and were distributed via email to a large undergraduate class, with personal
messages created based on audience analysis. In order to examine the effects of the messages on motivation, study
habits, and achievement, the motivational and volitional messages were sent to thirty students (Personal Message
Group: PMG) with personal messages and to seventy-one students (Non Personal Message Group: NonPMG)
without personal messages. Results indicated that PMG showed a greater increase in motivation, especially in
regard to confidence, than NonPMG. With regard to achievement, the mean test scores of PMG rose so that the
initial differences between the two groups significantly decreased. However, there was no difference between the
two groups in study habits. These findings suggest that personal messages addressing specific individual problems
raise the positive effects of the motivational and volitional messages constructed based on the integrated model.
Introduction
One of the difficulties in motivating students in large undergraduate lecture classes is that it is difficult to
establish personal contact with them, or to make them feel that their individual needs, interests and goals are being
addressed by the instructor. One potential way of improving upon this situation would be to use the Internet as a
means of sending supportive information directly to each student. However, in order to do this, it is necessary to
determine what kind of supportive information to send.
In an attempt to investigate what kind of supportive information can be effective in improving this kind of
situation where there are threats to motivation, this study constructed motivational and volitional messages based
on an integrated model of four theories and methods, which are Keller’s ARCS model (Keller 2004), Kuhl’s (1987)
action control theory, the Rubicon model of motivation and volition (Gollwitzer 1999), and Visser & Keller’s (1990)
strategy of motivational messages, and distributed the messages via email.
Specifically, one important feature of this study is the expansion of the ARCS model to include
motivational and volitional strategies. Recently, Keller (2004) has described the problems of sustaining learner
motivation and suggested that the ARCS model be expanded to include volitional concepts and strategies such as
those in Kuhl’s (1987) action control theory, and Gollwitzer’s (1999) theory of motivation and volition. In addition,
McCann and Turner (2004) recommend volitional strategies as a way of maintaining students’ motivation,
protecting against distractions, and developing their positive study habits.
Another important feature of this study is the process of creating those messages based on a systematic
motivational and volitional design process that includes audience analysis and guidelines for message development.
This process builds on one introduced by Visser & Keller (1990) in a situation where there were severe motivational
challenges. That class was small in size and conducted on-site where the participants were employed. The content of
the messages pertained to the four motivational categories defined within the ARCS model (attention, relevance,
confidence, and satisfaction) (Keller 1987) and there were three types of messages. The first type was preplanned
based on a-priori analyses of anticipated motivational problems, the second was sent to the whole class based on
unexpected events, and the third was individually distributed based on specific individual problems.
The setting of this study is different from Visser & Keller's: the instructor has a general knowledge of
the motivational challenges faced by the students, but she is not likely to have a close relationship with the students
or personal knowledge of events in their lives that might adversely affect their studies, and she is not able to
personally distribute messages. In addition, the messages distributed via email are somewhat impersonal compared
with those in the previous study; however, considering the widespread use of this medium, students might view such
messages
as a type of personal attention.
Therefore, it is necessary to determine whether a technique similar to Visser & Keller's for addressing students'
motivational problems can be useful in a large undergraduate course; that is, it is necessary to examine whether
personal messages addressing specific individual problems, in addition to the motivational and volitional messages
created to address general motivational and volitional problems, can be useful in such a course.
Thus, the purpose of this study is to investigate whether the motivational and volitional messages
constructed based on the four theories and methods and distributed via email with personal messages created based
on audience analysis can be used to promote students’ motivation, study habits and achievement.
Research questions
In this study, the following research questions were addressed: Can the motivational and volitional
messages constructed based on the four theories and methods and distributed via email with personal messages
created based on audience analysis result in:
1. a positive effect on participants' motivation?
2. a positive effect on participants' study habits (study time)?
3. a positive effect on participants' achievement (grade)?
Method
Participants
The sample consisted of 101 undergraduate students enrolled in an archeology course at a Southeastern
public university. The participation was voluntary so that 50 students among the total enrollment of 151 chose not to
participate. Some of the participants were interested in majoring in this area and others were taking it as a general
education requirement. The submission of logbooks answering survey questions was a required activity and they
received credit for class participation.
The participants were assigned to one of two groups: one (Personal Message Group: PMG) received the
motivational and volitional messages with personal messages, and the other (Non Personal Message Group:
NonPMG) received the motivational and volitional messages without personal messages. The participants who
indicated a low level of satisfaction with their grades in a survey following the second test in the course were
assigned to the former; their satisfaction levels were extremely unsatisfied, moderately unsatisfied, or moderately
satisfied. The participants who indicated a high level of satisfaction with their grade on the most recent test in the
pre-survey were assigned to the latter; their satisfaction levels were very satisfied or extremely satisfied.
Materials
Pre-survey. The pre-survey assessed two dependent variables: motivation with the course and study time.
In addition, it included a scale assessing the participants’ satisfaction with their grade of the most recent test.
Post-survey. The post-survey which was administered after the third test assessed the same dependent
variables in the pre-survey.
Motivational and volitional messages. The motivational and volitional messages (see Keller, Deimann &
Liu, 2005) distributed via email were constructed based on the integrated model of four theories and methods, which
were Keller’s ARCS model (Keller, 2004), Kuhl’s action control theory (Kuhl, 1987), Rubicon model of motivation
and volition (Gollwitzer, 1999), and Visser & Keller’s strategy of motivational messages (Visser & Keller, 1990). In
addition, the motivational and volitional messages sent to Personal Message Group (PMG) included the personal
messages created based on individual audience analysis (See Figure 1).
Measures
The following dependent variable measures were used:
Motivation with the course: three 5-point Likert scale items (interest, relevance, and confidence); reliability for
these scales as assessed by Cronbach's alpha was > .70
Satisfaction with grade: one 5-point Likert scale item
Study habits (study time): the total study hours in the week before or after receiving the motivational and
volitional messages
Achievement (grade): a twelve-point scale from A (12) to F (1)
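To make the coding of these measures concrete, the sketch below scores one hypothetical participant; the grade-to-
point mapping for the intermediate grades and all field names are assumptions made for illustration, since the paper
specifies only the endpoints A (12) and F (1).

    # Hypothetical coding of one participant's survey responses
    GRADE_POINTS = {"A": 12, "A-": 11, "B+": 10, "B": 9, "B-": 8, "C+": 7,
                    "C": 6, "C-": 5, "D+": 4, "D": 3, "D-": 2, "F": 1}

    response = {"interest": 4, "relevance": 3, "confidence": 2,   # 5-point Likert items
                "satisfaction": 2,                                # 5-point Likert item
                "study_hours": 5,                                 # total study hours for the week
                "grade": "C+"}                                    # most recent test grade

    motivation = (response["interest"] + response["relevance"] + response["confidence"]) / 3
    achievement = GRADE_POINTS[response["grade"]]
    print(motivation, achievement)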
Procedure
The pre-survey and the post-survey were conducted before and after sending the motivational and volitional
messages, respectively. Each survey took approximately five minutes to complete. The motivational and volitional
messages were sent to the Personal Message Group (PMG: n=30) with personal messages and to the Non Personal
Message Group (NonPMG: n=71) without personal messages between the pre-survey (after the second test of the
semester) and the post-survey (after the third test of the semester).
The annotations in Figure 1 identify the strategies embedded in the message. Attention: calling the student's name
raises her arousal and curiosity and directly relates the message to her. Relevance: information based on the
individual audience analysis personalizes the message and makes it directly related to her. Confidence: two passages
encourage her to believe she can achieve her goal. Volitional strategies: one passage informs her that she needs
strategies to control her environment and to make decisions about maintaining action.

From: [email protected]
Sent: Wednesday, March 09, 2005 12:22 PM
To: [email protected]
Subject: [ANT3141-01.sp05_research]

Dear Jamie,

In the previous logbook you said that you were not completely satisfied with your grade on Test 2, and I also noticed
that you want to earn higher grades than you have earned so far. I have some suggestions, in the form of Study Tips
and messages such as this one, that might help you raise your grades on the two remaining tests.

Recently, a group of students in the class received these Study Tips in attachments, and a small group of students
used them. The grades went up for eighty-two percent of the students who used the Tips, and their average
improvement was two-thirds of a grade (for example, from a C to a B-). Some went up more, some less, but the
evidence tells us that these Tips can be very helpful. In contrast, the overall class average on the second test was the
same as the first one.

If you would like to take advantage of this opportunity, here is what to do.
1. Go to Blackboard (http://campus.fsu.edu) and open the course titled: WORLD PREHISTORY ANNEX
[ANT3141-01.sp05_research].
2. Click on "Study Tips" in the menu on the left.
3. Open the "Stages of Learning" file and read it carefully to get a better understanding of the process that one goes
through to establish effective learning habits.
4. Then, open the "Making a Plan that Works" file. This is an extremely important file! Read it carefully and make
the kind of plan that is described there. Also, pay careful attention to the parts about how to deal with distractions
and procrastination. And, notice that these tips require COMMITMENT and EFFORT from you. I cannot, of course,
guarantee that your grade will go up, but you can have confidence that if you do the things that are described here
and in other strategies that I will tell you about, it is highly probable that you will increase your grade.
5. After you do this planning, I would appreciate it if you give me a quick reply to this message to let me know if you
are finished.

I realize that you might not even see this until after spring break. That is okay. There is still time to use the Study
Tips effectively to improve your grade. After a few days, or as soon as you reply to this message, I will send another
one that describes the next set of steps to follow.

If you do not want me to send you any more messages, just send me a reply in which you ask me to stop.

Sincerely,
John Keller
Figure 1. One example email with personal message (The student’s name is a pseudonym.)
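The personal portion of a message like the one in Figure 1 was written from an audience analysis of each student's
logbook responses. The snippet below is only a schematic illustration of how such a message could be assembled
from a student record; the template wording and field names are invented and are not the study's actual materials.

    # Hypothetical student record produced by the audience analysis
    student = {"first_name": "Jamie",
               "satisfaction": "not completely satisfied",
               "last_test": "Test 2"}

    TEMPLATE = ("Dear {first_name},\n\n"
                "In the previous logbook you said that you were {satisfaction} with your "
                "grade on {last_test}. I have some suggestions, in the form of Study Tips "
                "and messages such as this one, that might help you raise your grades on "
                "the remaining tests.\n")

    personal_message = TEMPLATE.format(**student)
    print(personal_message)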
Results
Table 1 presents the means and standard deviations for each condition. It is followed by the results for the
three research questions.
Table 1. Means and standard deviations for the dependent variables
1. Can the motivational and volitional messages constructed based on the four theories and methods and
distributed via email with personal messages created based on audience analysis result in a positive effect on
participants’ motivation?
The MANOVA for motivation indicated that there was an overall effect of the motivational and volitional
messages on motivation, Wilks' Lambda = .913, F(1, 99) = 3.089, p < .05. Univariate results revealed a main
effect of the messages on motivation, in which those who received the messages with personal messages reported
significantly more confidence (M = 2.56, SD = 0.71) than those who received the messages without personal
messages (M = 2.03, SD = 0.90), F(1, 99) = 8.051, MSE = 5.787, p < .05, η2 = .075.
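For readers who wish to run this kind of multivariate test outside SPSS, a minimal sketch using statsmodels is shown
below; the data frame, the group sizes, and the column names standing in for the three motivation scales are
fabricated for illustration, not drawn from the study's data.

    import numpy as np
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    rng = np.random.default_rng(3)
    df = pd.DataFrame({
        "group": ["PMG"] * 30 + ["NonPMG"] * 71,
        "interest": rng.normal(3.0, 0.8, 101),
        "relevance": rng.normal(3.2, 0.8, 101),
        "confidence": rng.normal(2.3, 0.9, 101),
    })

    # One-way MANOVA of the three motivation scales on group membership
    manova = MANOVA.from_formula("interest + relevance + confidence ~ group", data=df)
    print(manova.mv_test())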
2. Can the motivational and volitional messages constructed based on the four theories and methods and
distributed via email with personal messages created based on audience analysis result in a positive effect on
participants’ study habits (study time)?
The ANOVA for study time revealed that there was no significant difference in study habits between those
who received the messages with personal messages (M = 3.61, SD = 1.50) and those who received the messages
without personal messages (M = 3.90, SD = 1.91), F(1, 99) = .506, MSE = 1.649 , p > .05.
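With only two groups, the one-way ANOVA on study time is equivalent to a t-test; a brief sketch with fabricated
weekly study-hour data of the same group sizes follows, purely as an illustration of the procedure.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    pmg_hours = rng.normal(3.6, 1.5, size=30)      # fabricated weekly study hours, PMG
    nonpmg_hours = rng.normal(3.9, 1.9, size=71)   # fabricated weekly study hours, NonPMG

    f_value, p_value = stats.f_oneway(pmg_hours, nonpmg_hours)
    print(f"F(1, {len(pmg_hours) + len(nonpmg_hours) - 2}) = {f_value:.2f}, p = {p_value:.3f}")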
3. Can the motivational and volitional messages constructed based on the four theories and methods and
distributed via email with personal messages created based on audience analysis result in a positive effect on
participants’ achievement (grade)?
A 2 × 2 repeated measures ANOVA on the test data before and after the messages were sent showed a
significant interaction between the two factors of time and intervention method [F(1, 99) = 5.355, MSE = 18.883, p
< .05, η2 = .051]. This result indicated that the effect of time (before versus after the personal messages) on students'
test scores depended on whether they were in PMG or NonPMG. In other words, the test scores of PMG and
NonPMG differed before and after the personal messages such that the means for NonPMG were always higher than
those for PMG (see Table 1), but the means of the two groups moved closer together after the personal messages.
The significant interaction occurred because the mean of the NonPMG grades decreased while the mean of the PMG
group increased (see Figure 2).
Figure 2. Estimated marginal means of test scores for PMG and NonPMG before and after the personal messages.
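The 2 × 2 analysis reported above crosses a between-subjects factor (PMG vs. NonPMG) with a within-subjects
factor (test score before vs. after the messages). One way to run such a mixed design in Python is sketched below,
assuming the pingouin package is available; the long-format data frame is fabricated so that the group-by-time
pattern roughly resembles Figure 2, and the column names are invented.

    import numpy as np
    import pandas as pd
    import pingouin as pg

    rng = np.random.default_rng(5)
    n_pmg, n_non = 30, 71
    records = []
    for i in range(n_pmg + n_non):
        group = "PMG" if i < n_pmg else "NonPMG"
        before = rng.normal(7.5 if group == "PMG" else 9.5, 1.5)
        change = rng.normal(1.0 if group == "PMG" else -0.5, 1.0)
        records.append({"id": i, "group": group, "time": "before", "score": before})
        records.append({"id": i, "group": group, "time": "after", "score": before + change})
    long_df = pd.DataFrame(records)

    # Mixed ANOVA: time (within subjects) x group (between subjects); the interaction row
    # corresponds to the time-by-intervention effect discussed in the text
    print(pg.mixed_anova(data=long_df, dv="score", within="time",
                         between="group", subject="id"))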
Discussion
The results indicate that participants who received the motivational and volitional messages with personal
messages showed greater increases in motivation, especially in regard to confidence, and in achievement than
those who received the motivational and volitional messages without personal messages. These findings suggest that
the motivational and volitional messages constructed based on the four theories and methods and distributed via
email with personal messages addressing specific individual problems can be a useful support for improving
situations where there are threats to motivation.
Increased motivation might have resulted from the personal messages in which words and sentences
concerning participants' attention, relevance, and confidence were embedded (see Figure 1). In particular, the greater
increase in confidence than in interest or relevance might be explained by the fact that more parts of the messages
facilitated confidence than the other categories.
With regard to achievement, the initial differences in test scores between the two groups significantly
decreased (see Figure 2). This pattern of development in achievement shows the direction that researchers often
hope to observe when conducting interventions that are purposely designed to enhance motivation.
Contrary to what was found for motivation and achievement, there was no positive effect of the personal
messages with embedded motivational and volitional elements on study habits. Perhaps the messages did not
significantly impact study habits because the participants did not have enough time to utilize the suggested volitional
strategies. In other words, the motivational and volitional messages with personal messages were given for only one
month between the two tests. During this time period, the participants might not have successfully moved from
the stage of "commitment" to the stage of "formation of implementation intention," so they might not have been
ready to move from the "pre-actional phase" to the "actional phase" (Gollwitzer, 1999). Positive effects on study
habits might have been found if the personal messages had been provided consistently throughout the whole
semester.
In conclusion, future research could include long-term use of personal messages along with the
motivational and volitional messages and, in particular, determine whether students' study habits improve.
Future research could also have the instructor of a course construct and send the personal messages with
assistance from the researcher who designs the motivational and volitional messages. In that case, the
findings might be stronger because students would perceive the messages as more personal when they are sent by
the instructor with whom they interact in class rather than by a researcher with whom they are not familiar.
Nevertheless, this study gives preliminary evidence that personal messages created according to individual
audience analysis can raise the positive effects of the motivational and volitional messages constructed based on the
four theories and methods.
References
Gollwitzer, P. M. (1999). Implementation Intentions. Strong Effects of Simple Plans. American Psychologist, 54(7),
493-503.
Keller, J. M., Deimann, M. & Liu, Z. (2005). Effects of integrated motivational and volitional tactics on study
habits, attitudes, and performance, Paper presented at AECT Conference, Orlando, FL.
Keller, J. M. (2004). A predictive model of motivation, volition, and multimedia learning. In Proceedings of the
International Symposium & Conference, Educational Media in Schools (pp. 9-19). Osaka, Japan: Kansai University.
Keller, J. M. (1987). Development and use of the ARCS model of motivational design. Journal of Instructional
Development, 10(3), 2 - 10.
Kuhl, J. (1987). Action control: The maintenance of motivational states. In F. Halisch & J. Kuhl (Eds.), Motivation,
Intention and Volition (pp. 279-291). Berlin: Springer.
McCann, E. J., & Turner, J. E. (2004). Increasing student learning through volitional control. Teachers College
Record, 106(9), 1695-1714.
Visser, J., & Keller, J. M. (1990). The clinical use of motivational messages: An inquiry into the validity of the
ARCS model of motivational design. Instructional Science, 19(6), 467-500.
The Relationship Between Preservice Teachers’ Perceptions of Faculty
Modeling of Computer-Based Technology and Their Intent to Use Computer-
Based Technology
Kioh Kim
Landra Rezabek
Guy Westhoff
John Cochenour
University of Wyoming
Abstract
Based on Bandura’s (1997) social learning theory, the purpose of this study is to identify the relationship of
preservice teachers’ perceptions of faculty modeling of computer-based technology and preservice teachers’ intent
of using computer-based technology in educational settings. There were 92 participants in this study; they were
enrolled in “Teaching with Microcomputers” class at a major university in Rocky Mountains.
Two survey instruments were used in this study. The first instrument was Preservice Teachers’ Perceptions
of Faculty Modeling Survey (PTPFMS). The second instrument was the Intent to Use Computer-based Technology
Survey (ITUCTS). The results showed that preservice teachers' perceptions of faculty modeling of computer-based
technology significantly affected their intent to use computer-based technology; results were similar for the use
dimension and its sub-dimensions, but for the dimension of role of technology and its sub-dimensions the
relationship was not significant. The paper concludes by stating the limitations and implications of this study.
Introduction
Will The Matrix trilogy come true? We have not even completed the first decade of the 21st century, and
advancements in computer-based technology are so great that we rely on it more than any other species on this
planet does. Like every sword, it has two edges, one good and one bad. One good application of computer-based
technology is within educational settings. Using computer-based technology in educational settings helps students
in their learning (Sahin, 2003; Stinson, 2003; Whetstone & Carr-Chellman, 2001). There are studies that indicate
learners have positive attitudes towards using technologies in their classroom (Kurubacak & Baptiste, 2002; Lee,
1996; Norby, 2002; Okinaka, 1992). In addition, teachers also improve their instruction by using a variety of
technology resources such as the Internet, multimedia CD-ROMs, audio, and graphics (Jao, 2001). There is
evidence that suggests teaching with technology provides more benefits for both teachers and students than teaching
without any technology.
There has been a scarcity of research exploring the ways in which preservice teachers can be taught to
effectively integrate computer-based technology within their instruction. According to the National Center for
Education Statistics (2000), teacher preparation for technology integration is minimal, and in 1999 most teachers
reported feeling less than well prepared to use computers and the Internet for instruction. Thus, an appeal to amplify
attention to this topic in teacher preparation programs has been issued by numerous organizations including the
International Reading Association (2002), the National Council for the Accreditation of Teacher Education (2004),
and the U.S. Department of Education (1996).
“To realize any vision of smarter schooling by using technology… college education must prepare teachers
to use the technology. Adequate teacher preparation is probably the most important determinant of success”
(Hancock, & Betts, 1994, p. 29). To effectively integrate computer-based technology in their teaching practice, it is
pertinent that prospective teachers develop appropriate teaching styles which incorporate computers to impact
student learning. Teaching with computers requires a shift from the traditional teaching practice. “Technology
affects the way teachers teach, students learn, and administrators operate. Roles and teaching and learning strategies
are changing because technology fosters the use of more student-centered learning strategies” (Norum, Grabinger, &
Duffield, 1999, p. 189).
Teacher’s attitudes toward the use of technology can significantly affect their students’ opportunities to
learn about technology (Norby, 2002; Okinaka, 1992). In order to help K-12 students, training preservice teachers is
the most direct and cost-effective way (Fasion, 1996). Universities and colleges are the places to train preservice
teachers to comprehensively integrate instructional technology into their future classroom instruction. It is necessary
for preservice teachers to be trained using instructional technology so that they can use the technology skills and be
confident in using technology in their classroom as classroom teachers. There is a great concern about the
253
prospective teachers’ perception of the role of the computer in the learning process.
The literature shows that there is a need for better training for preservice teachers in integrating computer-
based technology while they teach. Can this gap in training be filled by proper modeling from the faculty who teach
preservice teachers? This study explores the relationship between preservice teachers' perceptions of faculty
modeling in the use of computer-based technology and preservice teachers' perceptions of their intent toward using
computer-based technology when they become teachers.
This research is based on social learning theory. Social learning theory, originated by Albert Bandura
(1977), emphasizes that learners learn by observing the modeling, attitudes, and emotional reactions of
their teachers. Therefore, this research argues that, when they become teachers in the future, preservice teachers are
going to use computer-based technology in ways similar to the ways their college/university instructors modeled
computer-based technology. Many researchers have noted that technology must be modeled by college/university
faculty if new inservice teachers are to use technology properly (Cassady & Pavlechko, 2000; Duhaney, 2001;
Krueger, Hansen, & Smaldino, 2000; Laffey & Musser, 1998; Luke, Moore, & Sawyer, 1998; Persichitte,
Caffarella, & Tharp, 1999; Schrum & Dehoney, 1998; Stetson & Bagwell, 1999; Wetzel, Zambo, & Buss, 1996;
Yidirim, 2000).
Research literature indicates that many professors use computers to teach in their classrooms
(Carlson & Gooden, 1999; Frey & Birnbaum, 2002; Nelson, 2004; Simmons & Macchia, 2003). The computer-
based technologies that professors use include word processing, database, spreadsheet, desktop publishing,
presentation software, World Wide Web, and email. All the computer-based technologies mentioned above should
be used in K-12 schools by teachers (Nelson, 2004). In order for teachers to use computer-based technology
effectively in their classroom, preservice teachers should be trained in how to use computer-based technology while
they are in college/university courses. These courses provide a model of what computer-based technology their
college/university instructors used within their teaching. It is these models that preservice teachers use when they
become teachers in the future.
The instructional modeling done by faculty provides the foundation from which preservice teachers use these same
or similar teaching models when they become teachers (Lever-Duffy, McDonald, & Mizell, 2005). In order for
preservice teachers to be comfortable in using computer-based technology as future inservice teachers, university
and college instructors should model computer-based technology in their teaching.
Current research identifies that “good technology mentoring is only achieved through role modeling,
ongoing evaluation, constructive criticism, and coaching” (Carlson & Gooden, 1999, p. 12). In another case,
teachers modeled the use of PowerPoint and the Internet through a Preparing Teachers to Use Tomorrow’s
Technology (PT3) grant (Simmons & Macchia, 2003). The preservice teachers who saw professors modeling
PowerPoint and the Internet are now making the effort to utilize various instructional technologies to support class
projects within their classrooms (Simmons & Macchia, 2003).
In one case, K-12 teachers with less experience in using technology in their own teaching began to use
technology after observing more experienced teachers use technology (Mills & Tincher, 2002). In another study,
modeling technology as a professional development model in technology integration showed that there were
changes in preservice teacher beliefs and practices (Ross, Ertmer, & Johnson, 2001). Therefore, modeling the use of
many types of hardware and software is the primary method for modeling technology use for preservice teachers.
Research Question
Given this research problem, the guiding question of the study is: Do preservice teachers’ perceptions of
faculty modeling in the use of computer-based technology have any relationship with the preservice teachers’
perceptions of their intent toward using computer-based technology?
Based on the literature review, it is hypothesized that subjects’ scores on intent to use computer-based
technology survey can be predicted by their scores on preservice teachers’ perceptions of faculty modeling.
Methodology
This research aims to identify the relation between preservice teachers’ perceptions of faculty modeling of
using computer-based technology and preservice teachers’ intent to use computer-based technology when they
become teachers. In order to collect data from the participants, quantitative procedures were used. This section
includes a description of the sample, pilot study, data collection procedures, instrumentation, and data analysis
methods.
Sample Description
The participants in this study were preservice teachers who were enrolled in “Teaching with
Microcomputers" class at a major university in the Rocky Mountains.
The course has five sections with a total of 100 students and is a required instructional
technology course for education majors. Since the participants were taking the course on campus, the
researcher requested permission from the instructors and then administered the survey in each classroom for all five
sections. Overall, 92 students participated in the study, of whom 62 were female and 30 were male; 43 students had
elementary education and 49 students had secondary education as their major. The ages of participants ranged from
18 to 62 years.
Procedure
Data were collected in “Teaching with Microcomputers” course which was taught by three different
instructors. All three instructors of the course allowed the researcher to be in their classrooms for collecting data
from the students enrolled in the course. The researcher distributed the questionnaires to each section during the
same week of the semester. The participation of subjects was voluntary in nature.
Instruments
For the study, two survey instruments were utilized to collect the data from the participants. They are
described as follows:
a) Preservice Teachers’ Perceptions of Faculty Modeling Survey (PTPFMS). The first instrument to be
used was the Preservice Teachers’ Perceptions of Faculty Modeling Survey (PTPFMS). This PTPFMS was used to
measure preservice teachers’ perceptions of their university instructors’ modeling of using computer-based
technology in their classroom. This instrument was created by the researcher. In the pilot study the overall reliability
of the PTPFMS was found to be 0.92. The PTPFMS instrument used a Likert scale from (1) Never to (5) Always and
consisted of 24 questions divided into two main sections: use of computer-based technology and role of the
instructor. Each of the two sections is divided into two foci, with six questions being student-centered and six
teacher-centered. The PTPFMS includes three demographic items pertaining to the participants' gender, age, and
major. Major includes two categories: elementary education and secondary education.
b) Intent to Use Computer-based Technology Survey (ITUCTS): The second instrument was the Intent to
Use Computer-based Technology Survey (ITUCTS). ITUCTS was adopted from the writings of Bichelmeyer,
Reinhart, and Monson (1998) and Wang (2001).
The ITUCTS instrument is divided into two sections, each section has 12 questions. The first section
addresses the preservice teachers’ perceptions of their future role in a classroom equipped with computer-based
technology (Role). Role of the teacher in the classroom was defined as the manner or style in which the teacher
engages during classroom instruction, on a spectrum from the teacher as an authority figure (Teacher-Centered
Role) to the teacher as a learning facilitator (Student-Centered Role). The second section addresses the preservice
teachers’ perceptions of how they will use computer-based technology specifically when placed in a computer-based
technology enhanced classroom (Use). Use of computer-based technologies in the classroom is defined as either the
use of computer-based technology by the students for learning activities (Student-Centered Use) or use of computer-
based technology by the teacher in ways that enable the teacher to more easily manage his or her classroom and
instruction (Teacher-Centered Use). Both the sections used a Likert scale from (1) Never to (5) Frequently with 12
questions in each section.
The reliability of the section measuring teacher-centered role is .94, the section measuring student-centered
role is .93, the section measuring teacher-centered computer use is .86, and the section measuring student-centered
computer use is .93 (Wang, 2001). The overall reliability of this questionnaire in this study was found to be .83 and
the reliabilities on the sub-scales were found to be similar to the study of Wang (2001).
Data Analysis
Data from the PTPFMS and ITUCTS were organized and analyzed in SPSS 11.5. This study
used regression analysis to determine the relationship between the PTPFMS and the four dimensions of the ITUCTS.
The independent variables were the four dimensions of preservice teachers' perceptions of faculty modeling of
computer-based technology use, gender, age, and major. The dependent variables were the four dimensions of the
preservice teachers' intent to use computer-based technology survey. The analysis was also conducted on the overall
scores of preservice teachers' perceptions of faculty modeling of computer-based technology use and preservice
teachers' intent to use computer-based technology.
In the pilot study, none of the relationships were significant at the 0.05 level. Therefore, in order to detect
more of the relationships, the 0.10 level was used when analyzing the results of the main study; results significant
at the 0.05 level are also reported.
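The regression model described here can also be expressed compactly outside SPSS. The sketch below assumes a
data frame with one row per preservice teacher; the column names for the overall PTPFM and ITUCT scores and
the fabricated values are illustrative assumptions, so the sketch shows the model form rather than the study's actual
analysis.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    n = 92
    df = pd.DataFrame({
        "ptpfm_total": rng.normal(71, 10, n),                        # hypothetical overall PTPFM scores
        "gender": rng.choice(["female", "male"], n, p=[0.67, 0.33]),
        "age": rng.integers(18, 63, n),
        "major": rng.choice(["elementary", "secondary"], n),
    })
    df["ituct_total"] = 40 + 0.6 * df["ptpfm_total"] + rng.normal(0, 8, n)  # fabricated outcome

    # Overall ITUCT score regressed on overall PTPFM score plus the demographic predictors
    model = smf.ols("ituct_total ~ ptpfm_total + C(gender) + age + C(major)", data=df).fit()
    print(model.summary())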
Results
Table 1 below shows the means of the 92 participants on all four dimensions of Preservice Teachers'
Perceptions of Faculty Modeling (PTPFM) and the four dimensions of Intent to Use Computer-based Technology
(ITUCT).
Table 1. Means of participants on the four dimensions of PTPFM and ITUCT
           Teacher-centered Role   Teacher-centered Use   Student-centered Role   Student-centered Use
PTPFM      16.84                   19.82                  16.28                   18.11
ITUCT      23.03                   23.86                  21.38                   19.52
Further regression analyses were conducted to evaluate the relationship between each of the four dimensions of
Preservice Teachers' Perceptions of Faculty Modeling and its corresponding dimension of Intent to Use Computer-
based Technology.
Relation between Preservice Teachers' Perceptions of Faculty Modeling and Intent to Use Computer-based
Technology
Analysis of the data showed that overall scores on the Preservice Teachers' Perceptions of Faculty Modeling of
Computer-based Technology Survey significantly predicted subjects' overall scores on Intent to Use Computer-based
Technology (Table 2).
Table 2 Relation between Preservice Teachers’ Perceptions of Faculty Modeling and Intent to Use Computer-based
Technology
Analysis of the best fitting line when data were entered graphically showed that as subjects’ score on Preservice
Teachers’ Perceptions of Faculty Modeling Survey increased, their score on Intent to Use Computer-based
Technology Survey also increased (Figure 1).
Relation between Preservice Teachers’ Perceptions of Faculty Modeling (Use) and Intent to Use Computer-based
Technology (Use)
Analysis of data showed that overall scores on use of Preservice Teachers’ Perception of Faculty Modeling
of Computer-based Technology Survey significantly predicted subjects' overall scores on use of Intent to Use
Computer-based Technology (Table 3).
Table 3 Relation between Preservice Teachers’ Perceptions of Faculty Modeling (Use) and Intent to Use Computer-
based Technology (Use)
Analysis of the best fitting line when data were entered graphically showed that as subjects’ score on
overall use of Preservice Teachers’ Perceptions of Faculty Modeling Survey increased, their score on overall use of
Intent to Use Computer-based Technology Survey also increased (Figure 2).
Relation between Preservice Teachers’ Perceptions of Faculty Modeling (Teacher-centered Use) and Intent to Use
Computer-based Technology (Teacher-centered Use)
Analysis of data showed that overall scores on teacher-centered use of Preservice Teachers’ Perception of
Faculty Modeling of Computer-based Technology Survey significantly predicted subjects' overall scores on teacher-
centered use of Intent to Use Computer-based Technology (Table 4).
Table 4 Relation between Preservice Teachers’ Perceptions of Faculty Modeling (Teacher-centered Use) and Intent
to Use Computer-based Technology (Teacher-centered Use)
Analysis of the best fitting line when data were entered graphically showed that as subjects’ score on
overall teacher-centered use of Preservice Teachers’ Perceptions of Faculty Modeling Survey increased, their score
on overall teacher-centered use of Intent to Use Computer-based Technology Survey also increased (Figure 3).
Relation between Preservice Teachers’ Perceptions of Faculty Modeling (Student-centered Use) and Intent to Use
Computer-based Technology (Student-centered Use)
Analysis of data showed that overall scores on student-centered use of Preservice Teachers’ Perception of
Faculty Modeling of Computer-based Technology Survey significantly predicted subjects' overall scores on student-
centered use of Intent to Use Computer-based Technology (Table 5).
Table 5 Relation between Preservice Teachers’ Perceptions of Faculty Modeling (Student-centered Use) and Intent
to Use Computer-based Technology (Student-centered Use)
Analysis of the best fitting line when data were entered graphically showed that as subjects’ score on
overall student-centered use of Preservice Teachers’ Perceptions of Faculty Modeling Survey increased, score on
overall student-centered use of Intent to Use Computer-based Technology Survey also increased (Figure 4).
Relation between Preservice Teachers’ Perceptions of Faculty Modeling (Role) and Intent to Use Computer-based
Technology (Role)
Analysis of the data showed that Preservice Teachers' Perceptions of Faculty Modeling of the role of computer-
based technology for delivering course information did not significantly predict subjects' scores on Intent to Use
Computer-based Technology based on its role for delivering course information; results were similar for both the
teacher-centered role and the student-centered role.
Discussion
Students today have experienced much technological advancement and are accustomed to the visual
stimulation of television, computers, and video games. Hence, they expect technology to be used effectively as part
of their learning experience. Many studies have shown that using computer-based technology in educational settings
helps students in their learning (Sahin, 2003; Stinson, 2003; Whetstone, & Carr-Chellman, 2001). So it is pertinent
for preservice teachers to effectively learn to integrate computer-based technology in real-life teaching scenarios.
How can they experience such a learning process in their training? This study analyzes the relationship of
preservice teachers’ perception of faculty modeling of computer-based technology with their intent to use computer-
based technology when they become teachers.
Conclusion
Inspired by research showing that teaching with technology provides more benefits for both
teachers and students than teaching without any technology (Sahin, 2003; Stinson, 2003; Whetstone & Carr-
Chellman, 2001) and that teachers can improve their instruction by using a variety of technology resources such as
the Internet, multimedia CD-ROMs, audio, and graphics (Jao, 2001), this research explores the relationship of
preservice teachers' perceptions of faculty modeling of computer-based technology use with their intent to use
computer-based technology in educational settings.
Universities and colleges are the places to train preservice teachers to comprehensively integrate
instructional technology into their future classroom instruction. This research, based on Bandura’s (1977) social
learning theory, hypothesized that preservice teachers’ perceptions of faculty modeling of computer-based technology use would affect their intent to use computer-based technology in educational settings when they become teachers.
The results showed that preservice teachers’ perceptions of faculty modeling of computer-based technology significantly affected their intent to use computer-based technology; results were similar for the use dimension and its sub-dimensions, but for the role dimension and its sub-dimensions the relationship was not significant.
Limitations
This study has some limitations as follows:
First, the sample in this study is limited to one specific course at one specific university. Hence, these results cannot be generalized because the sample is not representative. Second, there is limited research on the relationship among Preservice Teachers’ Perceptions of Faculty Modeling, gender, age, major, and Intent to Use Computer-based Technology, although there is research in the literature that examines each of these five variables individually. It is therefore difficult to evaluate the results of this research in light of earlier work.
The participants of this study were enrolled in other courses simultaneously, so modeling by the faculty of those courses may have influenced their scores on the Preservice Teachers’ Perceptions of Faculty Modeling and Intent to Use Computer-based Technology surveys; their previous experience with computer-based technology may also have influenced their scores on those surveys.
In the future, researchers may improve on and add to these results by using a more representative sample and conducting the research in a more controlled setting.
Implications
Over the course of the last decade, technology has been gaining importance in teacher education programs, but most programs still have a way to go before they can adequately prepare their graduates to use technology to its fullest potential in their teaching and administrative activities (Moore, Knuth, Borse, & Mitchell, 1999). This research shows the importance of preservice teachers’ perceptions of faculty modeling of computer-based technology in influencing their intent to use computer-based technology. The study is significant because college-level instructors must be competent users of computer-based technologies in order to influence the full development of the preservice teachers who use them as role models. The assessment of the competencies of preservice teachers’ instructors should therefore be authentic and should indicate whether the competencies instructors possess are adequate to support the vision of learning in actual classroom settings.
Figures
Figure 1. Relation between Preservice Teachers’ Perceptions of Faculty Modeling and Intent to Use Computer-based
Technology
Figure 2. Relation between Preservice Teachers’ Perceptions of Faculty Modeling (Use) and Intent to Use
Computer-based Technology (Use)
Figure 3. Relation between Preservice Teachers’ Perceptions of Faculty Modeling (Teacher-centered Use) and Intent
to Use Computer-based Technology (Teacher-centered Use)
Figure 4. Relation between Preservice Teachers’ Perceptions of Faculty Modeling (Student-centered Use) and Intent
to Use Computer-based Technology (Student-centered Use)
References
Bandura, A. (1977). Social learning theory. New York: General Learning Press.
Bichelmeyer, B. A., Reinhart, J. M., & Monson, J. (1998, February). Teachers’ perceptions of teacher role in the
information age classroom. Paper presented at the 1998 National Convention of the Association for
Educational Communications and Technology, St. Louis, MO.
Carlson, R. D., & Gooden, J. S. (1999). Are teacher preparation programs modeling technology use for pre-service
teachers? ERS Spectrum, 17(3), 11-15.
Cassady, J., & Pavlechko, G. (2000). Does technology make a difference in preservice teacher education? Paper
presented at the International Conference on Learning with Technology, Philadelphia.
Downes, T. (1993). Student-teachers’ experiences in using computers during teaching practice. Journal of Computer
Assisted Learning, 9, 17-33.
Duhaney, D. C. (2001). Teacher education: preparing teacher to integrate technology. International Journal of
Instructional Media, 28(1), 23-30.
Faison, C. L. (1996). Modeling instructional technology use in teacher preparation: why we can’t wait. Educational
Technology, 36(5), 57-59.
Frey, B. A., & Birnbaum, D. J. (2002). Learners’ perceptions on the value of powerpoint in lectures. PA, U.S.
Hancock, V., & Betts, F. (1994). From the lagging to the leading edge, Educational Leadership, 51(7), 24-29.
International Society of Technology Education (ISTE) Accreditation Committee (1992). Curriculum Guidelines for
Accreditation of Education Computing and Technology Programs. Eugene, OR: ISTE.
International Society of Technology Education National Educational Technology Standards (2000). Retrieved
August 24, 2004, from http://cnets.iste.org/teachers/t_profiles.html
International Reading Association (2002). Retrieved January 5, 2005, from http://www.reading.org
Jao, F. (2001). An investigation of preservice teachers’ attitudes and confidence levels toward educational
technology standards and selected instructional software applications, The University of Toledo.
Koohang, A. A. (1989). A study of attitudes toward computers: Anxiety confidence, liking, and perception of
usefulness. Journal of Research on Computing in Education, 22(2), 137-150.
Kruger, K., Hansen, L., & Smaldino, S. E. (2000). Preservice teacher technology competencies. TechTrends, 44(3),
47-50.
Kurubacak, G., & Baptiste, H. P. (2002). Creating a virtual community with PT3: College of education students’
beliefs, expectations and attitudes toward online learning. World Conference on Educational Multimedia,
Hypermedia & Telecommunications. Denver, CO.
Laffey, M., & Musser, D. (1998). Attitudes of preservice teachers about using technology in teaching, Journal of
Technology and Teacher Education, 6(4), 223-241.
Lee, L. S. (1996). Problem-solving as intent and content of technology education. Paper presented at the Annual
Meeting of the International Technology Education Association, Phoenix, AZ.
Lever-Duffy, J., McDonald, J. B., & Mizell, A. P. (2005). Teaching and learning with technology, (2nd ed.). Pearson
Education, Inc., Boston, MA.
Luke, N., Moore, J. L., & Sawyer, S. B. (1998). Authentic approaches to encourage technology-using teachers.
Paper presented at the Site ’98: Society for Information Technology & Teacher Education International
Conference, Washington D. C.
Mills, S. C., & Tincher, R. C. (2002). Be the technology: Redefining technology integration in classrooms. Paper
presented at the National Educational Computing Conference, San Antonio, TX.
Moore, J., Knuth, R., Borse, J., & Mitchell M. (1999). Teacher technology competencies: Early indicators and
benchmarks. Paper presented at Society for Information Technology & Teacher Education International
Conference, San Antonio, TX.
Mower-Popiel, E., Pollard, C., & Pollard, R. (1994). An analysis of the perceptions of preservice teachers toward
technology and its use in the classroom. Journal of Instructional Psychology, 21(2), 131-138.
National Center for Education Statistics (2000). Teachers’ tools for the 21st century: A report on teachers’ use of
technology. Washington, DC: U.S. Department of Education.
National Council for the Accreditation of Teacher Education (2004), Technology and the new professional teacher:
Preparing for the 21st century classroom. Retrieved December 15, 2004, from
http://www.ncate.org/projects/tech/TECH.HTM
Nelson, E. (2004). Faculty technology survey. Retrieved November 9, 2004, from
http://www.csufresno.edu/ait/03fac-report.htm
Norby, R. F. (2002). A study of changes in attitude towards science in a technology based k-8 preservice preparation
science classroom. Paper presented at the Annual Meeting of the National Association for Research in
Science Teaching, New Orleans, LA.
Norum, K., Grabinger, R. S., & Duffield, A. J. (1999). Healing the universe is an inside job: Teachers’ views on
integrating technology, Journal of Technology and Teacher Education, 7(3), 187-203.
Okinaka, R. (1992). The factors that affect teacher attitude towards computer use. (Teaching and Teacher Education,
No. SP033765). CA, U.S.
Persichitte, K. A., Caffarella, E. P., & Tharp, D. D. (1999). Technology integration in teacher preparation: A
qualitative research study. Journal of Technology and Teacher Education, 7(3), 219-233.
Rezabek, L. L. (1987). Perceived credibility of female peer talent in the context of computer instruction.
Unpublished doctoral dissertation, University of Oklahoma, Norman.
Ross, E. M., Ertmer, P. A., & Johnson, T. E. (2001). Technology integration and innovative teaching practices: A
staff development model for facilitating change. Paper presented at the National Convention of the
Association for Educational Communications and Technology, Atlanta, GA.
Rutledge, K. (2000). Social learning theory. Ormond’s Psychology of Learning. Retrieved September 7, 2004, from
http://teachnet.edb.utexas.edu/~lynda_abbott/Social.html
Sahin, T. Y. (2003). Student teachers’ perceptions of instructional technology: Developing materials based on a
constructivist approach. British Journal of Educational Technology, 34(1), 67-74.
Savenye, W. (1992). Effects of an educational computing course on preservice teachers’ attitudes and anxiety
toward computers. Journal of Computing in Childhood Education, 3(1), 31-41.
Schrum. L., & Dehoney, J. (1998). Meeting the future: A teacher education program joins the information age.
Journal of Technology and Teacher Education, 6(1), 23-27.
Simmons, M. P., & Macchia, P. J. (2003). Strategies for modeling technology integration. Kappa Delta Pi Record,
39(3), 136-39.
Stetson, R. H., & Bagwell, T. (1999). Technology and teacher preparation: an oxymoron? Journal of Technology
and Teacher Education, 7(2), 145-152.
Stinson, A. D. (2003). Encouraging the use of technology in the classroom: The webquest connection. Reading
Online, 6(7).
Troutman, A. P. (1991). Attitudes toward personal and school use of computers. Paper presented at the Annual
Conference of the Eastern Educational Research Association, Boston, MA.
U.S. Department of Education. (1996). Getting America’s students ready for the 21st century: Meeting the
technology literacy challenge. Retrieved December 15, 2004, from
http://www.zuni.k12.nm.us/Ias/Tech/TLC/TOC.htm
Wang, Y. (2001). Student teachers’ perceptions and practice of the teachers’ role when teaching with computers.
Journal of Educational Computing Research, 24(4), 419-434
Wetzel, K., Zambo, R., & Buss, R. (1996). Innovations in integrating technology into student teaching experiences.
Journal of Research on Computing in Education, 29(Winter), 196-214.
Whetstone, L., & Carr-Chellman, A. A. (2001). Preparing preservice teachers to use technology: Survey results.
TechTrends, 46(4), 11-17.
Yildirim, S. (2000). Effects of an educational computing course on preservice and inservice teachers: A discussion
and analysis of attitudes and use. Journal of Research on Computing in Education, 32(4), 479-495.
Exploring the Vision of Technology Integration Research: Scholars’ Thoughts
on Definitions, Theories, and Methodologies
Tiffany A. Koszalka
Kerstin Mukerji
Syracuse University
Abstract
Scholarship in educational technology integration is diverse in definition, theory, and methodology. This
research is rich, complex, and contradictory. As a whole, the reliability, validity, and usefulness of such scholarship
are questionable. A panel of educational technology scholars will share insights and answer questions on this
research and how AECT may help address such issues. It is our hope that this dialogue will help inform future
directions of scholarship and practice in educational technology integration.
Introduction
"The use of technology for teaching and learning has evolved …we are just beginning
to comprehend the potential for technology to help students construct meaning for
themselves …and learn in multiple modalities and across multiple domains…” (Mills
& Tincher, 2003, p. 382)
The proliferation and diverse trends of educational technology integration in our society present many research opportunities for instructional designers. As a result, the scholarship on educational technology integration
is abundant in both research and practice literature. Such literature represents research conducted on multiple topics,
addressing a variety of questions in many different contexts, and using a variety of theoretical frameworks and
research methodologies. The field of educational and instructional technology research and development is, as
Richey (2000) describes, a “complex enterprise with a more complex knowledge base” (p. 16). Although rich, the
abundance of scholarship in technology integration makes it challenging to interpret findings that are often
conflicting and difficult, at best, to compare. A clearer understanding of, and direction for, technology integration
research as it applies to instructional design, implementation and evaluation is required.
Theoretical frameworks and research methodologies
Past, ongoing, and proposed studies in technology integration are based in many different theoretical
frameworks ranging from educational technologies within the context of cognitive development processes to more
holistic investigation of relationships among the elements present in technology integration activities (Blanton et al.,
2001; Choi & Jonassen, 2000; Hill & Hannafin, 1997; Jonassen & Rohrer-Murphy, 1999; Koszalka & Grabowski,
2003; Koszalka & Wu, 2004; Peal & Wilson, 2001). Researchers have also explored technology integration using
change and adoption of innovation models (Ely, 1990; Rogers, 1995). Research methodologies being employed
include qualitative, quantitative, or mixed methods using quasi-experimental, naturalistic observation, case studies,
meta-analyses or other protocols to gather data (Koszalka & Grabowski, 2003; Reigeluth, 2003). Such research
often yields recommendations for technology interventions and instructional design strategies for integrating such
technologies to enhance teaching and learning. Yet, the findings, outcomes, and recommendations from such
studies, although insightful and beneficial, are often complex and conflicting.
Initial responses from panel members to the definition question range in scope from views of technology integration as (i) a combination of understandings, behaviors, and attitudes surrounding what technologies do well and how technologies enhance teaching and learning, to (ii) the abilities to combine people, processes, and devices, to (iii) the creation of effective solutions to learning and instructional problems, to (iv) the processes and results of using technologies to support work that is done in the classroom. Other definitions suggest that technology integration is a measure of human-tool interaction within relevant instructional contexts, e.g., the dynamic relationships among subjects, objectives, community members, and tools to accomplish teaching or learning goals. Still other definitions include process characterizations that address access, training, and preparation to use technologies.
Theoretical frameworks and critical research questions are just as varied as the definitions of technology
integration. Common among the frameworks are change, adoption, stages, and systems theories. Others mentioned
by panelists include behavioral and constructivist approaches to teaching and learning. Some frameworks maintain specific and narrow theoretical foci, while others are holistic and systemic in their approach. These frameworks spawn
questions that aim at investigating the (i) characteristics, affordances, and flexibility of specific technologies; (ii)
effects of technologies on teaching and learning processes or stakeholders; (iii) differences in uses of technologies in
varying instructional settings; (iv) differences in the use of technologies by different stakeholders; (v) abilities of
technologies to facilitate learner’s attainments, support learners’ learning decision making, foster student learning,
and assess the level of student learning; and (vi) the ways in which technologies can mediate, afford, or disturb learning within formal and informal learning contexts. And there are many other questions currently under
investigation or recommended.
Such diverse frameworks and questions command diverse research methodologies, measures, and
instruments. Panelists suggest that both quantitative and qualitative measures are necessary. Specific research methods indicated by the panelists follow Concerns-Based Adoption, Diffusion of Innovations, design-based research (formative research), ethnography, and community narrative methodologies. Cultural Historical Activity Theory (CHAT) research methodologies, which focus on the interaction of human activity and human thought within its relevant environmental context, are also called for. Since it is assumed that learning is not a precursor to activity but rather emerges from it, this research approach examines the individual(s) involved in the activity and activity elements such as the product of the activity, mediating tools, community members, and guiding rules while the individual(s) is/are acting on and attempting to produce an outcome. Some panelists also call for traditional quasi-experimental pre- and post-test designs as well as survey (cross-sectional) and case study research. All these methodologies can provide rich data and encourage deeper understanding of technology integration in context. Comprehensive meta-analyses that include summaries and more global interpretations of these studies may also provide the stronger insights needed to understand the diverse data and results and to better inform the scholarship and practice of technology integration.
AECT can play a significant role in helping our field, and sister fields (e.g., education, communication,
information technologies, anthropology, sociology, etc.), better understand and practice in the realm of technology
integration. It is suggested that AECT members rally to provide clearer and accepted definitions of technology
integration and related terminology. Through its communication, publication, and outreach networks AECT can also
provide databases and summaries of current scholarship in these areas, encouraging scholars to share results,
collaborate, and communicate synchronously and asynchronously to discuss previous, current, and future research
and practice in technology integration. AECT members also have strong networks with people in our sister fields
who are also challenged by technology integration. Collaborating with scholars from outside our field can widen our
perspective and provide new and important clues about the practices of technology integration.
Through the sharing of prepared responses by our panelists, followed by an open question and answer session, we are beginning to unpack the complexities of educational technology integration scholarship. The invited guest panelists (M.J. Bishop, Ikseon Choi, Barbara Grabowski, Tiffany Koszalka, Kay Persichitte, Charlie Reigeluth, Rita Richey, and Brent Wilson) represent a wealth of expertise and insight in current design issues, practice strategies,
theoretical and conceptual frameworks, educational technology research and design areas, research and practice
methodologies, current technology innovations, and overarching themes in the field of instructional and educational
technology. This dialogue will help to inform the future direction for research in educational technology integration
and enhance the quality of scholarship in this area.
References
Bishop, M.J. & Cates, W.M. (2001). Theoretical foundations for sound's use in multimedia instruction to enhance
learning. Educational Technology, Research and Development 49(3), 5-23.
Blanton, W., Simmons, E., & Warner, M. (2001). The fifth dimension: application of cultural-historical activity
theory, inquiry–based learning, computers, and telecommunications to change prospective teachers’
preconceptions. Journal of Educational Computing Research, 24 (4), 435-463.
Choi, I., Land, S., & Turgeon, A. (2001). Effects of on-line peer-support on learning during on-line small group
discussion. Paper presented at the National Convention of the Association for Educational
Communications and Technology, Atlanta, GA.
Choi, I. & Jonassen, D.H. (2000). Learning objectives from the perspective of the experienced cognition framework.
Educational Technology, 40(6), 36-40.
Ely, D. P. (1990). Conditions that facilitate the implementation of educational technology innovations. Journal of
Research on Computing in Education, 23(2), 298-306.
Gay, G. & Bennington, T. (1999). Reflective evaluation in a “technology textured” world: an activity theory
approach. In G. Gay & T. Bennington (Eds.), Information Technologies in evaluation: Social Moral,
Epistemological and Practical Applications, (pp. 3-21), San Francisco: Jossey-Bass Publishers.
Hill, J. & Hannafin, M. (1997). Cognitive strategies and learning from the world wide web, Educational
Technology, Research and Development. 45(4), 37-64.
Jonassen, D. & Rohrer-Murphy, L. (1999). Activity theory as framework for designing constructivist learning
environments. Educational Technology, Research and Development, 47(1), 61-79.
Koszalka, T. & Wu, C. (2004). Using cultural historical activity theory [CHAT] to analyze technology integration
efforts. AECT 2004 Conference Proceedings, Chicago, IL.
Koszalka, T. & Grabowski, B. (2003). Combining assessment and research during development of large technology
integration projects. Evaluation and Program Planning, 26, 203-213.
Koszalka, T. & Wang, X. (2002). Integrating technology into learning: a summary view of promises and problems.
Educational Technology and Society, 5(1), 179-183.
Lowell, N.O. and Persichitte, K.A. (2000). A virtual ropes course: creating online community. Asynchronous
Learning Networks (ALN) Magazine 4(1), 1-7.
Mills, S. and Tincher, R. (2003). Be the technology: A developmental model for evaluating technology integration
Journal of Research on Technology in Education, 35(3), 382-401.
Nadolski, R., Kirschner, P., Van Merrienboer, J. , & Hummel, H. (2001). A model for optimizing step size of
learning tasks in competency-based multimedia practicals. Educational Technology, Research and
Development 49(3), 87-104.
Peal, D., & Wilson, B. (2001). Activity theory and web-based training. In B. Khan (Ed.), Web-based training (pp.
147- 153). Englewood Cliffs NJ: Educational Technology Publications.
Richey, R. (2000). Reflections on the state of educational technology research and development: a response to
Kozma. Educational Technology, Research and Development, 48(1), 16-18.
Reigeluth, C. (2003). Knowledge building for use of the internet in education. Instructional Science, 31(4/5), 341-
346.
Wilson, B. (2004). Designing e-learning environments for flexible activity and instruction. Educational Technology,
Research and Development, 52(4), 77-85.
Wilson, B., Sherry, L., Dobrovolny, J., Batty, M., & Ryder, M. (2002). Adoption factors and processes. In H. H.
Adelsberger, B. Collis, & J. M. Pawlowski (Eds.), Handbook on information technologies for education &
training (pp. 293-307). Berlin and New York: Springer.
Acknowledgements
This work has been funded in part through NASA Cooperative Agreement NCC3-994, the "Institute for Future
Space Transport" University Research, Engineering and Technology Institute.
Scaffolding Reflection on Everyday Experiences: Using Digital Images as
Artifacts
Susan M. Land
Brian K. Smith
Brian Beabout
Sunghyun Park
KyoungNa Kim
The Pennsylvania State University
Recent efforts to support student reflection have used technology to help learners reflect upon and organize
ideas and make thinking more explicit and “visible” (Linn, 2000). Guzdial (1994) proposed several roles that
technology can play in providing scaffolding to learners, including helping learners to articulate what they know,
thus encouraging reflection. Strategies for supporting reflection can take on many forms and functions, some of
which include: (a) facilitating articulation by helping learners to externalize their ideas; (b) supporting explanation
and hypothesis building; and (c) structuring opportunities for learners to organize, reflect upon and revise artifacts
or products of their understanding (Land & Zembal-Saul, 2003; Quintana et al., 2004; Schwartz et al., 1999).
Most of the research related to supporting learners to reflect involves students reflecting on activities that
are constrained to the classroom. That is, students are scaffolded in the inquiry processes of sense-making, process
management, and articulation and reflection during classroom experimentation (Quintana et al., 2004). However,
little work has been done to use technology to help students to reflect upon experiences in their everyday world that
can be used as a centerpiece or anchor for learning about important ideas that impact their lives. For instance,
students might learn how physics principles apply in a classroom learning environment, but neglect to make that
connection to the playground equipment they use in their everyday life. For many classroom-based concepts, it is
possible to support learners to have direct experiences with them in their everyday world. Furthermore, when
school-based concepts have implications for choices students make in their lives (e.g., biology or nutrition
concepts), this connection between everyday experiences and transfer becomes even more important. Making
connections to everyday contexts guides students to integrate schooling and life experiences and to develop
meaningful, long-lasting understandings (Brickhouse, 1994).
Although making connections to individuals’ everyday experiences sounds simple and intuitive, prior
research shows otherwise. Compared to the somewhat controlled setting of the classroom, the real world is fraught
with complexity and ambiguity. Students who are new to a domain have neither the observation skills nor the deep
understanding to be able to (a) make accurate observations in their everyday world; and (b) explain them (Bransford,
Brown, & Cocking, 2000). Furthermore, although links to everyday contexts may enhance the potential for transfer
(Brown et al., 1983), they also increase the likelihood that learners may draw upon incomplete or inaccurate
understanding that forms the basis of faulty theories. It is well documented that learners often have intuitive or
everyday experiences that may be contradictory to formal, accepted explanations (Carey, 1986). Intuitive theories
that are connected to everyday experience are extremely resilient to change. Without opportunities to address these
theories directly, learners can potentially strengthen powerful generalizations that are not readily transferable.
To illustrate, Brickhouse (1994) studied how children linked concepts of light and shadows with everyday
experiences at home. Although students did make connections to everyday experiences, their observations outside
the classroom were imprecise and unpredictable. Students often inaccurately remembered events and missed
important details of their experiences. Furthermore, their partial understanding of light could not be easily
disconfirmed given the imprecise nature of their observations, so they often used them to justify a naïve theory.
Brickhouse notes: “Because their experiences with light outside the classroom were not constrained in the same way
as they were in classroom experiments, they reported observations that their developing theory could not yet
explain.” (p. 651). Given that learners have limited knowledge structures (Gick, 1986), grounding learning in
everyday contexts may be challenging and, at times, counterproductive. However, relying exclusively on classroom contexts, where investigations are constrained and ambiguity reduced, may lead to understanding that is "inert" and
limited to school contexts.
If it is commonly accepted that enhancing links to real-world experiences is an important component of
enhancing meaningfulness, then support mechanisms or “scaffolds” are needed to help learners observe, interpret,
and make connections to everyday experiences. Since it is challenging for learners to spontaneously make links to
their everyday life during learning, it might be useful to support them to bring their rich, authentic, everyday world
into the classroom. One way to help bring the everyday, experiential world of individuals into the classroom is by
capturing it through digital imaging (e.g., video, photography). Use of digital imagery can serve to capture specific
“acts” that students experience in their everyday world and turn them into “artifacts” for reflection. For instance,
students studying social studies can take photos of local historical sites and important people in the community as
data for reflection and discussion. As tools, those historical images might coexist with narratives that carefully
explain what learners should gain from them. As objects or data, learners are responsible for interpreting meaning
from the visual materials. Similarly, when studying geometry, students can photograph objects in nature that
illustrate geometric shapes and properties (Cavanaugh & Cavanaugh, 1999).
The purpose of this paper is to propose a design framework for capturing learners’ everyday experiences
through digital imaging and using them as “data” for reflection. Our goal is to help individuals to visualize the
connection between everyday actions, experiences, or choices with ways of conceptualizing those experiences. In
this paper, we present an example of a current project that uses digital photography to help children reflect upon
their everyday experiences and actions.
Strategy 1: Identify everyday contexts for learning that involve collecting, using, and reflecting on data
collected from everyday events.
Increased attention has been placed on the use of problem contexts that immerse learners in activities that
are rooted in “real world” practices. The framework of “anchored instruction”, for instance, emphasizes use of
highly contextual and everyday experiences to “anchor” learning, relying upon video-based stories or challenges to
represent them (see for instance, Jasper Woodbury Series [CTGV, 1992] and the STAR.legacy project [Schwartz et
al., 1999]). Stories, contexts, and problems that are rooted in everyday situations guide learners to make connections
to prior knowledge, a process that is needed to meaningfully integrate new knowledge, discover relevance and
interest in the topic, and enhance the potential for transfer (Brown, et al., 1983).
Recently, with the advent of new, mobile computing platforms, educators have been able to expand the
contexts for learning to include those that allow learners to collect data about their everyday experiences, and use
them as objects for reflection. In the BioKIDS curriculum, for instance, students explore their schoolyards collecting
and observing various animals in order to develop basic understandings of organisms, environments, and
interactions between the two. Similar biodiversity concepts could be studied with simulations, video clips, and other
media, but the use of local, familiar environments and animals may help students better grasp the relevance of the
biological theories.
Strategy 2: Capture everyday acts and transform them into artifacts for reflection.
In order for students to reflect on their everyday experiences, these acts must first be captured and then
displayed in a way that allows for analysis and reflection. Technology tools, particularly mobile computing tools, are
being increasingly used by individuals in their everyday lives as a seamless part of their real-life experiences. For
instance, handheld computer devices (such as Palm pilots), in conjunction with palm-enabled probeware, can be
used to allow learners to collect and analyze data from the field, which could include their own backyards, parks,
ponds, or swimming pools. Cell phones, PDAs, and similar tools have the technological capabilities for capturing
photos, videos, or other data outside formal work and educational environments.
Allowing learners to engage in rich, everyday experiences can provide learning opportunities, but the
results of the explorations need to be captured for later analysis and reflection. Captured behaviors and experiences
allow for reflection-on-action. But experience capture may also lead to reflection-in-action. Learners may need to
decide which aspects of their experiences are worth capturing, forcing them to consider their experiences through
the questions and hypotheses they are pursuing at the time. For instance, individuals who used photography to
document their health-related behaviors occasionally said that they changed their routines because they became
more conscious of them while taking pictures (Smith et al., 2005). Asking people to deliberately capture aspects of
their experiences often leads them to reflect on observed events as they unfold.
Strategy 3: Facilitate articulation and revision of understanding by helping learners to externalize and build
upon ideas.
The active, thought-demanding process of constructing and re-constructing understanding is considered
vital for meaningful learning and understanding (Perkins & Unger, 1999). In essence, understanding evolves in
response to new experiences and observations that prompt learners to re-evaluate, re-organize, or refine existing
explanations. Explanation building is a dynamic process that involves generating tentative theories or explanations
and refining them based on confirming or disconfirming evidence (Lajoie, Lavigne, Guerrera, & Munsie, 2001). It is
not presumed that this active process of explanation building can be pre-packaged to learners externally by a teacher
or other instructional materials. Rather, learners must be supported in the process of articulating, reflecting upon,
and refining meaning.
Previous research has shown that learners do not always engage in the process of constructing and refining
understanding in ways anticipated. It is well documented that learners are not always adept at improving or refining
existing understanding independently (de Jong & van Joolingen, 1998). In some cases, learners may lack organized
knowledge structures to begin with, and thus attempt to build upon existing understanding that is already limited,
tacit, or naïve. Land and Hannafin (1997) found that some learners actively attempted to integrate their developing
theories with everyday knowledge, but in ways that were not systematic and thus interfered with the development of
more scientifically-valid explanations. Instead, learners misapplied everyday experiences and used them
inappropriately as evidence to confirm a naïve position.
Technology-based tools have been used to help learners articulate and organize their evolving ideas by
making learners’ thinking more explicit and “visible” (Linn, 2000). Most of these tools and projects have been
applied to classroom contexts, but the same approach is applicable to everyday contexts. The KIE project, for
instance, used the “SenseMaker” interface to scaffold learners to articulate theories about properties of light and
connect them with video- and other photographic evidence (Bell, 1998). Similarly, the “Progress Portfolio” tool is a
content-neutral software environment that allows teachers to customize their own scaffolding prompts or templates
that can be used to help learners articulate their understanding and to incorporate evidence generated from digital
cameras or micro-computer-based laboratories. Land and Zembal-Saul (2003) used Progress Portfolio to generate
different computer-based “experiment” and “explanation” pages to help learners to organize their data and to
articulate explanations to the driving question. Similarly, Animal Landlord provides computer-based tools to
investigate digital film clips of lion hunts. Students use video as data (Smith & Reiser, 2005) to develop
explanations of animal behavior. The software provides a computer interface that guides students toward systematic
observations and analyses by making investigation tasks explicit and by constraining the order of progression
through the task. These same types of tools could be useful in contexts that involve reflecting on everyday
experiences, with student-collected data being organized into an artifact that can be reflected upon, shared with
others, and elaborated further.
A Case Study Illustrating Captured Everyday Experiences: Using Digital Photography to Capture Children’s
Everyday Experiences of Health Concepts
were relevant to addressing the driving question, "How healthy is the food that I eat?" Students presented
calculations for each day of the number of servings for each food group (according to USDA guidelines), and
generated claims and rationales for the healthiness of their food choices. They then proposed changes they would
make to their diet, based on their total analyses.
Table 1 provides a summary of the scores for four of the students’ artifacts, based on a set of criteria
developed by the authors. Two different sets of raters scored each artifact. Any discrepancies between the raters’ assigned values were discussed and an adjudicated score was used. The criteria fell into four major categories: (a)
correctness of food group picture analyses; (b) correctness of calculation of recommended daily allowances; (c)
assessment of suggested changes to dietary choices, based on analyses and calculations of recommended daily
allowances; and (d) evaluation of overall assessment related to the driving question.
Table 1 Food template scores

Scoring Criteria                                              Mean Score    Mean % Score (SDEV)
Food Group Picture Analysis:
Identified correct food group for Breakfast (10 points)           8          83% (0.2)
Identified correct food group for Lunch (10 points)               8          77% (0.1)
Identified correct food group for Snack (10 points)               6          64% (0.2)
Identified correct food group for Dinner (10 points)              7          70% (0.2)
Identified correct portion size for Breakfast (10 points)         9          90% (0.5)
Identified correct portion size for Lunch (10 points)             8          77% (0.2)
Identified correct portion size for Snack (10 points)             7          73% (0.3)
Identified correct portion size for Dinner (10 points)            7          67% (0.4)
Accuracy of overall calculation for the day (10 points)          10          97% (0.0)
Total (10 points)                                                  8          75% (0.1)
Table 1 shows that the mean total % score of students’ analyses of food groups is 75% with an SD of 0.1; the mean correctness of calculation of recommended daily allowances is 92% with an SD of 0.8. The mean for assessment of students’ suggested changes to dietary choices is 79% with an SD of 0.7, and the mean for overall assessment related to the driving question is 75% with an SD of 0.7. During the rating process, it was observed that most errors in
students’ analyses occurred in situations where a food crossed many different food categories (e.g., an ice cream
cone, which has dairy, fat, and sugar) and where determination of serving sizes was complex. For instance,
sometimes students would correctly identify that the bread from a sandwich belonged in the grain food group, but
would neglect to list it as 2 servings, one for each piece of bread (instead identifying 1 serving). Overall, we
observed that students were very excited about and engaged in this activity.
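Table 1 reports, for each 10-point criterion, the mean adjudicated score and the mean percent score with its SD across the scored artifacts. As a rough illustration only, the sketch below shows one way such summaries could be computed; the criterion names echo Table 1, but the scores are hypothetical placeholders, not the study’s data.

# Minimal sketch of the per-criterion summaries in Table 1: average the
# adjudicated scores across artifacts and express each mean as a percentage
# of the 10 possible points, with the SD of the proportions in parentheses.
# The scores below are hypothetical placeholders, not the study's data.
from statistics import mean, pstdev

MAX_POINTS = 10
adjudicated_scores = {
    "Identified correct food group for Breakfast": [8, 9, 7, 9],
    "Identified correct food group for Lunch":     [8, 7, 8, 8],
    "Identified correct portion size for Snack":   [7, 8, 6, 8],
}

for criterion, scores in adjudicated_scores.items():
    proportions = [s / MAX_POINTS for s in scores]      # each score as a proportion of 10 points
    mean_score = mean(scores)                           # "Mean Score" column
    mean_pct = mean(proportions) * 100                  # "Mean % Score" column
    sd = pstdev(proportions)                            # SD reported in parentheses
    print(f"{criterion}: {mean_score:.0f}  {mean_pct:.0f}% ({sd:.1f})")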
Conclusions
In sum, this paper proposed a design framework for capturing learners’ everyday experiences through
digital imaging and using them as “data” for reflection. Using photographic records of daily behavior and tools to
help students analyze them, our goal was to help individuals visualize the connection between everyday actions, experiences, or choices and ways of conceptualizing them. In this case, we were interested in the extent to which use of past experiences influenced learning of formal educational (health) concepts, but also future decision making. Our current and future research agenda centers on the use of ubiquitous computing tools, such as digital cameras and handheld computers, to capture everyday experiences and behaviors and use them as objects for reflection. This line of research has relevance to both academic and non-academic worlds, as other prior research has shown (Smith et al., 2005). One of the major implications for design and research that became obvious from this pilot study is the question of how to help people benefit from large repositories of experiential data. It falls to the designer to identify tools that can help organize those data in ways that can be managed and reflected upon, and that do so at scale. Our future goals are to develop specific technology tools to assist in that endeavor.
References
Bell, P. (1998, April). The knowledge integration environment: Relating debate and conceptual change through
design experiments. Paper presented at the Annual Meeting of the American Educational Research
Association. San Diego, CA.
Bransford, J., Brown, A., & Cocking, R. (Eds.). (2000). How people learn: Brain, mind, experience, and school.
Washington, DC: National Academy Press.
Brickhouse, N.W. (1994). Children’s observations, ideas, and the development of classroom theories about light.
Journal of Research in Science Teaching, 31 (6), 639-656.
Brown, A.L., Bransford, J.D., Ferrara, R.A., & Campione, J.C. (1983). Learning, remembering, and understanding.
In J. H. Flavell & E. H. Markman (Eds.) Handbook of Child Psychology, Vol. 3, Cognitive Development
(pp. 177-266). New York: Wiley.
Brown, J.S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational
Researcher, 18(1), 32-41.
Carey, S. (1986). Cognitive science and science education. American Psychologist, 41 (10), 1123-1130.
Cognition and Technology Group at Vanderbilt (1992). The Jasper Experiment: An exploration of issues in learning
and instructional design. Educational Technology Research and Development, 40(1), 65-80.
de Jong, T., & van Joolingen, W. (1998). Scientific discovery learning with computer simulations of conceptual
domains. Review of Educational Research, 68 (2), pp. 179-201.
Gick, M. L. (1986). Problem-solving strategies. Educational Psychologist, 21, 99-120.
Guzdial, M. (1994). Software-realized scaffolding to facilitate programming for science learning. Interactive
Learning Environments, 4, 1-44.
Land, S.M., & Hannafin, M.J. (1997). Patterns of understanding with open-ended learning environments: A
qualitative study. Educational Technology Research & Development, 45(2), 47-73.
Land, S.M., & Zembal-Saul, C. (2003). Scaffolding reflection and articulation of scientific explanations in a data-
rich, project-based learning environment: An investigation of Progress Portfolio. Educational Technology
Research & Development, 51 (4), pp. 65-84.
Lajoie, S.P., Lavigne, N.C., Guerrera, C., & Munsie, S.D. (2001). Constructing knowledge in the context of
BioWorld. Instructional Science, 29 (2), 155-186.
Linn, M. (2000). Designing the Knowledge Integration Environment. International Journal of Science Education,
22 (8),781-796.
Perkins, D.N., & Unger, C. (1999). Teaching and learning for understanding. In C. Reigeluth’s (Ed.), Instructional-
design theories and models, Volume II (pp. 91-114). Mahwah, NJ: Erlbaum.
Quintana, C., Reiser, B., Davis, E., Krajcik, J., Fretz, E., Duncan, R., Kyza, E., Edelson, D., & Soloway, E. (2004).
A scaffolding design framework for software to support science inquiry. Journal of the Learning Sciences,
13 (3), 337-386.
Schön, D.A. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.
Schwartz, D., Lin, X., Brophy, S., & Bransford, J. (1999). Toward the development of flexibly adaptive
instructional designs (pp. 183-213). In C. Reigeluth (Ed.), Instructional-design theories and models: A new
paradigm of instructional theory, Volume II.
Smith, B.K, Frost, J., Albayrak, M., & Sudhakar, R. (in press). Improving diabetes self-management with
glucometers and digital photography. To appear in Personal and Ubiquitous Computing.
Smith, B.K., & Reiser, B. (2005). Explaining behavior using video for observational inquiry and theory articulation.
Journal of the Learning Sciences, 14 (3), 315-360.
Influence of Electronic Text Presentation Mode on Student Understanding
and Satisfaction: An Empirical Study in the Korean Context
Hye-Jung Lee
Jeong-Won Woo
Seoul National University
Abstract
The purpose of this research is to identify the influence of electronic text presentation mode on student understanding and satisfaction. For this research, text, e-text, and compression as a technology were first reviewed theoretically. For empirical data, four electronic instructional programs using four different text presentation modes - a full description mode (not compressed, written in full sentences) and three compressed modes (itemized mode, table mode, and concept map mode) - were developed with the same contents. From the implementation of these programs at the high school level, data on student achievement, reading and recall time, and satisfaction were collected and analyzed. Findings indicated that compressed text produced similar, and in some cases better, effects on student achievement and satisfaction. Compression also saved reading and recall time, especially for less skilled and lower-achieving students. Results and implications are discussed and recommendations for further study are suggested.
Introduction
Text has been the most typical and powerful tool for communicating knowledge and information in literate society. We learn most of our school knowledge from text written with letters that we have learned since childhood, so it is hard to imagine teaching and learning without text.
Traditionally, text has appeared as printed books in teaching and learning environments since print technology was invented. But the advent of computer and network technology lets us experience another type of text, presented on a computer monitor, which is called ‘electronic text’. In the current digital age we encounter electronic text in CAI (Computer-Assisted Instruction), WBI (Web-Based Instruction), and mobile learning.
Electronic text can incorporate hypertext using links and nodes and can be presented with non-text materials such as images, graphics, sound, and video clips, which were not available in traditional printed text. In spite of these capabilities, electronic text is said to be less effective for reading than printed text. Because it is on the screen, it tires our eyes easily, and this leads to lower legibility. People are therefore said to show different patterns when reading electronic text. According to Nielsen (1997), 79% of learners tend to skim, relying mainly on highlighted headings, rather than read carefully and intensively when they read electronic text. Reading electronic text is also said to be about twice as slow as reading printed text. In addition, people take in about 25% less information from electronic text than from printed text, and they even tend to regard reading electronic text as a waste of time (Rha, 1999).
To overcome these limitations, research on improving legibility has been reported. Most research on electronic text has aimed to improve legibility, mainly focusing on the font, color, arrangement, highlighting, and spacing of text on screen. Another approach to improving legibility concerns text density (Ipek, 1995). Text density depends on the ratio of word count to the space occupied by the text. What matters in text density research is how to design low-density text without loss of content; the text must keep its quality despite its reduced quantity.
Low text density without content loss is becoming a critical issue in the instructional design of electronic text; this is the need for text compression. Kim (2000) also indicated that strategies for presenting compressed text without content loss should be researched in educational technology. However, little is known about which types of compression are effective for learning and how learners take in content presented in different modes.
The purpose of this research, therefore, is to identify the influence of electronic text presentation mode on student understanding and satisfaction. For this research, text, e-text, and compression as a technology were first reviewed theoretically. For empirical data, four electronic instructional programs using four different text presentation modes - a full description mode (not compressed, written in full sentences) and three compressed modes (itemized mode, table mode, and concept map mode) - were developed with the same contents. From the implementation of these programs at the high school level, data on student achievement, reading and recall time, and satisfaction were collected and analyzed. Results and implications are discussed and recommendations for further study are suggested.
Theoretical review
Text technology
The proposition that ‘text is a technology’ refers, in one respect, more concretely to text design technology. Text design concerns how to write text for better understanding and for more effective and memorable delivery of a message. The following reviews previous literature on how to design text.
Text design
Instructional text needs to be designed not just for reading, but for understanding and learning. So text should be designed with attention, comprehension, and retention in mind. Stewart (1998) proposed that text design should consider topic analysis, objectives, hierarchic relationships, inserted questions, verbal cueing, and structure. Structure here included typographical cueing, verbal cueing, legibility, and layout. In addition to this explicit structure, there is research on implicit structure. Research on text structure indicates that text has a hierarchical structure and that super-structure is remembered better than sub-structure. Likewise, people are reported to recall super-propositions better than sub-propositions (Meyer, 1975). Kintsch and van Dijk (1983) similarly maintained that macrostructure, composed of micropropositions, plays a critical role in overall understanding of text, because macrostructure can serve as a basis for summarizing the message. Cognitive scientists also maintain that we need metacognitive strategies for abstracting, schematizing, organizing, and elaborating content so that it can be received, memorized, and recalled easily. So instructional text should be designed with these factors in mind, and how to do so is the very technology.
Literacy nature, orality strategy
Text is basically an artifact of literacy. It is shown, not heard, in written language, so text has the nature of literacy. It can be compressed and structured with titles, headings, summaries, key words, and chapters, which are features of literacy. But it is said that text should also have orality for better learning. Early distance educators suggested that text in distance learning material should be written in ‘orally spoken language’ instead of literate written language. According to Holmberg (1983), teachers must imagine that they are speaking to someone when writing teaching texts, and this is supposed to make them use a spoken language wherever possible. He suggested that teachers should use a ‘clear, somewhat colloquial language’, write in a personal style, appeal to students’ emotions as well as their intellect, avoid great density of information, and draw their attention to important points. Indeed, he asserted that text in distance learning material should be ‘guided didactic conversation’. It is also reported that using spoken language can improve social presence in distance learning, so students are more satisfied with learning and feel less isolated (Gunawardena & Zittle, 1997). These studies imply the need for careful instructional design in e-text writing to improve legibility and social presence and to decrease cognitive overload and feelings of isolation, and they suggest orality in text as a technology to meet this need.
Electronic text
Electronic text has different characteristics from printed text. Electronic text can represent hypertext using links and nodes and can be shown with non-text materials such as images, graphics, sound, video clips, and simulations, which were not available in traditional printed text. In spite of these capabilities, electronic text is said to be less effective for reading than printed text. Because it is seen by the emission of light on a CRT (cathode-ray tube) screen, the resolution is not as clear as that of print. It therefore tires our eyes easily, and this leads to lower legibility. Much research on electronic text indicates that e-text has low legibility. Legibility depends primarily on visibility, recognizability, and readability, and e-text is said to have limitations in visibility and recognizability (Kim, 1991). So careful design strategies, considering factors such as leading/line spacing, headings, directive cues, line length, justification, color and font, and text density, are needed to improve legibility in e-text design.
Secondary orality?
Ong (1982) indicated that electronic technology has brought us into the age of ‘secondary orality’. Orality is a culture untouched by any knowledge of writing or print (Ong, 1982). According to Ong (1982), there are essential differences between literacy and orality, and we need to understand these different features to better comprehend language and communication. He used the term ‘secondary orality’ to characterize electronic text because he thought our electronic age greatly resembles primary orality. Although Ong named it ‘secondary orality’ (we are not sure whether this term is really appropriate for e-text), electronic text seems to have orality, literacy, and a third, new trait as well.
First, e-text is basically text, so it has the nature of literacy. It is written on the screen, and it is visual rather than auditory because it is expressed in text, not voice, so it looks like an artifact of literacy. Older generations tend to write electronic text just like literate text in printed format. Indeed, most official instructional e-texts read just like printed publications.
Nevertheless, e-text is also very close to orality. Computer and network technology allows synchronous dialogue as well as asynchronous communication, so people communicate much as they do in an oral culture even though they use written (more exactly, typed) text. This is the orality of electronic text. In computer chatting or discussion, people feel a strong group sense, that is, they are group-minded, just as in primary orality, as Ong (1982) mentioned. According to Ong (1982) and Lee & Yu (2004), literacy leads people toward individual space and individual activity, such as individual reading, individual reflection, and individual understanding. Orality, however, is said to be rather social, interactive, and interpersonal (Ong, 1982). This feature of e-text has recently elevated interest in CSCL (Computer-Supported Collaborative Learning). Orality is additive, aggregative, contingent, situational, and group-minded according to Ong (1982), and CSCL has those features as well. One of the papers presented at the CSCL conference reported that students learn socially rather than individually in CSCL (Stahl, 2000). According to Stahl (2000) and Lee (2004a, 2004b), students’ knowledge-building processes include social reflection, shared understanding, and social externalization, which are intrinsically different from individual reflection, individual understanding, and individual externalization. They assert that the electronic environment is highly appropriate for social learning and knowledge construction. In contrast to the ‘closed’ nature of print (as Ong mentioned, print confines people to an individual room), electronic text is open to the networked world. That is, electronic text is leading from individual learning with printed text toward social CSCL.
However, there is a part that cannot be explained by orality and literacy, the typical traits of
traditional language. The new generation grows up with computer technology and perceives computer
communication as natural, almost like a mother tongue. Even though it is expressed as text on screen, they do not write it
but type it with a keyboard. Their typing is much faster than their handwriting, and they may even think better in
typing than in writing. They coin new words (destroying and transforming the alphabet as well as grammar, called ‘ET’ (extra-
terrestrial) words in Korea) and new signs such as emoticons with combinations of keystrokes. These keystrokes influence
our thinking process. This phenomenon is now moving from the computer to the mobile phone. There were 36 million registered mobile
numbers in Korea as of August 2004, a formidable number considering that around 47 million people live in
South Korea. Except for very young children, almost everybody in Korea has a mobile phone; nobody asks for a
home or office number, we simply ask for a mobile number for contact. It is not surprising that most
K-12 students have one. They are called the ‘Thumb generation’ rather than the ‘N-generation’. The N-generation is the
generation of the network and computer, but in the mobile age people use only the thumb to type text messages on a mobile
phone, whereas they use several fingers to type on a computer keyboard. Recently, mobile phones in Korea are used
more often to send text messages by thumb than to make voice calls. In the text shown on the screens of mobile devices
as well as computers, there are many new words and signs that were never seen in traditional literacy or
orality. Therefore, we need to consider literacy, orality, and this third new trait (whatever it is called) in electronic
text shown on the screen of either a computer or a mobile device.
identify the influence of text compression on learning in an electronic environment.
Research method
Program development
In order to investigate the influence of electronic text presentation mode on student learning, four electronic
instructional programs using four different text presentation modes were developed with the same content. The subject
matter was ‘General social science’ for high school students, written in Korean. All texts were typed in 12-point, black,
regular font with double spacing; this format and layout is one of the most typical in Korea. The text
presentation modes used in this research were a full description mode and three compressed modes (see Figures 1, 2,
3, and 4): the full description mode (Group 1) was not compressed and was written in full sentences, while the three
compressed modes were an itemized mode (Group 2), a table mode (Group 3), and a concept map mode (Group 4), the most
commonly used compression styles for e-text in Korea. The text was compressed by around 43%–47% in word count, with
no loss of content. Three experts in instructional design practice verified the text material throughout the
development process.
Learners
One hundred thirty-nine tenth-grade students at S high school in Korea, randomly assigned to four
groups, were required to read the web content provided in this research for around 30 minutes and to take an exam after
studying it. Students were assigned randomly after a pre-test. There was no significant difference between groups on the
pre-test (F = .17, p > .05), so each group (N = 35 for groups 1, 2, and 3, and N = 34 for group 4) was found to be homogeneous
in terms of prior knowledge and computer literacy. All students, like most students in Korea, were
familiar with the computer environment and electronic text.
Procedure
In this research, the dependent variables were student understanding, satisfaction, and the time taken for reading and
recall; the independent variable was electronic text presentation mode. The research procedure was as follows: text material
development → pre-test → grouping of students → program implementation and time measurement → achievement
test and time measurement → satisfaction survey. For each group, the time taken to read the
electronic text and to complete an exam after reading it was recorded. The exam mainly tested recall of the text content. A
satisfaction survey as well as a cognitive achievement test was carried out; the satisfaction survey consisted of 5 items
covering reading usability, comprehensibility, accuracy, attraction, and preference. One-way ANOVA was
conducted to analyze the data.
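To make the analysis step concrete, the following is a minimal sketch of a one-way ANOVA in Python using SciPy's f_oneway; the four score lists are hypothetical placeholders, not the data actually collected in this study.

    # Minimal sketch of the one-way ANOVA step (hypothetical data, not the study's scores)
    from scipy import stats

    # Achievement scores for the four presentation-mode groups (illustrative values only)
    group1_scores = [18, 22, 25, 21, 19]   # full description mode
    group2_scores = [24, 26, 23, 27, 25]   # compressed mode
    group3_scores = [26, 28, 25, 27, 29]   # compressed mode
    group4_scores = [20, 22, 21, 19, 23]   # compressed mode

    f_stat, p_value = stats.f_oneway(group1_scores, group2_scores, group3_scores, group4_scores)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")   # p < .05 indicates a significant group difference

The same kind of call could in principle be applied in turn to achievement scores, reading and recall times, and satisfaction ratings of the four groups.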
understanding. This result supports some previous research. According to Reder & Anderson (1980), ‘teaching
with text’ was not superior to ‘teaching with only a text summary’ in student understanding, and teaching with a
text summary even showed better retention after a time. There is also research indicating that poor and
unskilled students in particular have difficulty grasping what is important and reorganizing the text for comprehension (Rasinski,
1985; Winograd, 1984). Students with lower achievement are said to be poor at summarizing and compressing
text. It is therefore suggested that poor, unskilled students should receive text in a compressed format rather than a detailed full
description format, as long as there is no loss of content.
However, among the compressed groups, group 2 (compressed with a table) showed significantly higher
achievement than group 4 (compressed with a concept map). This implies that the learning effect can vary
with the compression strategy. In this case, the table format seemed to provide well-organized,
structured text by separating headings and contents, so it could be integrated into cognitive schemata more readily. On
the other hand, the concept map format was not found to be good for learning in this research. This
seems to be a matter of content exposure time: students could see the contents only when they clicked hyperlinked
key concepts; otherwise they could see nothing but the headings, and if they clicked another
hyperlinked word, the previous contents disappeared. This design could lead to short exposure to the
contents, resulting in lower achievement, and it implies that we need to be more cautious when compressing text
with concept maps.
Consequently, a text compression strategy supports text information processing in learning. Compressed e-text
shows a similar or even better learning effect than non-compressed full description text, and this compression
strategy could be a good way to overcome the limitations of electronic text.
Table 2. Results of one-way ANOVA: e-text presentation mode and student achievement

                     Sum of Squares    df    Mean Square    F        Sig.
    Between groups        119.85        3       39.95       2.89*    .026
    Within groups        1864.50      135       13.81
    Total                1984.35
(*p < .05)
Influence of electronic text presentation mode on reading time and recall time
With regard to time taken, there was no significant difference in reading time or recall time
between groups with different electronic text presentation modes. However, there was a difference in time range,
that is, from the shortest to the longest time. The reading time range was 13 minutes in group 2 (compressed with a table)
and group 4 (compressed with a concept map), 15 minutes in group 3 (compressed with items), and 20 minutes in group
1 (non-compressed mode), as shown in Table 4. In other words, the time deviation was smaller in the compressed modes than
in the non-compressed full description. The distribution diagram showed that poor, unskilled students, rather than
skilled students, varied widely between the compression and non-compression modes. This means that both
expert and unskilled students take a similar amount of time to read compressed text, whereas unskilled students
take more time than expert students to read the non-compressed full description. This implies that
compressed text is especially helpful for poor and unskilled students, a result that supports previous
research (Rasinski, 1985; Winograd, 1984). According to Rasinski (1985) and Winograd (1984), poor and unskilled
students tend to depend on the particular parts or information related to their personal interests, and they are
poor at grasping what is important and what is not. Moreover, it takes them longer to understand and
grasp the text than good students. Text compression is therefore a useful strategy especially for poor and unskilled students:
compressed text saves time in finding the topic and grasping the meaning, and its structure corresponds to the comprehension
schema, which helps understanding, memorizing, and retention.
Meanwhile, there was also no significant difference in average recall time (time taken for the exam) between
groups. But as with reading time, the recall time range (from the shortest to the longest time) differed:
group 3 (compressed with items) took 10 minutes, group 4 (compressed with a
concept map) took 11 minutes, group 2 (compressed with a table) took 17 minutes, and group 1 (non-compressed full
description) took 20 minutes (see Table 5). This indicates that compressed text shortened the recall time of poor students.
This can be explained in terms of cognitive science. According to information processing theory in cognitive
psychology, people use long-term memory to store and recollect information, and how systematically
information is organized for storage in long-term memory determines how easily it can be recollected. Kim (1991) suggested that
for easy recollection the process of recall should be similar to the process of storage, so we suppose that the text
compression used in this research resembles the process of recall and therefore aids it.
In addition, compression can be associated with externalization in the learning process. The itemized compressed mode resembles a
note-taking format, and students’ note-taking is an activity for externalizing what they understand. According to
Lee (2004b) and Lee & Yu (2004), externalization is one of the critical steps in the learning process: through
externalization of what they understand from individual reflection, knowledge is eventually internalized into a personal
comprehension schema. It can therefore be supposed that if a text is presented in an externalized way, it can be received
more easily.
groups with different electronic text presentation modes, as seen in Table 6 and Table 7. The compressed modes showed a
higher satisfaction level than the non-compressed mode (F = 3.74, p < .05). That is, students were not satisfied with full
description text on the screen. They said that it was hard to concentrate on text that looked just like the printed format
but was shown only on the screen, and that it tired their eyes easily. Consequently, text presentation mode influences
student satisfaction as well as cognitive achievement.
Table 7. Results of one-way ANOVA: e-text presentation mode and satisfaction level

                     Sum of Squares    df    Mean Square    F        Sig.
    Between groups        130.57        3       43.52       3.74*    .012
    Within groups        1533.06      132       11.61
    Total                1663.63
(*p < .05)
5. Learners’ own compression strategies, as well as instructional strategies, could be a good theme for understanding
the meta-cognitive area more comprehensively.
6. The text used in this research was written in Korean, but the approach can be applied to English as well, and a
comparative study (the e-text compression effect in Korean versus English) could be an interesting topic.
Such a study would tell us whether the results of this research apply to other languages in
general or are language specific.
7. Electronic text could also be compared with printed text on the compression effect, to find out whether the
results depend mainly on the media technology or on the writing technology; compression is a kind of writing
technology, while electronic/print is a media technology. If similar results are obtained in the same experiment with
printed text, we can conclude that the compression strategy (writing technology) applies equally to
printed text and e-text, and that the compression effect arises not because e-text is harder to read
than printed text, but possibly because the compression strategy fits the cognitive schema better regardless of
the media technology. If different results are obtained, it would suggest that the results of this research are specific
to electronic media.
References
Brown, A. L., & Day, J. D. (1983). Macrorules for summarizing texts: The development of expertise. Journal of Verbal Learning and Verbal Behavior, 22.
Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer-mediated conferencing environment. American Journal of Distance Education, 11(3), 8-26.
Holmberg, B. (1983). Guided didactic conversation in distance education. In D. Sewart, D. Keegan & B. Holmberg (Eds.), Distance education: International perspectives (pp. 114-122). London: Routledge.
Ipek, I. (1995). Considerations for CBI screen design with respect to text density levels in content learning from an integrated perspective. In Imagery and visual literacy: Selected readings from the Annual Conference of the International Visual Literacy Association (26th, Tempe, Arizona, October 12-16, 1994).
Kim, S. H. (2000). A study on the teaching of electronic reading. Master's thesis, Seoul National University, Korea.
Kim, Y. S. (1991). A study on electronic text development using text information process strategy. Journal of Educational Technology, 7(1), 53-75.
Kintsch, W., & van Dijk, T. A. (1978). Toward a model of text comprehension and production. Psychological Review, 85, 363-394.
Kintsch, W., & van Dijk, T. A. (1983). Strategies of discourse comprehension. New York: Academic Press.
Lee, H. J. (2004a). Learning process mechanisms in resource-based structured instruction and interpersonal interactive instruction in web-based distance learning environment. Doctoral dissertation, Seoul National University, Korea.
Lee, H. J. (2004b). A model of knowledge building process in asynchronous discussion learning environment. Journal of Educational Technology, 20(1), 117-140.
Lee, H. J., & Yu, B. M. (2004). Learning process in resource-based well-structured instruction in web-based distance learning environment. In Proceedings of the 2004 AECT (Association for Educational Communications and Technology) Annual International Convention, Chicago, USA, October.
Meyer, B. J. F. (1985). Signaling the structure of text. In D. H. Jonassen (Ed.), The technology of text (Vol. 2, pp. 64-89). Educational Technology Publications.
Nielsen, J. (1997). How users read on the web. Retrieved from http://useit.com/alertbox/9710a.html
Ong, W. J. (1982/2002). Orality and literacy: The technologizing of the word. New York: Routledge.
Rasinski, T. V. (1985). Readers' identification of levels of importance in expository texts: An exploratory study. Paper presented at the annual meeting of the National Reading Conference (35th, San Diego, CA).
Reder, L., & Anderson, J. (1980). A comparison of texts and their summaries: Memorial consequences. Journal of Verbal Learning and Verbal Behavior, 19, 121-134.
Rha, I. (1999). Web-based instruction.
Stahl, G. (2000). A model of collaborative knowledge-building. In B. Fishman & S. O'Connor-Divelbiss (Eds.), Fourth International Conference of the Learning Sciences (pp. 70-77). Mahwah, NJ: Erlbaum.
Stewart, A. (1988). Towards a theory of instructional text. Paper presented at the annual convention of the Association for Educational Communications and Technology, New Orleans, January.
Winograd, T. (1972). Understanding natural language. New York: Academic Press.
Appendix
Figure 2. Compressed mode with item
Figure 3. Compressed mode with table
Learning to Collaborate, Collaboratively: An Online Community Building
and Knowledge Construction Approach to Teaching Computer Supported
Collaborative Work at an Australian University
Mark J.W. Lee
Ken Eustace
Lyn Hay
Geoff Fellows
Charles Sturt University
AUSTRALIA
Abstract
The subject Computer Supported Collaborative Work (CSCW) immerses students in the social, philosophical
and psychological aspects of working online, and in the technology issues associated with being an online workgroup
participant. This paper describes, in the context of relevant literature, the teaching, learning and assessment
strategies, as well as the open source groupware framework used to build a successful online community for
collaborative learning and knowledge construction amongst students of diverse backgrounds and interests,
separated by the barriers of time and distance. Student evaluation results and future plans are also discussed.
Introduction
Each of the authors has been involved in team teaching a subject called Computer Supported Collaborative
Work (CSCW) at Charles Sturt University. The subject introduces students to contemporary social and technology
issues as participants in online communities. Its enrolment comprises a wide array of undergraduate and
postgraduate students, studying both on-campus and via distance education, throughout Australia and overseas,
hailing from a diverse range of disciplines. It provides a focus for discussion and application of CSCW in fields such
as professional development, information technology, library science, education, teacher librarianship, health care or
policing. The four major outcomes of this subject are:
1. to understand the need for a multidisciplinary approach to learning and workflow within online
communities;
2. to work effectively within a collaborative community;
3. to understand through negotiation the issues linked to computer supported collaborative work
(CSCW); and
4. to demonstrate an understanding of the processes required to design, build, use and evaluate online
communities using groupware tools.
Students explore various cognitive frameworks used in CSCW, and learn how to select and tailor a
framework appropriate to specific collaborative situations and tasks. They study the principles underpinning the
design and building of workgroup specific infrastructures to support successful workflow and human interaction. A
mandatory component of the subject requires students to collaborate regularly with others using a variety of
software. By integrating literature and other subject content about CSCW, students and instructors employ
information environments and groupware tools such as e-mail, forums, Z Object Publishing Environment (Zope),
Yahoo! Groups, weblogs (blogs) and MOO to facilitate collaborative learning and knowledge construction, and to
capture artefacts resulting from these processes.
In addition, CSCW has a broader, underpinning aim of helping to nurture community-minded individuals,
consistent with the views expressed by Peck (1987):
We human beings have often been referred to as social animals. But we are not yet community creatures. We are impelled to relate
with each other for our survival. But we do not yet relate with the inclusivity, realism, self-awareness, vulnerability, commitment,
openness, freedom, equality, and love of genuine community. It is clearly no longer enough to be simply social animals, babbling
together at cocktail parties and brawling with each other in business and over boundaries. It is our task – our essential, central, crucial
task – to transform ourselves from mere social creatures into community creatures. It is the only way that human evolution will be
able to proceed. (p. 165)
Rationale
The subject was initiated when Eustace and Hay (2000) reflected on their own discourse as university
teachers and researchers, in which they and their students were expected to use a myriad of Internet services and
tools to communicate and share data. They thought it timely to develop a subject that taught both about and through the use of such
tools to help professional workgroups operate effectively online, based on a community building approach or theme.
Since its genesis, the subject has evolved at the hands of other academics at CSU, including Mark Lee and
Geoff Fellows, and through the active participation and contribution of a number of student cohorts. This paper
describes how the above objectives were achieved in the subject’s Spring (July-November) offering in 2004, in
addition to outlining plans for refinement and improvement in future iterations.
Groupware framework
The CSCW groupware framework is centred around five main tools, as illustrated in Fig. 2.
The CSU Forums are asynchronous, Web-based, threaded discussion boards. The system used is one that
was developed in-house by the University’s Division of Information Technology.
Z Object Publishing Environment (Zope) (see Zope Corporation, 2005b) is an object-oriented web
application development and publishing system, written in the Python programming language (Fig. 3). It is free and
open source, and available for multiple operating systems. Though the functionality of the system can be dramatically
extended through the use of Python scripts, a number of sophisticated server-side tasks can be accomplished with
little or no programming knowledge, thanks to Zope’s Document Template Markup Language (DTML). The content
on a Zope server can be managed via a web browser or through WebDAV (Whitehead, 2005), the latter of which
allows files to be uploaded directly from within supporting software. For example, Microsoft Office documents can
be saved on Zope via WebDAV, as if it were simply another folder on the local network.
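As a rough illustration of the WebDAV route (not the exact configuration used in the subject), the Python sketch below uploads a document to a Zope folder with an HTTP PUT request; the server URL, folder path and credentials are hypothetical.

    # Rough sketch of a WebDAV-style upload to a Zope folder via HTTP PUT
    # (hypothetical URL, path and credentials; a real deployment would differ)
    import requests

    target_url = "http://zope.example.edu:8080/cscw/groupspace/report.doc"   # hypothetical

    with open("report.doc", "rb") as f:
        response = requests.put(target_url, data=f, auth=("student01", "secret"))

    # 201 (Created) or 204 (No Content) would indicate the document was stored
    print(response.status_code)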
Multi-User Dungeon, Object-Oriented (MOO) was used as the vehicle for delivering synchronous online
classes (Fig. 4). Specifically, the enCore system developed by Holmevik & Haynes (2004) was used. A MOO, comprising a
server and an object-oriented core database, is a network-accessible, multi-user, programmable, interactive system
originally designed for the construction of text-based adventure games, conferencing systems, and other collaborative
software. Participants (usually called players) have the appearance of being situated in an artificially
constructed place (social space) that also contains the other players who are connected at the same time. MOO
facilitates polysynchronous communication; that is, it allows for a hybrid communication model comprising both
synchronous and asynchronous elements. For example, players can interact and chat in real time when they are
logged in to the MOO simultaneously. In addition, their actions can have a lasting effect on the state of the objects in
the MOO, even after they have logged out: notes can be left on notice boards, and signs can be erected to leave
messages behind for other players; objects such as furniture and office equipment (e.g. whiteboards, slide projectors)
can be created, used, moved and otherwise manipulated.
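MOO objects and verbs are written in the MOO programming language itself; purely as an analogy, the Python sketch below models the idea of a persistent, shared object whose state and attached verbs outlive any single session. The class and verb names are invented for illustration and are not actual enCore code.

    # Toy Python analogy (not actual MOO/enCore code) of a persistent object with verbs
    class MOOObject:
        def __init__(self, name):
            self.name = name
            self.verbs = {}                 # verb name -> callable, stored with the object

        def add_verb(self, verb_name, func):
            self.verbs[verb_name] = func

        def call(self, verb_name, *args):
            return self.verbs[verb_name](self, *args)

    notice_board = MOOObject("notice board")
    notice_board.messages = []              # state persists after a player "logs out"
    notice_board.add_verb("post", lambda obj, text: obj.messages.append(text))
    notice_board.call("post", "Next POD meeting: Tuesday 8pm")
    print(notice_board.messages)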
Yahoo! Groups (Yahoo! Inc., 2005) is a free, web-based service with which students are able to set up and
manage their own discussion groups. Yahoo! Groups works on a “push” based model in which postings to the group
are automatically sent to each member’s e-mail address, by default. In addition, a rich set of ancillary services is
included, such as synchronous chat facilities, file and photo sharing repositories, shared databases and calendars
(Fig. 5).
COREBlog (Central Core, 2004) is a Zope-based, open source web logging (blogging) system. Although
originally intended to allow individuals to maintain their own personal journals and make these available for public
viewing, blogs have found numerous applications in educational spheres. The easy-to-use nature and informal,
journal entry style have lent themselves to ready adoption by instructors, who create blogs for purposes ranging
from providing content, commentary and study hints, to disseminating subject-related announcements. Learners, too,
benefit from creating their own blogs, be they for use as online learning portfolios and reflective journals, or simply
as “soapboxes” for personal self-expression. Shared or group blogs also exist, which can serve as a powerful
collaboration and shared publishing tool (Fig. 6).
The abovementioned tools are supplemented with regular e-mail contact between students and instructors,
as well as amongst the students themselves. Furthermore, students are encouraged to investigate and explore various
alternative tools to add to their groupware “toolkit”. In fact, students were required to develop their own personal
taxonomy with which to classify and evaluate groupware tools as they encountered them throughout the semester.
Subject content
In Spring 2004, the subject content consisted of five core topics:
1. Underlying principles of an online community: The CSCW framework
2. How to create online communities: Workgroups and collaborative styles
3. CSCW citizenship: Belonging to an online learning community
4. Supportive tools for CSCW
5. Case studies in CSCW
Students were provided with an online schedule of commentary, readings and exercises for each topic on
the subject Zope site (CSCW/online communities groupspace, 2004). Exercises in the topic schedule were marked
with an [OLR] tag to notify students that evidence of completion of the task was to be published on Zope in the
student’s personal folder, which contributed to his/her own Online Learning Record (Fig. 7).
Internet connections.
A seminar/workshop style was adopted for the MOO sessions in the earlier weeks of the semester. These
covered general orientation to the subject and its groupware framework, in particular basic MOO training and
familiarisation with Zope. Many online instructors find that interactivity is preferred over the one-way information
flow of lectures. As such, in later weeks, MOO time was dedicated to open discussions and debates on topics related
to the subject content, including contemporary CSCW and groupware issues.
Logs of all sessions were saved in the form of log objects in the MOO. Since the web-based representations
of MOO objects are accessible via URLs (Fig. 8), hyperlinks to the logs and lesson slides (contained in slide projector
objects) could be placed on Zope for easy access.
Learning to use and program a MOO, as well as exploration, R&D and prototype development of worlds and
groupware, was also supported by a second, “sandbox” MOO called K9MOO (K9 campus and theme park,
2004). Both MOOs were available throughout the session for students to hold their own meetings outside of regular
class times. Most students were able to build their own personal and group “home” rooms, and to populate
these rooms with their own MOO objects such as log recorders, slide projectors and notice boards, to enhance the
collaboration environment. Many took advantage of the programmability of the MOO by scripting their own verbs
(methods, or operations) to add to the functionality and interactivity of their objects.
In addition to synchronous online sessions, face-to-face meetings were held for on-campus students. The
format of these meetings was largely informal and discussion-based; they simply offered opportunities for those
located at the University to convene and discuss/report on their progress in the subject. Short lectures delivered by
the instructors on alternate weeks were intended to generate discussion, the notes for which were published on Zope
for the benefit of all students.
or “agent”, to participate in a group using the other tool. The fifth and final activity saw the agents returning to their
original, “home” groups to report on their observations.
Many believe that assessment imposes barriers on effective discussion and the sharing of ideas in an online
learning community (eg. Chen, 2004). As it was felt that grading the POD activities might inhibit students’
willingness to express ideas openly and freely, the decision was made not to assess these activities directly. Instead,
evidence of having completed the POD activities, together with reflective comments on the experiences, was to be
incorporated into each student’s individual Online Learning Record (OLR), which accounted for a substantial
portion of the subject’s formal assessment.
In fact, instructors actively participated in PODs only where invited to do so by the members, as “guests”.
When this was the case, the guests were to be told the purpose of their input and given a briefing on their role. They
had to be made familiar with the guidelines regarding group processes, provided with technology support and
supplied with feedback on their performance.
Assessment strategies
There were four assessment items for this subject. All four items were compulsory, available online and
subjected to further analysis and evaluation. These are listed below in order of submission:
1. Project proposal
2. Assignment 1: Online Learning Record (OLR)
3. Assignment 2: Project report
4. Subject evaluation
The two major assignments – the OLR and project report – were formally assessed, and each carried a 50%
weighting of the student’s final grade. The project proposal and subject evaluation did not carry a weighting but
were required for successful completion of the subject.
Students were advised to read through all assessment instructions at the very beginning of the session as
involvement in online community building exercises began in the first week of session and was ongoing throughout
the semester. They were also required to work out a personal plan in preparation for the completion of weekly
readings, written exercises, practical lab activities, and collection of evidence of participation in, and evaluation of,
online community activities based on a supplied framework.
The actual project topic was negotiated on a one-on-one basis with a supervisor. For those having trouble
selecting a topic, a list of additional ideas was provided on Zope (Appendix B). The instructors and a number of
other academics at the School of Information Studies, CSU agreed to act as “sponsors” for students wishing to
undertake these projects.
Prior to commencing the project, students were required to complete a Project Proposal form (Appendix
D), which was reviewed by the supervisor and appropriate feedback provided via e-mail. The form was scripted in
DTML and deployed on Zope. In many cases this was an iterative process, with students refining and submitting
several versions of the proposal until both the student/group and the supervisor were satisfied and ready to move on
with the actual project execution. Continuous mentoring and feedback via e-mail continued following the approval of
the proposal, throughout the duration of the project. The supervisor was also available at the end of each scheduled
MOO session to offer additional assistance.
Students were asked to document the refinement process of their chosen topic and the subsequent
development of their project in their OLRs, but were reminded that the OLR itself was to be assessed separately
from the project. Students who worked in groups were also required to submit a Division of Work statement so that
the contribution of each member could be assessed. The assessment criteria from the marking sheet appear in
Appendix C.
Subject evaluation
The final assessment item was the completion of an online survey form evaluating the content, outcomes,
tools and processes used in the delivery of the subject through a series of open-ended questions. Like the project
proposal form, the survey form was mounted on Zope. This assessment item was also allocated a 0% weighting, but
submission was required for successful completion of the subject. Two copies of student submissions were
generated – one stored on the Zope server for analysis, and a second compiled and e-mailed automatically to the
instructors and respondent. In addition to eliciting feedback on the tools and strategies used in the subject, the survey
also served to prompt students to reflect summatively on their experiences over the semester.
Methodology
A simple thematic content analysis approach was used to analyse the survey data. For each question, all
responses were first read at face value to produce a preliminary (candidate) list of themes or issues. This list was
gradually refined as subsequent passes were made through the data, with the content being reviewed in greater detail
and common strands factored out. As part of this iterative process, categories were added, deleted, renamed,
combined and divided as necessary.
Eventually, each response was categorised according to the themes/issues identified, to reveal those
themes/issues that appeared to be the most pertinent, or worthy of mention. It should be noted that the categories
were not mutually exclusive; some responses did not fall neatly into a single category, but rather spanned two or
more categories. Conversely, other responses did not fit into any of the categories at all and were thus assigned the
category “OTH” (Other).
These “distilled” themes/issues were then reported on in the sections that follow, with excerpts/quotes from
the actual survey data included to provide richer insight. The spelling, grammatical and punctuation errors in these
excerpts/quotes have deliberately not been rectified.
All in all, the aim of the process was to attempt to present a broad, overall or “bird’s-eye view” picture of
student attitudes and reactions towards the CSCW subject, as seen in the feedback submitted.
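The tallying behind Tables 1 to 4 can be sketched as follows; the category codes echo those used in the tables, but the coded responses shown are hypothetical, and in the study itself the codes were assigned by hand through the iterative reading described above.

    # Minimal sketch of tallying multi-label category codes into counts and percentages
    # (the category assignments below are hypothetical, not the actual coded data)
    from collections import Counter

    N = 30   # number of respondents

    coded_responses = [        # each response may carry one or more codes (not mutually exclusive)
        ["COM", "LEA"],
        ["COM", "MOO", "FLE"],
        ["PRO"],
        ["OTH"],
        # ... remaining respondents
    ]

    counts = Counter(code for codes in coded_responses for code in codes)
    for code, n in counts.most_common():
        print(f"{code}: {n} ({100 * n / N:.2f}%)")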
Subject strengths
Table 1 shows the categories that emerged from an analysis of the subject strengths listed by students in
response to Question 1 of the survey.
Table 1. Summary of responses to Q1: “List what you consider to be the three strengths of the subject.” (N=30)
Cat. code Category description N %
COM Community-orientedness, collaboration and friendliness of atmosphere amongst students and between students and teachers 13 43.33
LEA Learning knowledge and skills related to CSCW, group and groupware tools/technology 11 36.67
PRO Project 9 30.00
DIV Diversity of student cohort and POD workgroups 8 26.67
MOO Online practical sessions (held in MOO) 8 26.67
GRO Opportunity to work in groups and develop teamwork/collaboration skills 7 23.33
NOV Novel/unique experience 6 20.00
FLE Flexible, online nature of subject (ability to work from home, self-paced) 6 20.00
LEC Helpfulness and enthusiasm of lecturers 6 20.00
DIS Networking/interacting with other students from the same discipline 4 13.33
OLR Online Learning Record (OLR) 4 13.33
POD POD activities 4 13.33
REL Relevance and ability to apply learning to work situation 3 10.00
TOO Effectiveness of groupware suite/tools 3 10.00
IND Self-directed learning / learning at an individual level 3 10.00
CHA Challenging (eg. requiring self-motivation) 2 6.67
DED Inclusivity for distance education students 2 6.67
EXA No exam 2 6.67
NON None listed 0 0.00
The “COM” category had the largest number of responses associated with it (13 out of 30 students),
indicating that these students particularly enjoyed the collaborative, community-oriented nature of the subject, and
the high levels of interaction with their instructors and classmates:
“bring students together via a different medium”
“Friendly atmosphere between the students and lecturers.”
“Gobal discussions and view exchange”
It was also apparent that the subject content was well-received by the students, who highly valued learning
about the theory and practice of CSCW and groups, while being exposed to some of the many groupware options
available and being given the opportunity to learn how to use some of these tools. The project was the specific
learning activity that received most mention, with many students appreciating the ability to contextualise their
learning and apply it to their current and/or future vocations:
“The ability to base the project work on real work activities - makes it more meaningful and relevant…”
“Doing a project that was directly linked to something I was already involved in.”
Another one of the issues that spoke the loudest in the survey responses was the fact that students highly
valued the experience of interacting with others in the diverse cohort and workgroups:
“Not letting students chose their pod groups this was a great chance to meet students in the same situation as yourself. Especially
students from abroad.”
“Developing online communities with students from diverse backgrounds.”
“Networking with studints from different cultures and backgrounds”
At the same time, they benefited from interacting with those with similar interests, or from like disciplines.
A number of respondents commented on the novel learning experiences facilitated by the subject, in
particular the confluence of the human and technological facets of CSCW and online communities:
“introduction of unique learning opportunities/techniques”
“A bit of mystery as to where the subject was heading and the air of experienmentation”
This included the chance for them to work in teams, and to develop their “soft” skills to this end.
While students particularly enjoyed the collaborative, community-oriented nature of the subject, its
flexible, online features were also applauded:
“…the subject can be wholly completed online”
“Learn at your own pace”
“Flexibility with some deadlines/ongoing tasks(CSCW) so can reallocate time where needed.”
Another strength of the subject from the point of view of students was the helpfulness and enthusiasm of
the instructors, which helped create a supportive, community-oriented learning environment. This was further
underscored by the issue of inclusivity for distance education students.
There were also positive comments about the groupware tools used, including the level of innovation and
the variety of technologies explored. MOO, especially, was perceived by many as a strength, in terms of its ability to
provide an effective yet enjoyable means of facilitating synchronous collaboration and learning.
Hung & Nichani (2001) propose a constructivist framework that suggests e-learning environments should
be situated in both the social community of practice and in the individual minds of learners. For example, one
student listed learning how to collaborate using groupware tools and interacting with others from diverse
backgrounds as major strengths of the subject, but also pointed out that he benefited from the personal reflection
afforded by the OLR:
“You learn to create an online learning record, which in turn is learning at an individual level.”
Subject weaknesses
The categories of subject weaknesses identified are presented in Table 2.
Table 2. Summary of responses to Q2: “List what you consider to be the three weaknesses of the subject.” (N=30)
Cat. code Category description N %
TEC Technical issues/difficulties 11 36.67
POD Difficulty in coordinating POD groups and managing group dynamics/conflict 10 33.33
ACT Learning activities (eg. number of practical activities and case studies, volume and content of readings) 8 26.67
CLA Clarity of assessment requirements / activity instructions 6 20.00
INT Internal lectures (appropriateness, schedule, attendance, content, etc.) 5* 16.67
FAC Lack of face-to-face contact 4 13.33
OTH Other 4 13.33
TIM Timing/scheduling issues related to online activities (eg. differences in time zones) 4 13.33
DIV Diversity of student cohort and POD workgroups 3 10.00
MOO MOO session organisation (chaotic, too much gossip, participants straying off topic) 3 10.00
ORG Organisation and structure of subject content 3 10.00
WOR Workload / time commitment required 3 10.00
FOR Lack of formative assessment and feedback/advice on project work 2 6.67
NON None listed 1 3.33
* 16 of the 30 respondents were enrolled in the subject in internal (on-campus) mode
The most commonly identified theme in the responses to this question pertained to technical
issues/difficulties, such as login problems and issues which arose from the high level of dependence of this subject
on the reliability of server and network infrastructure. The user-friendliness of one or more of the groupware tools
was criticised in some instances.
The technical problems were closely followed by the difficulty in coordinating and communicating with
POD group members. In many cases this seemed to be directly related to scheduling problems, possibly due to
differences in time zones. A number of groups were faced with members who failed to make adequate contributions:
“…after the first week we lost two of our POD group members, so their was [there were] only two of us that completed the tasks by
task 4, I think it was only me left in the group.”
“POD members who do not bother to reply or participate are a big problem.”
Although diversity was valued by some as a subject strength, others saw the mixture of students from
different disciplines within a single cohort in general, and within their POD groups in particular, as a disadvantage:
“Working with other students from different content/skills(backgrounds) less motivated and less helpful.”
“…the cohort was very diverse and were starting from very different knowledge bases and interest - this had some advantages but I
think more disadvantages”
It could be argued that many of these issues mirror the demands of computer-mediated communications
and collaborative groupwork in the real world, which was one of the original intentions of the subject. In fact, it was
hoped that students would document and reflect on these issues in their OLRs, bearing in mind they would not be
directly assessed on the effectiveness or activity level of their POD groups themselves. This having been said, more
support could be provided to students in the way of strategies for effective scheduling and organisation of online
meetings. There may also be a need to provide more motivation and encouragement to what appears to be the
minority of students who failed to actively participate in the POD groups. Like O’Reilly and Newton (2002), the
authors believe that imposing requirements through assessment is not the only way to have students perceive
importance in online interaction and discussion.
A significant number of responses highlighted the fact that students sometimes found themselves unsure of
what exactly was required of them in certain activities and assessment tasks, and in general. This is a reminder of the
importance of clear, detailed and unambiguous instructions and guidelines, especially in an online/flexible delivery
subject. For on-campus students this can be alleviated to some extent by providing additional classroom-based
support, although the ideal level of face-to-face contact for students studying the CSCW subject is unclear. Some
students suggested that there was a lack of face-to-face support:
“…I realise this is an online subject but often not all problems can be answered online.”
On the other hand, others felt there was little point in holding face-to-face lectures:
“Internal lectures seemed silly for a subject where practicals and content were delivered online.”
A number of students listed the workload and time commitment required, in particular the large amount of
reading required, as a subject weakness. However, it should be realised that the nature of the subject is such that in
order to be successful, students must work consistently throughout the semester. To use a computing analogy,
students need to operate in “interactive mode”: attempting to complete the required tasks just before the
assignment due dates, in “batch mode”, is simply not feasible! One student admitted:
“...the weaknesses I found in the subject were more related to my lack of discipline that problems in the actual subject.”
Table 3. Summary of responses to Q3: “List what aspects of the subject you consider to be most difficult.” (N=30)
Cat. code Category description N %
POD Coordinating POD groups 11 36.67
SCH Adhering to the subject schedule 11 36.67
TEC Resolving technical issues/difficulties, including learning/using one or more groupware tools 8 26.67
MOO Participating in and adjusting to MOO sessions (chaotic, too much gossip, participants straying off topic) 5 16.67
OLR Maintaining the OLR and completing the [OLR] exercises 5 16.67
REA Completing the prescribed readings (due to the number, length, academic language and/or format of the readings) 5 16.67
CLA Understanding the assessment requirements / activity instructions (due to lack of clarity, vagueness and/or missing information) 4 13.33
PRO Completing the project 4 13.33
OTH Other 3 10.00
DIV Working with the diversity of the student cohort and POD workgroups 2 6.67
NON None listed 1 3.33
Once again, the resounding issue in terms of the aspects of the subject students found most difficult had to
do with the organisation of POD groups. Students experienced difficulties including initiating and maintaining constant
communication with members, scheduling meetings, encouraging participation, eliciting contributions and reaching
a consensus on topics of discussion. One student attributed his/her difficulties to:
“Having to work with people that had completely different goals and responsibilities”
Another student lamented:
“… Whilst everyone completed their work, we were often a member down when it came to discussing responces.”
One student reported that his group managed to overcome the difficulty of ensuring regular contact by
exercising good communication skills:
“…I also found it a bit difficult to catch up with my group members regularly due to the fact that the group ha internal and external
students. However, good communication skills that’s shown by every member of our group, solve that problem.”
Concerns in relation to the size of the workload were also reiterated in this section, with many students
finding it difficult to work constantly to stay up to date with the schedule amidst other personal, work and study
commitments:
“I found that checking the forum and my group page on a regular basis was the most difficult thing to do in this subject”
“The aspects… that i found most difficult were trying to find the time to complete every task on a weekly basis.. All i needed was a
big assignment and i fell behind having to catch up all the time”
“The most difficult thing, was staying in constant communication with my POD group, while trying to study for other subjects and
work.”
As mentioned earlier, discipline is required on the part of students to be consistent in completing the
weekly activities. Moreover, students found it challenging to multitask or simultaneously manage the various strands
of activities in the subject. Amongst the difficulties listed were:
“Juggling the streams of work - POD, OLR, Project whilst learning about MOO and ZOPE.”
“Unable to concentrate on a couple of items moving between POD activities, CSCW tasks, MOOs and project. Trying to familiarise
oneself with learning new computer skills and also compete tasks that require reading…”
Although the opportunity for real-time interaction in the MOO was previously identified as one of the
subject’s strengths, one student described her experience “mooing with over 20 students” as “chaotic learning”. This
had a lot to do with the overwhelming attendance in the evening session, particularly in the later weeks of the
semester, during which a large proportion of the on-campus cohort began attending from home or the University’s on-
campus residences instead of, or in addition to, the daytime sessions.
Suggested improvements
Table 4 summarises the responses to Question 4, “List what improvements could be made to the subject.”
Table 4. Summary of responses to Q4: “List what improvements could be made to the subject.” (N=30)
Cat. code Category description N %
OTH Other 7 23.33
MOO MOO sessions – Make changes to the number of scheduled MOO sessions, change the topics covered in MOO sessions, better organisation and more order/control in MOO sessions 6 20.00
ORG Improve organisation and structure of subject content and resources 6 20.00
POD Make changes to POD group setup and administration (group size, group composition, closer monitoring/intervention by instructors) 5 16.67
TOO Changes to the groupware framework/tools 5 16.67
NON None listed 4 13.33
OLR Make changes to and/or update the content and/or focus of the [OLR] exercises 3 10.00
TEC Cater better for technical knowledge/skills gaps 3 10.00
WOR Reduce the workload size of the subject 3 10.00
ASS Provide more assistance and feedback with assessment work 2 6.67
CLA Provide clearer instructions/guidelines and criteria for activities and/or assessments 2 6.67
PRA Increase the number of hands-on practicals 2 6.67
REA Make changes to the prescribed readings (number, length and content) 2 6.67
As can be seen from Table 4, a large number of responses to this question were unable to be classified into
any of the identified categories and were therefore placed in the category labelled “OTH” (Other). However, a
noteworthy number of students made suggestions related to the scheduled MOO sessions. Many students highly
valued this component of the subject, but expressed the need for more order in these sessions.
In this and preceding questions, there were complaints about the time and effort required to rationalise the
subject content and assessment requirements and organise them into a more manageable construction. This added an
unnecessary overhead, particularly at the beginning of the semester. Many expressed a need to improve the
organisation and structure of the content and resources, and to take steps to ensure the consistency, completeness and
accuracy of information. A degree of frustration was evident in some students’ responses:
“...Pertinent pieces of information were left off so that you spent hours doing trial and error to achieve what could have been done in
the first half hour if the instructions were correct…Old information on webspace that was incongruent with what we had to work with
in a practical session.”
“…I think I didn’t have sufficient time at beginning of course to extensively read before realising POD groups were going to demand
considerable time allocation.”
A number of students mentioned specific ways in which some of these concerns could be addressed to
improve the subject. Amongst these were recreating the (Zope) webspace so that it is in line with professional
learning areas, and developing a more informative and comprehensive subject outline to provide a learning
“roadmap” and an overview of the various resources.
The difficulty in organising POD groups arose again, with students calling for closer monitoring of POD
groups and lecturer intervention to facilitate the initial group setup. Some students also stated they would like to see
more technical assistance provided, particularly for the benefit of those from a non-Information Technology
background. For example, additional tuition or simpler, step-by-step instructions could have been provided for the
more complex tasks, such as Zope management and MOO building/ programming. One student said he/she would
like to see the use of less technical language in the documentation.
Reductions to workload and volume of prescribed readings were amongst the improvements suggested:
“OLR topic work needs to be reduced whilst the project is on – its a big work load…I am still catching up.”
“Need to rationalise course by deciding which computer skills/tools…to develop and what is to be learned.”
“…it took quite some effort and time to get through all the readings, and it got a little repetative towards the end of the subject”
Further comments
It made little sense to quantitatively analyse the responses to the final question in the survey, “Further
comments to add?” due to the extremely broad scope of this question. Many responses received here suggested a
sense of accomplishment and fulfilment by students in having completed the subject and achieving the intended
learning outcomes:
“...it was satisfying to complete the major project and my olr. Pod and olr activities provided sound challenges”
“Overall a nicely structured subject, with good teaching strategies, By studying this subject i clearly understood the principles of
CSCW, and how it can be applied in real time situations.”
“…I enjoyed completing each OLR and POD tasks. In the beginning it took some time for contacting each group members for
completion of tasks, but at the end we all understood each other very well and contributed our efforts. Thus this subject indeed teaches
us how to work in a group and also introduces us with new ways of communication…”
The unique learning opportunities and techniques of the subject received strong compliments again:
“I did really enjoy this subject and learnt alot. It intorduce me to a whole new learning experience through online collaboration.”
“…you don't even feel like you are completing a subject…”
“I took on this subject mainly out of interest – it sounded fascinating and it truly has been. Not only is it a new way of communicating
and working, but the subject is presented like no other…I have thoroughly enjoyed my time here.”
Specifically, the more technically oriented students benefited from the socio-cultural emphasis, and the
opportunity to hone their interpersonal and other non-technical skills. One student found the subject:
“…really enjoyable and completely left field from anything else I have done.”
Last but not least, the role of socialisation and friendship building in the success of the subject was given
mention in a number of instances:
“This is one subject that really allows students to come out of the class rooms and complete the subject with other fellow students in a
more friendly way.”
“I have learnt a lot from this subject and also made a lot of new friends which is very important. Collaboration and communication is
what this subject, is all about, after all!”
Further work
The students of CSCW play an important role in knowledge generation for the rest of the class as well
as for future cohorts. They therefore have a direct influence on the evolution of the subject and its content and
are encouraged to play an active role to this end. For example, the artefacts published by them on Zope remain
available to students who will study the subject in the future; the objects they have created in the MOO persist after
they have completed the subject.
The authors plan to further refine the groupware framework by experimenting with and evaluating other
tools and technologies. For example, a number of alternatives exist to cater for the subject’s content management
(Content Management System, 2005) needs; even Zope 3 (Zope Corporation, 2005a) is somewhat different from the
version used in the subject. Plone (Plone Foundation, 2005) is a powerful, user-friendly open source Content
Management System based on Zope.
The authors are also investigating the integration of a wiki into the groupware framework to further encourage collaborative knowledge
generation and sharing, by allowing students to annotate and contribute to the web-based lecture materials and
online subject content. Collaborative writing software may be introduced to assist groups of students working on
their project reports. Finally, the authors are exploring the dissemination of text and audio content through the use of
Really Simple Syndication (RSS). Most blogging systems, as well as Yahoo! Groups, are capable of generating RSS
feeds to syndicate XML data to subscribers. RSS 2.0 with enclosures allows for the syndication of audio content, a
technology known as podcasting. It is hoped that the use of RSS and podcasting will make mobile learning (m-
Learning) possible by catering for the delivery of instructor- as well as student-generated content in the form of
small, “bite sized” learning moments viewable on handheld devices such as portable music players, mobile phones
and personal digital assistants (PDAs). For example, on-campus lessons and face-to-face discussions may be
captured in MP3 format and podcast for the benefit of all students. Students will be given the opportunity to engage
in collaborative activities using their personal mobile devices.
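As a bare-bones sketch of what such a feed might look like, the Python fragment below generates an RSS 2.0 channel with a single item carrying an audio enclosure, the element that makes a feed usable for podcasting; the titles, episode URL and file size are hypothetical examples.

    # Bare-bones sketch of an RSS 2.0 feed with an audio enclosure (podcasting)
    # Titles, episode URL and file size are hypothetical examples
    import xml.etree.ElementTree as ET

    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "CSCW weekly lessons"
    ET.SubElement(channel, "link").text = "http://ispg.csu.edu.au/subjects/cscw/"
    ET.SubElement(channel, "description").text = "Recorded lessons and discussions"

    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = "Week 5: Case studies in CSCW"
    # The enclosure element carries the audio file that podcast clients download
    ET.SubElement(item, "enclosure",
                  url="http://ispg.csu.edu.au/subjects/cscw/week5.mp3",   # hypothetical
                  length="12345678", type="audio/mpeg")

    print(ET.tostring(rss, encoding="unicode"))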
Furthermore, the authors will investigate the possible application of the online learning community
building framework proposed by Brook and Oliver (2003) in future offerings of the subject.
Conclusion
The authors believe that the CSCW groupware framework, as well as the teaching, learning and assessment
strategies, can be replicated or adapted for most computer education scenarios that will benefit from an online
community building and knowledge construction approach. They may have broader implications such as
contributing to best practice in this area.
Both the authors’ own observations and the student feedback received supply convincing evidence that the
subject and its organisation were well received by students. A detailed analysis of forum and MOO log data will be
carried out in order to determine the degree to which the instructors’ role as active participants contributed to
building group harmony and confidence. In addition, the authors plan to study the importance and nature of
mentoring relationships in the building of an online learning community. It is envisaged that this will entail
discourse analysis of e-mail, MOO, forum and POD group data.
According to Delahoussaye (2001, cited in Differding, n.d.) online education is “an isolating and lonely
experience”. However, as one distance education student aptly observed:
“Studying via DE can either be an isolating experience or a real online community connection.”
The framework and strategies employed in CSCW go a long way towards building an inclusive learning
environment that leads students – both on-campus and distance education – to collaborate and connect, and
encourages them to evolve from social animals into true community creatures.
Acknowledgements
The authors would like to acknowledge the advice and guidance provided by Dr Barney Dalgarno and Dr
Yeslam Al-Saggaf during the writing of this paper.
References
Alavi, M. (1994). Computer-mediated collaborative learning: An empirical evaluation. MIS Quarterly, 18(2), 159-
174.
Bonk, C.J. & Wisher, R.A. (2000). Applying collaborative and e-learning tools to military distance learning: A
research framework. United States Army Research Institute for the Behavioral and Social Sciences.
Retrieved March 15, 2005, from http://www.publicationshare.com/docs/Dist.Learn(Wisher).pdf
Brook, C. & Oliver, R. (2003). Online learning communities: Investigating a design framework. Australian
Journal of Educational Technology, 19(2), 139-160. Retrieved March 15, 2005, from
http://www.ascilite.org.au/ajet/ajet19/brook.html
Bruner, J. (1986). Actual minds, possible worlds. Cambridge, MA: Harvard University Press.
Bruner, J. (1990). Acts of meaning. Cambridge, MA: Harvard University Press.
Bruner, J. (1996). The culture of education. Cambridge, MA: Harvard University Press.
Central Core (2004). Retrieved March 15, 2005, from http://coreblog.org
Chen, Y.-C. (2004). Building an online learning community. In Proceedings of the Association for Educational
Communications and Technology 2004 International Convention. Bloomington, IN: Association for
Educational Communications and Technology.
Clark, R.E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53(4), 445-
459.
Coate, J. (1993). Cyberspace innkeeping: Building online community. Retrieved March 15, 2005, from
http://gopher.well.sf.ca.us:70/0/Community/innkeeping
Collis, B. (2002). The Teletop initiative: New learning, new technology. Journal of Industrial and Commercial
Training, 34 (6), 218-223.
Connery, B.A. (1988). Group work and collaborative writing. Teaching at Davis, 14(1), 2-4.
Content Management System (2005). Retrieved October 11, 2005 from
http://en.wikipedia.org/wiki/Content_management_system
Cunningham, D. J. (1996). Time after time. In W. Spinks (Ed.), Semiotics 95 (pp. 263-269). New York: Lang
Publishing.
CSCW/online communities groupspace. (2004). Retrieved March 15, 2005, from
http://ispg.csu.edu.au/subjects/cscw/
Dewey, J. (1929). The sources of a science of education. New York: Liveright.
Differding, G.A. (n.d.). Preparing students to join the online learning community. In B. Hoffman (Ed.), The
encyclopedia of educational technology. Retrieved March 15, 2005, from
http://coe.sdsu.edu/eet/Articles/stuprep/start.htm
Eustace, K. & Hay, L. (2000). A community and knowledge building model in computer education. In A.E. Ellis
(Ed.), Proceedings of the Australasian Conference on Computing Education (pp. 95-102). New York:
ACM Press.
Eustace, K., Henri, J., Meloche, J., Henri, F., Munro, R. & Weber, W. (2001). Experimentation in telelearning
environments: 3 case studies. In D. Watson & J. Anderson (Eds.), Networking the Learner: Computers in
Education: Proceedings of the Seventh IFIP World Conference on Computers in Education (pp. 963-966).
Dordrecht, Netherlands: Kluwer.
Fiechtner, S.B. & Davis, E.A. (1991). Why some groups fail: A survey of students' experiences with learning
groups. The Organizational Behavior Teaching Review, 9(4), 75-88.
Gunawardena, C.N., Lowe, C.A. & Anderson, T. (1997). Analysis of a global online debate and the development of
an interaction analysis model for examining social construction of knowledge in computer conferencing.
Journal of Educational Computing Research, 17(4), 397-431.
Holmevik, J.R. & Haynes, C. (2004). enCore Open Source MOO Project. Retrieved March 15, 2005, from
http://lingua.utdallas.edu/encore/.
Hiltz, S.R. (1995). The virtual classroom: Learning without limits via computer networks. Norwood, NJ: Ablex.
Hiltz, S.R. (1998). Collaborative learning in asynchronous learning environments: Building learning communities.
In H.A. Maurer & R.G. Olson (Eds.), Proceedings of WebNet 98 – World Conference on the WWW,
Internet and Intranet. Charlottesville, VA: Association for the Advancement of Computing in Education.
Hung, D. & Nichani, M. (2001). Constructivism and e-Learning: Balancing between the individual and social levels
of cognition. Educational Technology, 41(2), 40-44.
Kafai, Y. & Resnick, M. (1996). Constructionism in practice. Mahwah, NJ: Lawrence Erlbaum.
K9 campus and theme park. (2004). Retrieved March 15, 2005, from http://ispg.csu.edu.au:7680/
Kaye, A. (1992). Learning together apart. In A. Kaye (Ed.), Collaborative learning through computer conferencing
(pp. 1-24). Berlin: Springer-Verlag.
Lave, J. & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge, UK: Cambridge
University Press.
LC_MOO. (2004). Retrieved March 15, 2005, from http://ispg.csu.edu.au:7680/
O’Reilly, M. & Newton, D. (2002). Interaction online: Above and beyond requirements of assessment. Australian
Journal of Educational Technology, 18(1), 57-70.
Palloff, R. & Pratt, K. (1999). Building learning communities in cyberspace. San Francisco: Jossey-Bass.
Peck, M.S. (1998). The different drum. (2nd ed.). New York: Touchstone.
Piaget, J. (1926). The language and thought of the child. London: Routledge & Kegan Paul.
Plone Foundation (2005). Plone: A user-friendly and powerful open source Content Management System. Retrieved
October 11, 2005 from http://plone.org
Rovai, A. (2002). Development of an instrument to measure classroom community. The Internet and Higher
Education, 5, 197-211.
Salmon, G. (2004a). E-Moderating, the key to teaching and learning online. (2nd ed.). London: Routledge.
Salmon, G. (2004b). The 5 stage model. Retrieved March 15, 2005, from http://www.atimod.com/e-
moderating/5stage.shtml
Smith, K.A. (1985). Cooperative learning groups. In S.F. Schmoberg (Ed.), Strategies for active teaching and
learning in university classrooms (pp. 18-26). Minneapolis: University of Minnesota Press.
Stein, G. (1937). Everybody’s autobiography. New York: Random House.
Syverson, M.A. (1995). Learning Record Online. Retrieved March 15, 2005, from
http://www.cwrl.utexas.edu/~syverson/olr
Teletop B.V. (2004). Teletop – enabled learning. Retrieved March 15, 2005, from http://www.teletop.nl
Vygotsky, L.S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA:
Harvard University Press.
Veerman, A. & Veldhuis-Diermanse, E. (2001). Collaborative learning through computer-mediated communication
in academic education. In P. Dillenbourg, A. Eurelings & K. Hakkarainen (Eds.), European perspectives on
computer-supported collaborative learning: Proceedings of the First European Conference on Computer-
Supported Collaborative Learning (pp. 625–632). Maastricht, Netherlands: Maastricht University.
Waddoups, G.L. & Howell, S.L. (2002). Bringing online learning to campus: The hybridization of teaching and
learning at Brigham Young University. International Review of Research in Open and Distance Learning,
2(2). Retrieved March 15, 2005, from http://www.irrodl.org/content/v2.2/waddoups.pdf
Walvoord, B.F. (1986). Helping students write well: A guide for teachers in all disciplines. (2nd ed.) New York:
Modern Language Association.
Whitehead, J. (2005). WebDAV Resources. Retrieved October 10, 2005, from http://webdav.org
Yahoo! Inc. (2005). Yahoo! Groups. Retrieved March 15, 2005, from http://groups.yahoo.com
Zope Corporation. (2005a). Zope 3. Retrieved October 10, 2005, from http://www.zope.org/Products/Zope3/
Zope Corporation. (2005b). Zope.org. Retrieved March 15, 2005, from http://www.zope.org
Appendix B: List of “sponsored” project topics
1. Multimedia interface upgrade for LC_MOO and K9MOO
Re-development of graphics and other multimedia elements which form the interface for LC_MOO
(http://ispg.csu.edu.au:8800) as well as K9MOO (http://ispg.csu.edu.au:9000), including MUD maps for both
environments. Sponsors are Mark Lee and Ken Eustace.
2. Q&A project
Building a question and answer Web site for first year IT undergraduates, using a collection of newspaper articles in
XML format. Sponsors are Geoff Fellows and Ken Eustace.
3. Archiving policy
An investigation into the policy of archiving data and back-up procedures over time in an organisation, e.g. a school,
business or government department. The sponsor is Prof. Ross Harvey.
4. Wiki as a collaborative learning tool
Wiki is a relatively new technology, used to facilitate collaborative web authoring. The most well-known Wiki
implementation is Wikipedia (http://en.wikipedia.org). This project will involve an exploration of the use of Wiki as
a collaborative learning tool in higher education. This involves some technical implementation as well as research.
Sponsor is Mark Lee.
5. 3D MOO development
Design and development of a 3D MOO using ActiveWorlds (http://www.activeworlds.com) to support collaborative
work in a particular field such as business or education. Sponsor is Mark Lee.
6. Open source groupware tools
An investigation of one or more open source groupware tools and/or the development of a framework using these
tools, to support a particular type of workgroup or community. Sponsor is Mark Lee.
Appendix C: Assessment criteria / marking sheet for CSCW project report
Figure 1. Stages/phases in online community building/growth and knowledge construction. Adapted from Salmon
(2004b) and Gunawardena, Lowe & Anderson (1997)
Figure 3. The top-level folder (home page) of the CSCW Zope site
Figure 4. The Bulga Ferngully room in LC_MOO, where the CSCW meetings and workshops were held
Figure 5. Yahoo! Groups
Figure 7. A typical [OLR] entry that appeared within the CSCW topic schedule on Zope
Figure 8. A log recorder object (behind) and a player object (in front). MOO objects can be viewed directly in a browser
by specifying the relevant object number in the URL
Impact of Technology-Mediated Peer Assessment on Student Project Quality
Lan Li
Allen L. Steckelberg
University of Nebraska at Lincoln
Abstract
This study was designed to investigate the impact of an anonymous technology-mediated peer assessment
on student project quality in higher education. In this study, particular attention was paid to the quality of student
feedback and opportunities for students to improve their work. Forty-seven students from two undergraduate classes
from a central US university participated in this study. Students were randomly divided into two groups – a control
group and an experimental group. Before instructor assessment, peer assessment was conducted and peer feedback
was provided for students in the experimental group to improve their projects. The control group received no peer
feedback. An independent grader’s scoring of the two groups was analyzed to investigate the effect of technology-
mediated peer assessment on student project quality. Results indicated that there was a significant difference in
project quality between these two groups. A post assessment survey indicated that students generally held positive
perceptions of this peer assessment process. These findings support previous studies suggesting that well-implemented
peer assessment can promote meaningful student learning.
Introduction
Promoting student autonomy and shifting student roles from traditional passive observing to active learning
have become an important focus in higher education. Peer assessment, viewed by some researchers as “the learning
exercise in which the assessment skills are practiced” (Sluijsmans, Brand-Gruwel, & van Merrienboer, 2002), is a
process in which peers assess the performance or achievement of others of similar status (Topping, Smith,
Swanson, & Elliot, 2000). Peer assessment’s value in stimulating student motivation, promoting students’ critical
assessment skills and enhancing meaningful learning has been established (e.g., Pope, 2001; Freeman, 1995;
Topping, 1998).
Most current peer assessment methods are conducted through paper-based systems. Two issues associated
with such systems – anonymity and administrative workload – may hinder the widespread acceptance of this process.
A number of studies suggested that peer pressure is one of the causes of students’ negative feelings towards
peer assessment. Hanrahan and Isaacs (2001) reported student discomfort arising from peer pressure (associated with
having peers rate one’s own paper and with critiquing others’ work). Falchikov (1986) found the “possibility of marking
down/failing a peer” to be one of students’ least liked features of peer assessment. Chen and Warren (1997) noticed that
students “felt compelled to award a higher score to those with whom they were more friendly”. In a paper-based system,
potential bias caused by peer pressure (such as friendship) can influence students’ judgment and lead them to rate
good performance down and poor performance up. Therefore some researchers suggested providing anonymity to
reduce the impact of peer pressure on peer assessment (e.g., Davies, 2002). When peer assessment is conducted in a
confidential environment and assessors and assessees are not aware of each other, peer pressure should be substantially reduced.
Excessive administrative workload is another concern of some researchers (e.g., Davies, 2002). Hanrahan
and Isaacs (2001), in their study, reported more than 40 person-hours of documentation work to maintain an
anonymous paper-based peer assessment distribution system in classes with 244 students.
To overcome these two problems, Li and Steckelberg (2005) designed and implemented a database-
facilitated peer assessment support system. In this system, anonymity was provided and assessors and assessees
were not aware of each other’s identity. Projects were typed and submitted via the Internet; therefore the possibility
of revealing student identities and characteristics (such as gender) through handwriting styles was eliminated.
Students were instructed to remove any personal information from their projects. Projects were coded numerically
for assessment purposes. The database-driven website made it possible for student projects and peer assessments to
be submitted from students’ computers to a central database and, at the same time, to become immediately available
for students and instructors to view. Management workload was substantially reduced.
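As an illustration of the kind of anonymous, numerically coded assignment such a system performs, the sketch below pairs each student with two randomly chosen peers' project codes. This is a simplified sketch, not the actual PASS implementation; the identifiers and the (unbalanced) sampling strategy are invented assumptions.

# A simplified sketch (not the actual PASS implementation) of assigning each
# assessor two randomly chosen, numerically coded peer projects so that
# assessors and assessees never learn each other's identity.
import random

def assign_reviews(student_ids, reviews_per_student=2, seed=42):
    """Return project codes and, for each student, the peer codes to assess."""
    rng = random.Random(seed)
    # Numeric project codes stand in for names, so identity is never exposed.
    codes = {sid: code for code, sid in enumerate(sorted(student_ids), start=101)}
    assignments = {}
    for sid in student_ids:
        peer_codes = [codes[p] for p in student_ids if p != sid]
        # Note: simple random sampling does not balance how often each project
        # is reviewed; a production system would also need to enforce that.
        assignments[sid] = rng.sample(peer_codes, reviews_per_student)
    return codes, assignments

codes, assignments = assign_reviews(["s01", "s02", "s03", "s04"])
print(assignments)   # e.g. {'s01': [103, 104], 's02': [101, 104], ...}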
This system was previously utilized by the authors (Li & Steckelberg, 2004) to investigate the impact of
anonymous peer assessment on student meaningful learning and student perceptions. Prior results presented an
interesting picture. Data analysis indicated that there was no significant difference in project quality between the two
groups. However, students generally held positive perceptions of this peer assessment method. They acknowledged and
recognized the merits of peer assessment in promoting meaningful learning and fostering critical thinking. After
scrutinizing each step of the peer assessment process and considering student perceptions, Li and Steckelberg suggested
that these seemingly contradictory findings might be explained in part by the limited time for project improvement after
receiving peers’ feedback and by the poor quality of student feedback. Some students in the post-assessment survey
indicated that they would like more time to improve their projects after viewing peers’ ratings and comments. Others
expressed that they had expected more constructive peer comments. In this study, the peer assessment model was
modified to address these two issues. First, students in the experimental group were given more time to revise their
projects after peer reviewing. Second, students in the experimental group were informed that their assessments (of
peers’ performance) would be assessed by the instructor and would form part of their project grade, providing an
incentive for students to put more effort into assessing peers’ work and into providing higher quality
feedback/suggestions.
Based on the findings of previous studies, our hypotheses are:
1. Anonymous peer assessment promotes students’ deeper understanding of the subject matter and marking criteria.
There is a significant difference in project quality between the control and experimental groups.
2. Students acknowledge and recognize the value of this peer assessment method.
Facilitating website
In this study, the technology-mediated website (Li & Steckelberg, 2005) was utilized to provide anonymity
and to facilitate instructor assessment of the quality of peer feedback. The site contained two interfaces – a student
interface and an instructor interface. In the student interface, students performed two roles: assessors and assessees. As
assessors, students logged in and assessed (rated and commented upon) two randomly assigned peers’ projects. As
assessees, students had immediate access to peers’ ratings and feedback on their own projects for further
improvement as soon as the data were submitted. Shifting between the two roles helped students gain a better understanding
of project elements and marking criteria. This process was conducted in an anonymous environment. Assessors and
assessees did not need to face each other, and all data exchanges took place through the website. The instructor interface
was designed to allow instructors to track the whole peer assessment process as well as to manage and maintain
student accounts. Instructors had access to students’ ratings and comments on peers’ work and used this
access to grade the quality of feedback provided by students in the experimental group.
Methodology
Subjects
Forty-seven undergraduate students from a central US university participated in this study. All the
participants were from a technology application course at the College of Education and Human Sciences.
Procedures
As a class assignment, students were asked to build a WebQuest project and upload it to the Internet. A
WebQuest is an instructional activity that utilizes web resources and is designed to involve learners in a process
of analysis, synthesis and evaluation. Peer assessment was utilized in this study to help students gain a deeper grasp of
the critical features of WebQuests and a better understanding of the marking criteria.
Students were randomly divided into two groups – a control group (twenty-three students) and an
experimental group (twenty-four students). In the control group, after thoroughly studying the project elements and
marking criteria, students were instructed to develop a WebQuest project by themselves without any external
intervention. In the experimental group, peer assessment intervention was conducted (Figure 1).
Figure 1. Peer assessment steps in control and experimental groups.
Post-assessment survey
After students submitted their projects, students in the experimental group were asked to fill in a post-
assessment survey concerning their general perceptions of this technology-mediated peer assessment method. This
survey was adapted from a previous study (Lin, Liu, & Yuan, 2002) and included 15 five-point Likert scale items,
with 1 representing “Strongly disagree” and 5 representing “Strongly agree”, as well as two open-ended questions
concerning the features students liked best and least.
Scoring procedure
One trained independent rater graded all student projects based on the same rubric (Dodge, 2001). The
rubric included 13 items with a total of 50 points. The whole scoring process was conducted anonymously.
Projects from the control group and experimental group were coded and mixed together for the independent rater to
assess. The independent rater did not know which project was from which group and had no way to connect projects
with individual students. To make sure that the scoring process was consistent, reliable and free of bias, the inter-rater
reliability of the course instructor’s scoring and the independent rater’s scoring was calculated. The Pearson correlation
was .829 (significant at the .01 level).
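For illustration, an inter-rater reliability check of this kind can be computed with a standard statistics library as sketched below; the paired scores are invented and are not the study's data.

# Illustrative computation of inter-rater reliability as a Pearson correlation
# between two raters' rubric scores (0-50). The scores below are invented.
from scipy.stats import pearsonr

instructor_scores  = [45, 38, 42, 30, 25, 48, 36, 40, 28, 33]
independent_scores = [43, 40, 44, 28, 27, 47, 34, 41, 30, 31]

r, p = pearsonr(instructor_scores, independent_scores)
print(f"Pearson r = {r:.3f}, p = {p:.4f}")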
Results
Two types of data were collected in this study. First, student project scores from the independent rater were
collected to compare the difference in project quality between the two groups. Second, post-assessment survey data
were gathered to depict student attitudes toward this anonymous technology-mediated peer assessment method.
Table 1. Mean and standard deviation of the final project scores for the experimental and control groups
Group N Mean SD
Experimental 24 37.67 9.70
Control 23 21.57 8.32
As Table 1 indicates, the difference between the two means (37.67 vs. 21.57) is statistically significant, F(1, 45) =
37.083, p < .01.
An error bar graph (Figure 2) shows that the scores in the experimental group were much higher than those of the
control group.
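A comparison of this kind can be reproduced in outline with a one-way ANOVA, as sketched below; with only two groups the F test is equivalent to an independent-samples t test. The score lists are invented for illustration and are not the study's raw data.

# Illustrative one-way ANOVA comparing project scores (out of 50) for an
# experimental and a control group. The scores below are invented.
from scipy.stats import f_oneway

experimental = [48, 42, 35, 40, 29, 44, 38, 45, 30, 41]
control      = [25, 18, 30, 22, 15, 28, 20, 24, 17, 21]

f_stat, p_value = f_oneway(experimental, control)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")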
Table 2. Minimum, maximum, standard deviation, mean and 95% confidence interval of student perceptions in the post-assessment survey

Item                                                                              Min  Max    SD  Mean  95% CI
1. I am content with my own work.                                                   2    5   .75  4.07  (3.73, 4.41)
2. I learn more from peer assessment than from traditional teacher assessment.      2    4   .77  3.10  (2.75, 3.45)
3. The procedures on how to do peer assessment are clearly outlined.                3    5   .59  3.95  (3.68, 4.22)
4. Peer assessment is a worthwhile activity.                                        2    5  1.02  3.67  (3.20, 4.13)
5. Peers have adequate knowledge to evaluate my work.                               2    5   .91  3.33  (2.92, 3.75)
6. I benefited from peers’ comments.                                                2    5   .86  3.95  (3.56, 4.37)
7. The peers’ comments on my work were fair.                                        2    5   .89  4.00  (3.59, 4.41)
8. Peers can assess fairly.                                                         3    5   .71  3.88  (3.56, 4.20)
9. I have benefited from marking peers’ work.                                       2    5   .85  3.86  (3.47, 4.25)
10. I took a serious attitude towards marking peers’ work.                          2    5   .75  4.48  (4.14, 4.82)
11. I felt that I was critical of others when marking peers’ work.                  3    5   .71  4.00  (3.68, 4.32)
12. I had enough time to assess peers’ work.                                        3    5   .72  4.29  (3.96, 4.61)
13. I had enough time to improve my work after I got feedback from peers.           3    5   .75  4.19  (3.85, 4.53)
14. I have the knowledge to assess peers’ work accurately.                          1    5   .96  3.86  (3.42, 4.30)
15. My project improved because of the peer review.                                 2    5   .86  4.05  (3.65, 4.44)
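The confidence intervals in Table 2 are of the form sketched below, a t-based interval around the item mean; the Likert responses used here are invented, not the survey data.

# Illustrative computation of the mean and 95% confidence interval for one
# five-point Likert item, as reported in Table 2. The responses are invented.
import math
from statistics import mean, stdev
from scipy import stats

responses = [4, 5, 4, 3, 4, 5, 4, 4, 3, 5, 4, 4, 5, 4, 4, 3, 4, 5, 4, 4, 4]
n = len(responses)
m, sd = mean(responses), stdev(responses)
se = sd / math.sqrt(n)
t_crit = stats.t.ppf(0.975, df=n - 1)   # two-tailed 95% critical value
print(f"Mean = {m:.2f}, SD = {sd:.2f}, "
      f"95% CI = ({m - t_crit * se:.2f}, {m + t_crit * se:.2f})")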
Data analysis of the first open-ended question, “Please specify what you like most in this peer assessment
procedure” revealed two major themes: peers’ feedback for improvement and the opportunity to view peers’
projects. Another two themes emerged for the second open-ended question, “How would you change this peer
assessment procedure? And why?” (Table 3)
Table 3. Themes and supporting raw data from the post-assessment survey

“Please specify what you like most in this peer assessment procedure”
Peers’ feedback for improvement
   “… things I forgot.”
   “I liked being able to get feedback and then be able to change it.”
   “Gives us a chance to see what we did wrong and improve it.”
   …
Opportunity to view peers’ projects
   “you got to see other projects.”
   “that I could see other peer’s projects and get ideas form them.”
   “I think it is nice that you can see others’ work, so you have an idea where your abilities lie.”
   …
“How would you change this peer assessment procedure? And why?”
Conclusion
This study investigated the impact of an anonymous technology-mediated peer assessment support system
on student project quality and explored student attitudes toward this method. Data analysis revealed a statistically
significant difference in project quality between the control and experimental groups. A post assessment survey also
indicated that students generally held positive perceptions of this peer assessment process.
In the implementation process, we felt that a few critical features of peer assessment had an influence on its success.
Time to improve projects. Students should be given enough time to improve their projects after viewing peers’
ratings and comments. Both feedback and the opportunity to act on that feedback are important to improving
student work. This issue was specifically addressed in this study. In the post-assessment survey, students
indicated that they had enough time to assess peers’ work as well as to improve their own projects after viewing
peer feedback.
Incentive to provide quality feedback. Student incentive is essential for the success of peer assessment. When
students put effort into reviewing peers’ projects, they provide more constructive feedback, which is important
in allowing students to benefit from the process. In this study, the quality of peer assessment was assessed by the
instructor to encourage higher quality peer feedback. In the post-assessment survey, most students indicated that
they took a serious attitude towards marking peers’ work.
Technology-mediated support. The technology-mediated peer assessment support system played an important
role in this study. All data were transmitted automatically back and forth between a database and students’
computers. Information exchange was prompt and easy, which saved substantial administrative
workload and stimulated student interaction. The system also made it easier for the instructor to assess the quality of
peer feedback: once the instructor logged in to the instructor interface, she had access to the detailed assessments made
by each student, which made the process extremely easy and prompt.
Anonymous peer assessment. Anonymity, another advantage of this system, was acknowledged in student
survey responses. Most students felt that they were critical of others when marking peers’ work (survey item 11;
mean = 4.00; SD = .71). Students appreciated the “freedom to give comments that came from anonymity”,
which should minimize the potential impact of peer pressure and thus maintain the reliability and validity
of peer assessment.
The findings from this study support previous findings that a well-implemented peer assessment can help students
gain a deeper understanding of the subject matter and promote student learning.
References
Chen, W., & Warren, M. (1997). Having second thoughts: Student perceptions before and after a peer assessment
exercise. Studies in Higher Education, 22(2).
Davies, P. (2002). Using Student Reflective Self-Assessment for Awarding Degree Classifications. Innovations in
Education and Teaching International, 39(4), 307-319.
Dodge, B. (2001). WebQuest rubric. Retrieved October 5, 2005, from
http://webquest.sdsu.edu/webquestrubric.html
Falchikov, N. (1986). Product comparisons and process benefits of collaborative peer and self-assessments.
Assessment and Evaluation in Higher Education, 11, 146–166
Freeman, M. (1995). Peer Assessment by Groups of Group Work. Assessment & Evaluation in Higher Education,
20(3), 289-300.
Hanrahan, S. J., & Isaacs, G. (2001). Assessing Self- and Peer-Assessment: The Students' Views. Higher Education
Research & Development, 20(1), 53-70.
Li, L., & Steckelberg, A. L. (2004). Using Peer Feedback to Enhance Student Meaningful learning. Proceedings of
the 2004 annual AECT conference. Chicago, IL.
Li, L., & Steckelberg, A. L. (2005). Peer assessment support system (PASS). TechTrends, 49(4), 80-84.
Pope, N. (2001). An Examination of the Use of Peer Rating for Formative Assessment in the Context of the Theory
of Consumption Values. Assessment & Evaluation in Higher Education, 26(3), 235-246.
Sluijsmans, D. M. A., Brand-Gruwel, S., & van Merrienboer, J. J. G. (2002). Peer Assessment Training in Teacher
Education: Effects on Performance and Perceptions. Assessment & Evaluation in Higher Education, 27(5),
443-454.
Topping, K. (1998). Peer Assessment between Students in Colleges and Universities. Review of Educational
Research, 68(3), 249-276.
Topping, K. J., Smith, E. F., Swanson, I., & Elliot, A. (2000). Formative Peer Assessment of Academic Writing
between Postgraduate Students. Assessment & Evaluation in Higher Education, 25(2), 149-169.
Factors that Influence Learners’ Performance
in a Goal-based Scenario e-Learning Environment
Kyu Yon Lim
Hyeon Woo Lee
Pennsylvania State University
Il-hyun Jo
Chuncheon National University of Education
Abstract
The Goal-based Scenario (GBS) model, which is renowned for its rigor in developing high-quality, highly-
transferable courseware, has been attracting instructional designers' keen attention for several years. This study
tries to share the process of developing e-learning courseware which employs GBS, and investigates the
effectiveness of the GBS model in corporate settings in terms of learners’ satisfaction and performance. The
researchers pursue theoretical as well as practical implications for instructional design practitioners by conducting
quantitative and post-hoc qualitative research.
Introduction
Korea's corporate training community has a need for effective, performance-oriented e-learning courseware
to help companies cope with and survive in today’s hyper-competitive market environment. In order to provide the
learning tasks and resources that reflect the context of the “real” world, the need for constructivist instructional
models has recently come to the fore (Schank, 2002). The Goal-based Scenario (GBS) model is renowned for its
ability to embrace both constructivism and objectivism, and has been attracting the attention of instructional
designers who are interested in developing high-quality, highly-transferable courseware (Jo, 2002).
The purpose of this study is to share the process of developing e-learning courseware which
employs a GBS model, and to empirically investigate critical factors in learners’ performance in GBS e-learning
environments. Two separate studies were conducted. Initially, the researchers analyzed learners’ characteristics and
design strategies that influence learners’ satisfaction and achievement. Secondly, to further investigate the reasons
that could have resulted in the findings discovered in the first study, the researchers conducted post-hoc qualitative
research. The research questions were as follows: 1) What is the level of learners’ satisfaction and achievement with
the GBS-based e-learning courseware? 2) What factors are related to the learners’ satisfaction and achievement in the
GBS-based e-learning environment? 3) Are there any other factors related to the learners’ achievement in the GBS-
based e-learning environment?
Design Principles
The company adopted several critical design requirements for the six courses. First, the curriculum of the CE
Academy Jr. should be developed according to the needs of job-site employees by conducting a survey and focus
group interviews. Second, courseware should be developed for an online environment to provide all employees,
including people who work at overseas sites, with equal opportunity for high-quality training programs. Third, an
instructional strategy that could enable learners to develop problem-solving skills should be adopted to help them
solve the real-world problems that happen at their construction sites every day, rather than merely understand the
concepts behind new technologies. The GBS model was selected to meet this need. Finally, a tutoring plan for effective
learning should be pre-designed. As GBS is task-oriented by nature, continuous facilitation by experienced experts is
one of the most important learning resources.
Operational Strategy
Each courseware product consisted of four to seven learning tasks that should be performed within eight
weeks. Learners could complete the course once they achieved more than 70 points (out of a total of 100 points) across
three categories: rate of progress, quizzes, and performance level of tasks. In addition, completion of “at least one course
in the CE Academy Jr.” was a requirement for promotion to managerial positions. That is, it was regarded as
compulsory courseware.
Development Process
For this challenging project, the researchers and instructional designers modified GBS to meet the specific
design requirements for a Korean business situation. The modified GBS, which we call GBS+, is an instructional
design model that is embodied by the proactive communication between subject matter experts and instructional
designers. GBS+ uses a specified taxonomy and terms which are familiar to Korean instructional design
practitioners, presents practical guidelines for each step, and most of all, describes a conceptual framework for GBS-
based e-learning courseware to diffuse this hybrid model which blends constructivist design ideas with an objectivist
systemic approach. The five development steps embodied in GBS+ are as follows.
(1) Identify Learning Goals
Learning goals are sets of skills that learners should learn. At the beginning of the analysis phase,
instructional designers, subject matter experts, and star performers had a series of workshops to share
the idea of GBS and define the learning goals. Novak’s knowledge-mapping method was used to
describe the structure of knowledge and skill sets which were essential for performing the job task in a
real situation (Jo, 2001; Novak, 1999). Reaching consensus among all parties was another challenge. The output of
this stage was a concept map which showed the hierarchical and procedural relationships among the
knowledge nodes.
(2) Design and Develop Learning Tasks
Tasks are the key to success of GBS+. As motivation of learners depends on the authenticity and
relevance of tasks (Petraglia, 1998), researchers aimed at developing plausible and meaningful tasks
that encompass the pre-described learning goals.
(3) Create Scenarios
Scenarios are closely related to the learning goals and tasks. In addition, scenarios contain learning
resources which are needed to complete the tasks (Schank, 1992). Star performers and subject matter
experts cooperated to create scenarios to motivate learners with not only realistic but also dramatic
stories based on real cases. The result was the construction of a Master Scenario and Sub Scenario,
both of which are brand-new terms created by GBS+.
(4) Develop Learning Resources
Three types of learning resources were developed to scaffold learners to perform the given tasks:
“Tutorials” for understanding the content and process knowledge, “Glossaries” for catching the
meaning of terms, and “Data” for solving the problem, designed to resemble the real documents and
information resources that are easily found on the job site. All parties worked together to embed these
resources in scenarios to maintain the contextual and cognitive liaisons among task, scenario, and
learning resources.
(5) Develop Storyboards and Media
Instructional designers developed storyboards according to micro-level design strategies such as
message design principles and the ARCS model. Media development followed this stage.
Quantitative Study
Method
157 employees of SECC participated in this study and 105 employees answered the survey instruments.
The independent variables were the level of learners’ self-regulated learning skills and perceived level of
authenticity of tasks. A revised MSLQ (Motivated Strategies for Learning Questionnaire) was used to measure the
former (Pintrich, 1986). Learners’ self-regulated learning skills were categorized into sub-components of motivation
(internal motivation, external motivation, perception on tasks) and learning strategy (organization, metacognition,
time management, effort control). For the perceived authenticity of the tasks, instruments developed by Roelofs &
Terwel were revised to fit the needs of this study, summarized into three sub-categories of reality, contextuality, and
learner control. The dependent variables were learners’ satisfaction and learning achievements. Survey instruments
on satisfaction tried to measure the satisfaction level resulting from GBS-based e-learning courseware. Two
variables were adapted to measure learning achievements: understanding level and performance level. Several
multiple regression tests were utilized for the analysis of data.
Results
Research Question 1: What is the level of learners’ satisfaction with the GBS-based e-learning courseware?
Survey instruments for learners’ satisfaction were organized into 4 categories: items on scenarios, learning tasks,
general satisfaction, and satisfaction compared with tutorial-based e-learning courseware. The result shows that
students were satisfied with the GBS-based e-learning courseware in general (m=3.83). Above all, learning tasks
were the area in which respondents had the highest satisfaction (m=3.91).
Research Question 2: What factors are related to the learners’ satisfaction and achievement in the GBS-
based e-learning environment? Multiple regressions were conducted to examine the factors that influence learners’
satisfaction (Table 2). Study results show that the level of perceived authenticity of tasks (Beta=.84, p<.001) and
learners’ self-regulated learning skills on motivation (Beta =.12, p<.05) were the independent variables which
predicted learners’ satisfaction.
Table 2. Multiple Regression Analysis for Learners’ Satisfaction
(n = 105)
Independent Variables                                        Beta      t        p
Perception of Authenticity of Tasks                          .84     16.12    .000***
Self-Regulated Learning Skills on Motivation                 .12      2.32    .022*
Self-Regulated Learning Skills on Learning Strategies        .06      1.23    .222
Adjusted R2 = .836. F value of the model = 266.27, p = .000***
*** p < .001, ** p < .01, * p < .05
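A regression of the kind summarised in Table 2 could be specified as sketched below; the data frame, variable names and values are invented assumptions, not the study's dataset, and the fitted coefficients are unstandardized rather than the standardized Beta values reported in the table.

# Illustrative multiple regression predicting satisfaction from perceived task
# authenticity and self-regulated learning scores. All data below are invented.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "satisfaction": [3.8, 4.2, 3.1, 4.5, 3.9, 2.8, 4.0, 3.5, 4.4, 3.2],
    "authenticity": [3.9, 4.3, 3.0, 4.6, 4.0, 2.9, 4.1, 3.4, 4.5, 3.1],
    "motivation":   [3.5, 4.0, 3.2, 4.4, 3.8, 3.0, 3.9, 3.6, 4.2, 3.3],
    "strategy":     [3.6, 3.9, 3.4, 4.1, 3.7, 3.1, 3.8, 3.5, 4.0, 3.2],
})

model = smf.ols("satisfaction ~ authenticity + motivation + strategy", data=df).fit()
print(model.summary())   # coefficients, t values, p values, adjusted R-squared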
Another multiple regression analysis was performed to investigate the influence of sub-factors. The results
indicate that sub-factors included in authenticity of tasks were positively related to learners’ satisfaction (Table 3).
Analysis of learners’ achievements was performed next. The multiple regression analysis for the level of
understanding indicated that there were no statistically significant factors predicting the dependent variable. There
were similar results for the level of performance; no factors predicted learners’
achievements.
To sum up, learners’ satisfaction with GBS-based e-learning courseware was closely related to the learners’
perception of the authenticity of the assigned tasks and their self-regulated learning skills. Learners’ achievements,
on the other hand, were not predicted by any of the independent variables. Therefore, a post-hoc qualitative study was
conducted to investigate the factors related to the learners’ achievements.
Qualitative Study
Method
Ten learners and six tutors participated in the qualitative research. Learners were purposively sampled by
their satisfaction and achievement scores. The qualitative study was conducted mainly through semi-structured
interviews designed to evoke new information and suggestions. Every comment and word was recorded
and subjected to content analysis.
Results
Research Question 3: Are there any other factors related to the learners’ achievement in the GBS-based e-
learning environment? Learners who reported the highest scores in both satisfaction and achievement were of the
opinion that the learning materials and given tasks were authentic enough to transfer to their job tasks. They
participated proactively in the learning process by searching for information on the Internet or asking questions of
their experienced seniors. Additionally, they had abundant pre-acquired knowledge related to job performance.
“At first, I enrolled in this course because it was a requirement for promotion. But I found it useful to
my job. Tasks and learning resources were interesting and looked like real situations.”
“Tasks were a little bit tough but I could solve them by collecting and analyzing data around my
office. I have already experienced that kind of situation before, so I know the way it goes.”
On the contrary, learners whose scores were the lowest in satisfaction and achievement had not been
motivated since the given tasks had nothing to do with the job they were performing at that time, although they
regarded the GBS model as being effective and helpful. The only factor driving them to complete the course was the
characteristic of CE Academy Jr., which was “compulsory for promotion.”
“I would have given up finishing the course, if it had not been required. I logged in just to
complete!”
“All I need was 70 points to complete the course. Why do I have to do my best?”
The results of the one-on-one tutor interviews have some implications for the factors related to learners’
achievements. The six tutors shared the opinion that, given the nature of the tasks, learners’ job experience had a
strong effect on task performance. It was also said that most learners focused on “completion.”
“The most important thing for learners was to finish the course. For example, some learners just
selected easier tasks to get 70 points. There was no need to solve all the problems in that
courseware.”
Interview results indicate that learners’ achievement is closely related to the rules and regulations within
the Human Resource Management system. In this case, completing the courseware was equivalent to obtaining the
points required for promotion.
Discussion
Learners’ satisfaction can be predicted by self-regulated learning skills on motivation, corresponding to the
previous research results (Pintrich & DeGroot, 1990; Schunk, 2000). Interviews with learners indicated that high
internal motivation resulted in high satisfaction levels.
It is the authenticity of tasks that counts in this study. Authenticity explained more than 80% of the variance in
learners’ satisfaction with the GBS-based e-learning courseware. The result of qualitative research shows consistency with
statistical analysis. All the learners who were satisfied with the courseware commented that the tasks were highly
useful in performing their current job tasks.
None of the independent variables, however, predicted learners’ achievement. Qualitative research results
indicate two reasons for that. First, courseware in CE Academy Jr. was a requirement for promotion, which drove
learners to focus on completing the course. The company was concerned with the list of learners who finished the
course, not with the performance itself. Second, learners’ job experience and pre-acquired knowledge had strong
effects on their achievement. It is mainly because GBS is inevitably based on real job tasks. This authenticity allows
pre-acquired knowledge to help learners perform their real tasks.
Based on these findings, the study suggests that, for instructional design, authentic tasks are critical to
maximizing learning outcomes. The operational implications are that courseware should be connected to the HR
systems of companies, an issue that should be considered at the stage of analysis and macro-level design.
References
Campbell, R. & Monson, D. (1994). Building A Goal-Based Scenario Learning Environment. Educational
Technology, 34(9), 9-14.
Jo, I. (2002). Case Study on Developing GBS-based Instructional Model. Korean Journal of Corporate training,
Feb.-Apr.
Jo, I. (2001). The Effects of Concept Mapping on College Students' Comprehension of Expository Text. Unpublished
doctoral dissertation, Florida State University.
Jona, K. (2000). Rethinking the Design of Online Courses. Paper presented at the ASCILITE 2000 Conference.
Novak, J.D. (1999). Learning, Creating, and Using Knowledge: Concept Maps as Facilitative Tools in Schools and
Corporations. Hillsdale, N.J.: Lawrence Erlbaum.
Petraglia, J. (1998). Reality by Design. Lawrence Erlbaum Associates.
Pintrich, P. R. (1986). Motivation and Learning Strategies Interaction with Achievement. Paper presented at the
annual meeting of the American Educational Research Association, San Francisco.
Pintrich, P. R. & DeGroot, E. V. (1990). Motivational and Self-Regulated Learning Components of Classroom
Academic Performance. Journal of Educational Psychology, 82, 33-40.
Schank, R. (1992). Goal-Based Scenarios. Technical report. Evanston, IL: The Institute for the Learning Sciences,
Northwestern University.
Schank, R. (1999). Dynamic Memory Revisited. Cambridge University Press.
Schank, R. (2002). Designing World-Class e-Learning. New York: McGraw-Hill.
Schank, R., Berman, T., Macpherson, K. (1999). Learning by Doing, Ch. 8, in Instructional Design Theories and
Models: A New Paradigm of Instructional Theory, vol. II. C.M. Reigeluth (ed.) Mahwah, NJ: Lawrence
Erlbaum Associates
Schunk, D. H. (2000). Learning Theories; an Educational Perspective. NJ: Prentice-Hall.
Fostering Communities of Practice –
A Case Study of Heads of IT Departments
Wei-Ying Lim
John G Hedberg
Jennifer Ai-Choo Yeo
David Hung
Nanyang Technological University, Singapore
Abstract
This paper describes a project which sought to foster communities of practice in Singapore schools both as
a culture and as a professional development strategy. We adapted such a concept to one of the leadership training
modules in NIE using Wenger’s evolving community as a guiding framework. Our findings have been mixed thus
far, and relate to the complexities of fostering CoPs and why participants would seek to continue collaboration in
non-formal settings. At this stage, we have found that two main types of preconditions, personal imperatives and the
nature of tasks, need consideration before enactment. The personal imperatives that individuals bring into the
community determine the density of connectivity in a CoP, which in turn affects the undertaking of complex tasks.
Introduction
Currently, through conversations, interactions and sharing sessions with teachers and school leaders,
knowledge is typically tapped for problem-solving, and through such approaches groups have access to the
accumulation of years of experience. Experience is knowledge that is deeply entwined with the context in which it
occurs, encountered only by the first-person(s) in the situations. Thus, knowledge is very much tacit in nature and it
resides in the individuals who ‘own’ it. Paradoxical as it may seem, schools can look within and across their
organization for solutions, rather than asking external providers to provide such tacit knowledge. Indeed
professional development in the Singapore schools’ context has included a rationalization of the teachers’ workload
such that it can give the experienced teachers “time to coach the younger teachers and help them to absorb the ethos
and values of the profession. That way, overall quality goes up in teaching”(MOE para. 42, 2005).
The need for a situative approach to professional development and knowledge sharing has led teachers,
through “legitimate peripheral participation” (Lave & Wenger, 1991), to participate in and observe the community’s
activities on the periphery, and to appropriate knowledge (particularly tacit knowledge) from the more experienced (or
central) participants. Progressively, teachers move from peripheral participation to central participation where they
change from being passive observers to active contributors in the community. Through this process, teachers
gradually acquire the skills, norms and rules held by the community of practice (Hung, 1999).
Wenger’s tenets, rationale and design plan for fostering the HoD ICT community:
• … – Design plan: monthly face-to-face or online sessions after the formal training at the University.
• Leadership – Rationale: to help the community develop. Design plan: to facilitate the evolvement of leaders as they begin to take ownership and responsibility of core topics or tasks; to allow leaders to take ownership in facilitating discussions.
• Connectivity – Rationale: enable a rich fabric of connectivity among people. Design plan: to connect different batches of HoDs with other groups/communities such as School Principals.
• Membership – Rationale: to foster a sense of belonging. Design plan: to elicit common goals amongst all members.
• Learning Projects – Rationale: to deepen mutual responsibility. Design plan: the Project Plan; to introduce online discussions; to introduce topics for discussion that interest sub-groups.
• Artifacts – Rationale: documents, tools, stories, symbols, etc. that represent the community. Design plan: to encourage the sharing of experiences and adopt success stories and examples of personal experiences of HoDs.
July 2003 cohort:
• … from course lecturers and ETD facilitators in class
• Summarising whole class discussion/group sharing/lesson
• Sharing of food in class
Jan 2004 cohort:
• Presentation of group projects
• Feedback from course lecturers and ETD facilitators
• Summarising whole class discussion/group sharing/lesson

Activities – online
July 2003 cohort:
• Individual online reflection
• Discussion of spillover topics/issues/questions from the face-to-face sessions
• Online discussion forum set up for group discussion, negotiation and reflection on their project
• In all the online discussions there was no facilitation from the course lecturers or the ETD facilitators; one of the forums (professional development) was facilitated by one of the participants
Jan 2004 cohort:
• Individual online reflection
• 3 discussion forums for each project group
• 3 forums for discussion of issues surfaced in face-to-face sessions
• No facilitation from NIE course lecturers
• Facilitation from ETD facilitators in some of the online discussions, mostly to direct participants’ answers to some point

Activities – special events
July 2003 cohort: Dialogue with the Director, Educational Technology Division
Jan 2004 cohort: Attachment to various IT commercial companies

Supports
July 2003 cohort:
• Classroom: 1. Feedback on the online discussion; 2. External speaker invited to speak on CSCILE
• Online: 1. Facilitation by an appointed participant for one discussion thread
• Project: 1. Advice and feedback on groups’ initial project ideas by course lecturers; 2. Online discussion facilities provided for the groups’ discussion and negotiation
Jan 2004 cohort:
• Classroom: 1. Group discussion facilitated by ETD facilitators; 2. Feedback on the online discussion
• Online: 1. Facilitation by ETD facilitators
• Project: 1. Advice and feedback on groups’ initial project ideas by course lecturers and Education Ministry facilitators; 2. Online discussion facilities provided for the groups’ discussion and negotiation

Tools
July 2003 cohort:
• In class: PC and projector for presentation; reading materials; laptops
• Online: Blackboard
Jan 2004 cohort:
• In class: PC and projector for presentation; reading materials; laptops
• Online: Blackboard
For each cohort, ethnographic notes were made whenever the community met face-to-face, and online
discourse was analyzed contextually based on the events and activities organized or evolving at that particular
instance. A historical time frame or developmental approach was adopted which traced each community from its
beginnings to its maturity stages. In essence, the transformations and processes undertaken by the evolving
community were documented and analyzed, and the processes relevant to the transformation abstracted.
Findings
The following findings are presented according to teacher cohorts.
Participant A:
After the first session... , I felt relief. I think I will be able to pick up a lot from my fellow colleagues.
They are all so good and most important of all is they are willing to share. I feel at ease with the lecturers
and tutors too. The job that they are doing is similar to what the IT HOD is doing in school. Hence they
will be able to associate with the kind of pain that we actually are going through when trying to implement
new initiatives or programs to the staff of the school. And be able to share with us their experiences.
Participant B:
I look forward to our learning together and collectively we can expound, explore and evolve good ideas
that will optimize the use of current and limited resources for the “engaged learning” movement... The
intent to extend the spirit of the course and the camaraderie developed beyond the duration of the DDM is
something that will bind the HoDs together and instill pride for playing a special role to lead the school to
sustain this educational reform.
Participant C:
… There is much to learn about fostering learning communities in schools, making authentic assessment
work, working with students and teachers as knowledge producers and making project work to work in
schools. I am looking forward to crafting out practical project proposals that will be implemented in our
schools next year.
Upon analyzing the participation in the online and face-to-face discussions, the results revealed some
interesting and somewhat conflicting insights. There were a total of five online discussion forums and another three
for project group discussions. The level of participation in each forum was measured by the number of
postings it received. The count included postings by the participants as well as by the Ministry facilitators
and University lecturers.
[Figure: number of postings in each of the online discussion and project group forums; vertical axis 0 to 30]
A group of six participants continued to discuss one another’s implementation problems beyond the official course duration. In fact, three of the six worked closely
to conduct a sharing session for the July 2004 cohort of participants as well as a presentation at a local education
conference. According to one participant, these events brought them closer to each other.
During a post-implementation interview, the respondents revealed that the project was being sustained in
their schools with more teachers and departments on board. When queried on why this was the case, all mentioned
that it was their desire to see the project through and their strong belief in what they were doing that saw them
through the difficulties. They also mentioned that the “gelling” among them assisted in their collaboration a number
of times, both during face-to-face and the online discussions.
[Figure: number of online postings made by each participant; vertical axis 0 to 16]
Leadership
True to Wenger’s tenets, another factor in the success of the group was the presence of a natural
leader, JL, who not only had the highest level of participation online, but was also observed to be actively
participating during the face-to-face sessions, offering to share resources acquired from conferences and taking the
initiative to compile summaries of the class. JL displayed leadership qualities, a keenness for the appropriate use of
technologies, and a willingness to share experiences with others.
Secondary Group, Four Friends Group and Primary Group

Nature of task: … for the students; supporting the teacher facilitator; training the students for the collaborative discussion; supporting the teacher facilitators

Complexity of task
• Secondary Group: Low. Technical aspects may be complex but can be outsourced to external vendors.
• Four Friends Group: High. Technical aspects may be complex but within the capability of the HOD (IT). Required the collaboration of all the members as the online discussion involved students’ participation from different schools.
• Primary Group: High. Both technically and pedagogically challenging; high degree of uncertainty in terms of processes and outcomes, so working with others provides the support needed.

Participation structures
• Secondary Group: Individual
• Four Friends Group: Group / division of labor
• Primary Group: Individual / sharing of experience

Support and tools
• Secondary Group: External vendors
• Four Friends Group: Division of labor
• Primary Group: External vendors; experience of team members
Participant A:
In the spirit of Learning Organisation, I believe in sharing our ideas & resources. To start the ball rolling, I
will like to share with the 011h group of participants my school's IT Dept Workplan which my team of IT
savvy teachers & I are currently working on. Please click on at
http://www.xinminss.moe.edu.sg/chiakh/default.aspx. Will really appreciate if you could give comments on
how it could be improved further…
Participant B:
… Thanks for your advice. In fact, one of my strategies in my IT action plan this year is exactly what you
have described. I'm glad to hear from you that it worked in your school. That gave me more confidence in
implementing it.
[Figure 3: number of postings in each online forum for the Jan 2004 cohort; vertical axis 0 to 40]
From Figure 3, the level of participation in each forum was very low and it decreased as the course
progressed. Like cohort 1, the participants did not really see the need for online discussion as they saw each other
every day during the course. This phenomenon persisted despite the fact that cohort 2 had facilitators from the
Education Ministry, compared with none for the first cohort. The type of facilitation that took place online was
mainly a question-and-answer style of interaction, with the facilitators raising questions for the participants to answer.
As a result, the online forums served as an information exchange rather than as a discussion platform.
Unlike cohort 1, the top five most active participants in cohort 2 were spread across different groups and none of the
groups seemed to be particularly coherent (Figure 4). CKH performed a similar role to JL in cohort 1 in that he was an
active participant both in the face-to-face discussions and the online group discussions.
[Figure 4: number of postings by individual participants in cohort 2]
Comparison of the e-Learning Resource Portal Group, PDA Gadgeteers Group, and Sungei Buloh Group (table continued from the previous page)

Support and tools
• e-Learning Resource Portal Group: External vendors; experience and working models of other schools
• PDA Gadgeteers Group: Own expertise
• Sungei Buloh Group: Technical assistance from experts (TSC); expertise of different members in the team; division of labor
Discussion
This paper describes two attempts to foster communities of practice (CoP) among Heads of IT Departments in Singapore schools, employing Wenger’s tenets for evolving communities as a general design framework. Through the enactment of practice, by means of events and activities designed by the researchers, and by having the participants work on authentic tasks situated in their schools’ contexts, this study expected the participants to progress from peripheral participation to central participation, changing from passive observers to active contributors. The literature is replete with studies and proposals such as: the characteristics of communities of practice (Barab & Duffy, 2000); principles for the design of effective learning communities (Bielaczyc and Collins, 1999); and tenets of communities of practice (Lave & Wenger, 1991). However, appropriating these ideas did not lead to communities that were sustainable much past the end of the course.
The outcomes suggest to us that, beyond the initial enthusiasm and willingness of the participants to learn, solve problems, and construct knowledge collectively, there may be some preconditions that need consideration before designing interventions such as activities, support, and scaffolding for communities to prosper. Two main types of preconditions were observed in this study: personal imperatives and the nature of tasks.
First, personal imperatives such as a common belief, zeal, and passion were observed to be important considerations that connect and “gel” participants (in the case of the Primary Group) together. Coupled with the presence of strong leadership (e.g., JL), membership in a CoP can grow to be closely knit, reinforcing the beliefs and interactions inherent in the community. Such fortification eventually leads to identity change, where individual members spread and evangelize their beliefs and practice to non-members, thereby growing the community.
The nature of the tasks was observed to be the other type of precondition that determines collaboration. Tasks that are not too specific, allowing participants to contextualize them within their schools’ culture, are those that afford a greater degree of success. Notably, these tasks also encompass a pedagogical shift, one that is constructivist in orientation. Hence, we conclude that tasks that carry higher risks, in terms of uncertainty and complexity in processes and outcomes, propel a greater need for collaboration and sharing among the participants, leading them to rely on one another for support and division of labor.
In conclusion, it is proposed that efforts toward the fostering of communities consider two main types of preconditions, personal imperatives and the nature of tasks, before enactment. The personal imperatives that individuals bring into the community determine the density of connectivity in a CoP, which in turn affects the undertaking of complex tasks.
References
Barab, S., & Duffy, T. (2000). From practice fields to communities of practice. In D. Jonassen & S. Land (Eds.),
Theoretical foundations of learning environments. Mahwah, NJ: Lawrence Erlbaum Associates.
Bielaczyc, K., & Collins, A. (1999). Learning communities in classrooms: A reconceptualization of educational
practice. In Reigeluth, C. M. (Ed.), Instructional-design theories and models. A new paradigm of
instructional theory (Vol. II). Mahwah, NJ: Lawrence Erlbaum Associates.
Hung, D. (1999). Activity, Apprenticeship, and Epistemological Appropriation: Implications from the writings of
Michael Polanyi. Educational Psychologist, 34(4), 193-205.
Lave, J. & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge
University Press.
Singapore’s Ministry of Education (2005). Minister speech at the MOE Work Plan Seminar 2005. Self-published.
http://www.moe.gov.sg/speeches/2005/sp20050922.htm
Wenger, E., McDermott, R., & Snyder, W. (2002). Cultivating communities of practice. Boston, MA: Harvard
Business Press.
Learning “Pragmatics” On-line Through Partnership: A Cross-Cultural
Study between Taiwanese College Students and Their Texan Tutors
Chia-Ning Jenny Liu
Texas A&M University
Zohreh Eslami-Rasekh
Texas A&M University
Abstract
This study investigated the effectiveness of the use of Computer-Mediated Communication (CMC) in learning pragmatics. The impact of teaching pragmatics through e-mail and WebCT discussion on Taiwanese EFL learners’ pragmatic competence was explored. The relative effectiveness of learning pragmatics through in-class activities and through telecommunication was also compared. Data collected in school settings were analyzed quantitatively and qualitatively. The results showed that the pragmatic instruction enhanced EFL learners’ pragmatic competence.
planned classroom activities (Bouton, 1994; Eslami-Rasekh, 2005; Rose, 1999; Takahashi, 2001; Tateyama, 2001).
Studies conducted by Eslami-Rasekh (2004), Kasper (1997), Rose (1999), Takahashi (2001), and Tateyama (2001)
also suggest that pragmatic features can be effectively acquired through explicit instruction on pragmatics.
Over the past two decades, computers have become common instructional tools in ESL/EFL classrooms. Currently, collaborative e-mail exchanges are one of the instructional tools used in classrooms. Studies have shown that computer-mediated communication (CMC) has many merits in classroom settings. Computer-mediated communication refers to interaction via telecommunications. Electronic communication has been found to have a number of beneficial features that make it a good tool for language learning. Research has indicated that electronic communication can enhance students’ motivation (Warschauer, 1996) and improve writing skills (Cononelos & Oliva, 1993). Cifuentes and Shih (2001) further stressed that CMC provided an authentic context for learning functional abilities by having EFL learners interact with English-as-a-first-language speakers. With explicit instruction in how to communicate in the virtual environment, CMC may benefit intercultural teaching and learning (Shih and Cifuentes, 2003).
Objectives
This study investigated the impact of pragmatic instruction on Taiwanese EFL learners’ development of pragmatic competence. The relative effectiveness of learning pragmatics through in-class activities and through telecommunication was also compared. The present study attempted to answer the following research questions: (a) Did students who received the in-class explicit pragmatic instruction improve their pragmatic competence more than those who did not? (b) Did students who received the explicit pragmatic instruction through a telecommunication connection with Texan tutors improve their pragmatic competence more than those who did not? (c) What was the relative effectiveness of learning pragmatics through CMC as compared to in-class pragmatic instruction? (d) What were students’ perceptions of learning pragmatics?
Methodology
This study applied a pretest-posttest control group experimental design and a combination of quantitative and qualitative data collection and analyses. The independent variable was the treatment, with three levels: (1) the control group, which received no explicit pragmatics instruction; (2) the experimental in-classroom group, which received explicit pragmatic instruction face to face from their classroom instructor; and (3) the experimental CMC group, which received explicit pragmatics instruction from their Texan tutors through CMC (e-mail and WebCT discussion). The dependent variable was students’ pragmatic competence.
Participants
Participants were 82 undergraduate students majoring in applied foreign languages at a university of technology in northern Taiwan and 13 graduate students majoring in teaching English as a second language at a university in southern Texas.
In Taiwan, the 82 students belonged to three intact classes and were enrolled in the course “English for Tourism.” Because of institutional constraints, it was not possible to assign students randomly to the different groups, making it necessary to work with three intact groups. In an effort to determine the equivalence of the three groups in terms of their English language proficiency, the General English Comprehension Test was given to the participants.
The statistical results showed that the control group produced a higher mean score on the reading comprehension pretest (M = 31.067, SD = 8.183) than the experimental in-classroom group (M = 28.957, SD = 7.258) and the experimental CMC group (M = 28.966, SD = 8.011). Nevertheless, the three groups did not differ significantly from each other in their performance on the reading comprehension pretest (F = 0.68, df = 2, p = 0.51).
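For readers who want to reproduce this kind of group-equivalence check, the following is a hedged sketch of a one-way ANOVA in Python with SciPy; the score lists are invented placeholders, not the study's data.

# Illustrative sketch of the one-way ANOVA reported for the reading
# comprehension pretest (F = 0.68, df = 2, p = 0.51). Scores are invented.
from scipy import stats

control = [31, 28, 35, 30, 33]        # hypothetical pretest scores
in_classroom = [29, 27, 31, 28, 30]
cmc = [28, 30, 29, 27, 31]

f_stat, p_value = stats.f_oneway(control, in_classroom, cmc)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")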
There were 30 students in the control group, 23 in the experimental in-classroom group, and 29 in the experimental CMC group. In Texas, each of the 13 graduate students was randomly assigned to be the tutor for two or three Taiwanese experimental group participants. These students interacted with their Taiwanese learners through e-mail correspondence and WebCT discussion. All Texan participants received the instruction as part of their curricular activities in the class.
Procedure
During the ten weeks of this study, all eighty-two Taiwanese participants met once a week for one hundred minutes each time. At the beginning of each class, the professor in Taiwan spent fifteen minutes dealing with class management and student affairs issues. Since the eighty-two Taiwanese participants were enrolled in “English for Tourism”, participants in all three groups were engaged in the following warm-up tasks: watching a short film about tourism in English for about fifteen minutes, followed by the instructor’s explanation of the film for about twenty minutes in each meeting. The instructor used the textbook entitled “At your service: English for the travel and tourist industry”. Each week, the instructor taught one unit of the textbook.
During the remaining fifty minutes of the class, participants in the control group did not engage in any explicit pragmatics activities. Instead, the instructor spent about thirty minutes lecturing on tourism English using the teacher’s manual as a guide, followed by twenty minutes of summary and discussion in each meeting, for a total of fifty minutes. During the thirty minutes of lecture, students had the opportunity to interact with the instructor through questions and answers, and they also had small group discussions with their peers during the twenty minutes of summary and discussion. Participants practiced their English writing, listening, reading, and speaking during class.
In contrast to the control group, during the remaining fifty minutes of the class, ten weekly lesson plans were delivered to the participants in the experimental groups (in-classroom and CMC). Each lesson plan consisted of one activity, and each activity was designed to raise students’ pragmatic awareness. The content for both groups was identical and was based on the ten weekly lesson plans developed by the researcher. The components of the lesson plans aimed to raise students’ pragmatic awareness and offer learners the opportunity for communicative practice. For the experimental in-classroom group, the lesson plans were delivered by the instructor face to face; for the experimental CMC group, the lesson plans were delivered through e-mail correspondence and WebCT discussion between the Taiwanese students and their tutors.
At the beginning of the study, all students were asked to complete the Discourse Completion Task (DCT) pretest. Students in the control group received the regular classroom instruction, which did not explicitly address pragmatics in the teaching content. The experimental in-classroom group received explicit pragmatic instruction face to face from their classroom instructor, and the experimental CMC group received explicit pragmatics instruction from their Texan tutors through telecommunication (e-mail and WebCT discussion).
Following ten weeks of treatment, all students were asked to take the Discourse Completion Task (DCT) posttest. At the end of the study, the experimental CMC group completed a survey of students’ perceptions of learning pragmatics to explicate their attitudes toward learning pragmatics, attitudes toward using e-mail and WebCT in learning, and perceptions of learning from Texan tutors.
The Discourse Completion Task (DCT) included twelve situations with a special focus on the speech act function of request. These situations were designed to probe how participants respond in different situations in terms of social status, power, and imposition. The social contexts specified in the DCT contain relationships between a professor and a student, a boss and an employee, and among friends. The purpose of this design was to see how participants interacted or responded to certain situations from different points of view. Two native English speakers rated the participants’ Discourse Completion Task (DCT) pretest and posttest productions. The rating system used in this study was adapted from the rating system proposed by Hudson, Detmer, and Brown (1995), containing the following components: (1) the ability to use correct speech acts, (2) expressions, (3) the amount of information, (4) level of formality, (5) level of directness, and (6) level of politeness. In this case, the last three components were combined as one (level of politeness) due to the overlapping elements of speech among these three components. The raters rated participants’ performance on a 5-point rating scale ranging from 1 to 5. Interrater reliability reached an acceptable level of agreement (r = .90).
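As an illustration only, interrater agreement on such 1-5 ratings can be estimated with a Pearson correlation between the two raters' scores, which is consistent with the reported r = .90; the exact reliability procedure used in the study is not specified, and the ratings below are invented.

# Hedged sketch: Pearson correlation between two raters' 1-5 DCT ratings
# as a simple interrater reliability estimate. Ratings are invented.
from scipy import stats

rater1 = [4, 5, 3, 4, 2, 5, 3, 4]
rater2 = [4, 5, 3, 5, 2, 4, 3, 4]

r, p = stats.pearsonr(rater1, rater2)
print(f"Interrater correlation r = {r:.2f} (p = {p:.3f})")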
Results
The descriptive statistics for the DCT pretest scores by group are shown in Table 1. There were four scores: the ability to use correct speech acts, expressions, the amount of information, and level of politeness. The two experimental groups overall yielded higher mean scores than the control group on three rating components (expressions, information, and politeness). However, the experimental in-classroom group scored slightly lower on the speech act rating component than the control group and the experimental CMC group. There was no significant group effect for the DCT pretest; that is, the three groups did not differ in their pragmatic abilities in the speech act function of request prior to the treatment (F = 2.131, df = 2, p = 0.126).
Table 1: Descriptive Statistics Results of the DCT Pretest Scores by Group
After the treatments, a group comparison of the DCT posttest scores was conducted. The descriptive statistics indicated greater discrepancies among the group means. The performances of the experimental in-classroom group and the experimental CMC group were better (by 2.7 to 7.4 points) than those of the control group on each of the four rating elements (see Table 2).
The repeated measures MANOVA results further showed a significant difference among the three groups on the four means of the DCT posttest (F = 16.35, df = 2, p < .05). The results of the Tukey Honestly Significant Difference (HSD) post hoc test indicated that the experimental in-classroom group and the experimental CMC group both scored significantly higher than the control group on the DCT posttest, whereas the students in the experimental CMC group performed as well as those in the experimental in-classroom group. Moreover, an interaction effect of group by the four rating elements of the DCT posttest was found to be significant (F = 2.93, df = 6, p = 0.009).
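The following is a hedged sketch of the kind of Tukey HSD post hoc comparison reported above, using statsmodels; the group sizes match the paper (30, 23, 29), but the scores are randomly generated placeholders rather than the study's data.

# Sketch of a Tukey HSD post hoc comparison across the three groups.
# Scores are simulated; only the group sizes follow the paper.
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

np.random.seed(0)
scores = np.concatenate([
    np.random.normal(44, 5, 30),   # control (hypothetical)
    np.random.normal(48, 5, 23),   # in-classroom (hypothetical)
    np.random.normal(49, 5, 29),   # CMC (hypothetical)
])
groups = ["control"] * 30 + ["in-classroom"] * 23 + ["CMC"] * 29

print(pairwise_tukeyhsd(endog=scores, groups=groups, alpha=0.05))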
Figure 1 shows that the scores on expressions yielded by the three groups were significantly lower than the scores on the other components (speech act, information, and politeness). The students in the experimental CMC group scored higher on the four rating components than the experimental in-classroom group, though the difference was not significant. The control group produced significantly lower scores on all four rating components when compared to the two experimental groups. The experimental CMC group was found to have the highest mean score on the speech act rating component, revealing that their ability to use correct speech acts was stronger than the other elements necessary to make appropriate requests.
[Figure: estimated marginal means on the DCT posttest for speech act, expressions, information, and politeness, by group (control, in-classroom, CMC)]
Figure 1. The interaction effect of the group by the four rating elements of the DCT posttest
After ten weeks of conventional pragmatic instruction, the experimental in-classroom group showed significant improvement in their DCT productions. Overall, the experimental in-classroom group generated significantly higher scores on the DCT posttest than on the DCT pretest (F = 11.156, df = 3, p < .05); the means for each rating component on the DCT posttest showed an apparent increase, ranging from 1.37 to 2.62 points (see Table 3).
Table 3: Descriptive Statistics Results of DCT Pretest and Posttest Scores for the Experimental In-Classroom Group
Rating component    Test       Mean    SD
Speech act          pretest    45.70   5.65
Speech act          posttest   48.32   4.83
Expressions         pretest    44.09   4.70
Expressions         posttest   45.82   4.61
Information         pretest    46.45   4.72
Information         posttest   47.82   5.01
Politeness          pretest    47.09   4.65
Politeness          posttest   48.73   5.49
Furthermore, as Figure 2 shows, the scores on expressions yielded by the experimental in-classroom group remained the lowest on both the DCT pretest and posttest. The experimental in-classroom group was found to have the highest mean score on the politeness rating component; that is, participants tended to show diverse levels of politeness while making requests. Meanwhile, the mean score on the speech act rating component displayed the greatest improvement from the DCT pretest to the DCT posttest (pretest M = 45.70, SD = 5.65; posttest M = 48.32, SD = 4.83). Beyond that, the mean scores on expressions, the amount of information, and level of politeness also improved fairly after the treatment.
[Figure: estimated marginal means on the DCT pretest and posttest for speech act, expressions, information, and politeness (experimental in-classroom group)]
Figure 2. The DCT pretest and posttest scores by the four rating elements for the experimental in-classroom group
On the other hand, with ten weeks of telecommunication connection to the Texan tutors, the experimental CMC group learned pragmatics through e-mail and WebCT discussion. The repeated measures MANOVA results showed a significant improvement in DCT productions for the experimental CMC group. Overall, the participants in the experimental CMC group generated significantly higher scores on the DCT posttest than on the DCT pretest (F = 47.897, df = 3, p < .05); the means for each rating component on the DCT posttest increased by 1.87 to 4 points.
Compared with their performance on the DCT pretest, the experimental CMC group produced significantly higher scores (p < .05) on the DCT posttest on all four rating components. The mean score on speech act displayed the greatest improvement from the DCT pretest to the DCT posttest (pretest M = 48.45, SD = 4.63; posttest M = 52.45, SD = 4.45) (see Figure 3). Beyond that, the mean scores for the amount of information, level of politeness, and expressions also improved fairly after the treatment.
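The paper's within-group comparisons used repeated measures MANOVA; as a simplified, hedged illustration of a single pretest-posttest contrast on one rating component, a paired t-test can be sketched as follows (all scores invented, not the study's data).

# Simplified sketch: paired comparison of one component's pretest and
# posttest scores. This approximates, but is not, the MANOVA reported.
from scipy import stats

speech_act_pre = [48, 50, 47, 49, 46, 51, 48, 50]    # invented
speech_act_post = [52, 53, 51, 54, 50, 55, 52, 53]   # invented

t_stat, p_value = stats.ttest_rel(speech_act_post, speech_act_pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")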
[Figure: estimated marginal means on the DCT pretest and posttest for speech act, expressions, information, and politeness (experimental CMC group)]
Figure 3. The DCT pretest and posttest scores by the four rating elements for the experimental CMC group
Students’ Perceptions of Learning Pragmatics On-line
Students from the experimental CMC group mostly expressed that the explicit pragmatic instruction indeed
helped them gain more knowledge in English pragmatics. Compared to the content presented in the conventional
English reading or writing textbooks, the content of this pragmatic instruction was more practical and useful for
their daily communication. For example, one student stated that he did not know the exact meaning of “You rock” until his Texan tutor explained it to him. “You rock” in the United States means “You are so cool”, and this student was pleased to learn more everyday idiomatic expressions used in certain situations from his tutor. Another student mentioned that he became strongly aware of the differences between Chinese and English pragmatics through communicating with his Texan tutor, and he regarded this learning experience as a valuable one because he hardly ever had the opportunity to interact with foreigners.
Another student shared that the pragmatic instruction was beneficial to her because it helped her make appropriate requests in the airport while traveling abroad. This student looked forward to learning more in-depth content at a higher level of difficulty because she thought the learning of pragmatics was quite useful and important.
After the ten-week treatment, students in the experimental CMC group indicated that they enjoyed learning pragmatics when examples were presented first; then, once they completely understood the content, more examples could be used to strengthen the concepts.
Nevertheless, more than half of the students in the experimental CMC group responded that some English
words and phrases used by their Texan tutors were not readily understandable; they had difficulty figuring out the
meanings of certain messages. Several students in the experimental CMC group expected their Texan tutors to be
more patient and affectionate.
Students also reflected that the content of pragmatic instruction could be more situational, more animated
with graphs, sounds, or short movies. A majority of students thought that the design of the content in the homework
needed to be improved. The classroom instructor or Texan tutors should avoid posting similar and ambiguous
questions every week, which bored the Taiwanese participants.
One student pointed out that he felt frustrated and exhausted when asked to independently write and type a short formal essay each week, which felt more like taking a formal English composition class. It was hoped that the content could start at an easier level and progress to more difficult levels.
Even so, most participants in the experimental CMC group were aware of the importance of pragmatics,
and they realized that English was not as difficult as they thought before. They felt this learning experience was
challenging, but interesting. It helped them gain more knowledge regarding Western people’s thinking patterns and
writing styles. In addition, they also felt more comfortable when they used English to perform requests in contexts.
Educational Significance
From this study, we found that the pedagogical intervention had a positive impact on Taiwanese EFL learners’ development of pragmatic competence. With appropriate classroom management and Internet-connected computers, the students in the experimental in-classroom group and the experimental CMC group had the opportunity to engage themselves in the process of learning pragmatics.
Additionally, computers functioned as “cognitive tools” for the experimental CMC group students to
reflect, refine, and assess their structural knowledge. The findings of this study urge educators to integrate technology to help Taiwanese EFL learners build expertise in using the English language appropriately, so that they can develop the ability to comprehend and generate productive communicative acts.
It was apparent that the Taiwanese EFL learners did not naturally think and write in English. Accordingly, the Taiwanese students required more time to process English textual information and to respond in English. If they were given more time on tasks, they might feel less concerned and threatened, and become more responsible for their own learning.
Taiwanese EFL learners indeed need additional activities that can broaden their knowledge of pragmatics and provide a wider variety of models and opportunities to supplement classroom learning. We conclude that more complementary activities for EFL learners should be included in classroom settings, so that they can be given the opportunity to gain pragmatic knowledge. When pragmatics is explicitly taught to second language learners, they can acquire the essential skills faster (Bouton, 1994; Eslami-Rasekh, 2004).
References
Bialystok, E. (1993). Symbolic representation and attentional control in pragmatic competence. In G. Kasper & S.
Blum-Kulka (Eds.), Interlanguage Pragmatics (pp. 43-57). Oxford: Oxford University Press.
Bouton, L. F. (1994). Conversational implicature in the second language: Learned slowly when not deliberately
taught. Journal of Pragmatics, 22, 157-67.
Cifuentes, L., & Shih, Y. D. (2001) Teaching and learning online: A collaboration between U.S. and Taiwanese
students. Journal of Research on Computing in Education, 33(4), 456-474.
Cononelos, T., & Oliva, M. (1993). Using computer networks to enhance foreign language/ culture education.
Foreign Language Annuals, 26(4), 527-534.
Ellis, R. (1992). Learning to communicate: A study of two language learners’ requests. Studies in second language
acquisition, 14, 1-23.
Eslami-Rasekh, Z., Eslami-Rasekh, A., & Fatahi, A. (2004). Using metapragmatic instruction to improve advanced
EFL learners' pragmatic awareness. TESL- EJ, 8(2).
Eslami-Rasekh, Z. (2005). Raising the pragmatic awareness of language learners. ELT Journal, 59(2), 199-208.
Hudson, T., Detmer, E., & Brown, J. D. (1995). Developing prototypic measures of cross-cultural pragmatics.
Honolulu, HI: Second Language Teaching & Curriculum Center, University of Hawaii.
Kasper, G. (1997). Can pragmatic competence be taught? (Net Work#6) (HTML Document) Honolulu: University
of Hawaii, Second Language Teaching &Curriculum Center.
http://www.111.hawaii.edu/nf1rc/NetWorks/NW6/
Kasper, G. (2000).Four perspectives on L2 pragmatic development. Plenary address, Annual Meeting of the
American Association of Applied Linguistics, Vancouver, British Columbia, March.
Rintell, E., & Mitchell, C. J. (1989). Studying requests and apologies: An inquiry into method. In S. Blum-Kulka, J.
House, & G. Kasper (Eds.), Cross-cultural pragmatics (pp. 248-272). Norwood, N.J.: Ablex.
Rose, K.R. (1999). Teacher and students learning about request in Hong Kong. In E. Hinkel (Eds.), Culture in
Second Language Teaching and Learning. Cambridge: Cambridge University Press
Shih, Y. C., & Cifuentes, L. (2003). Taiwanese intercultural phenomena and issues in a United States-Taiwan
telecommunications partnership. Educational Technology Research and Development. 51(3), 82-90.
Takahashi, S. (2001). Role of input enhancement in developing pragmatic competence. In K. Rose & G. Kasper
(Eds.), Pragmatics in Language Teaching (pp. 171-199). Cambridge: Cambridge University Press.
Tateyama, Y. (2001). Explicit and implicit teaching of pragmatic routines. In K. Rose & G. Kasper (Eds.),
Pragmatics in Language Teaching (pp. 200-222). Cambridge: Cambridge University Press.
Warschauer, M. (1996). Comparing face-to-face and electronic discussion in the second language classroom.
CALICO Journal, 13 (2&3), 7-26.
Different Does Not Mean Wrong:
Using Video To Help Pre-service Teachers Understand Diverse Families
Melissa Mohammed
Eleanor Ennis
Salisbury University
Background
One of the most cited predictions about public schools is that the diversity of K-12 student populations will greatly increase (Sadker and Sadker, 2004). The school system most closely associated with Salisbury University is the Wicomico County Public School System, and according to the community’s daily newspaper, the Latino population in Wicomico County increased 200 percent in the decade ending in 2000 (Gates, 2005; Carmean, 2003). A related prediction is that the teachers of these students will continue to be much less diverse than their charges. If these predictions come true, the possibility of miscommunication between student families and teachers is higher.
To avoid miscommunication, positive communication skills should be taught and practiced at the pre-service level. These communication skills are needed in the local region since our pre-service teachers tend not to come from the constituent base of multicultural education, but from more privileged groups (Boyle-Baise, 2002). Even when the pre-service teachers are people of color, they may have “limited direct experience with groups other than their own, or perceive poverty from afar” (Boyle-Baise, 2002, p. 16). As educators at a regional university that graduates a large number of teachers, we wanted to address this area of need. We felt that the message that “different does not mean wrong” could become a mantra for our students as they entered the workforce. We also hoped this saying would emerge in practice as our more homogeneous students went out to serve more diverse communities.
One way to promote better assistance by homogeneous professionals to diverse constituencies is through a
community-building project such as service learning. The case for service learning is a strong one, because it
combines several factors that can lead to student success. First, students read and make preparations to do work in a
non-university setting; then, they test out these ideas by actually meeting with and performing a service for persons
in the non-university setting. Finally, students reflect upon various aspects of what they did and learned. One
definition delineates the service learning process as “a credit-bearing educational experience in which students
participate in an organized activity that meets identified community needs and reflects on the service activity in such
a way as to gain further understanding of the course content, a broader appreciation of the discipline, and an
enhanced sense of civic responsibility” (Cameron, Forsyth, Green, Lu, McGirr, Owens, and Stoltz, 2001).
These are appreciable goals given the need for the involvement of pre-service teachers in the various ways
that fulfill the newer professional development schools model. The professional development school model requires
much more collaboration between the university faculty and the school faculty than the older field service model
did. As a result, the relationship between the two groups is “more complex and intertwined” than before, with the
resulting culture “transform[ing] both institutions and the personnel within each” (Book, 1996). Key within this new
culture is a resulting increase in student learning. The scope of this project was limited to undergraduate students.
Methods
This faculty development project was developed specifically for teachers of undergraduate pre-service
teachers. It began during a Summer Institute for Preparing Tomorrow’s Teachers to Use Technology (PT3). Institute
participants consisted of university faculty members. During the week-long institute, the two authors and an arts
and sciences faculty member created a project called, “Diversity in technology: Different does not mean wrong”. A
group consisting of a Computers in Education instructor, an elementary mathematics teacher and Math Methods
instructor, and an assistant professor of English, piloted this activity in the following fall.
Initially, each course instructor developed at least one activity for the project. Students in the English class generated the literature excerpts that were used for analyses in the Computers in Education and in the Math Methods
classes. The Computers in Education students interviewed parents and prepared videos of the interviews that were
shared with the Math Methods class. The Math Methods class engaged in role-playing through interviews and wrote
reflections of their interviews. Computers in Education and Math Methods classes also took a survey to measure
their attitudes about families from diverse backgrounds before and after the intervention. The goals of each course
instructor and more specific information about their activities follow.
The goal for English class was to have student groups research and collect literary excerpts from children’s
books and young adult novels written by women writers of color. The excerpts had to depict some aspect of a
teacher/multicultural student relationship. Each group had to find a novel suitable for middle-school students and/or
young adults that had been written by an ethnic writer who was not on the class syllabus. The groups were required
to find biographical information on the writer, provide a list of primary works written by the writer and secondary
works written about the writer, and distribute relevant excerpts from the novels that demonstrated some aspect of the
teacher /ethnic student interactions. For each ethnic group studied, two different groups presented their findings in
class and were responsible for sending the findings through e-mail to the Computers in Education instructor.
The goal for the Computers in Education class was to have pre-service teachers examine how to use
technology to improve the impact of teacher conferences for multicultural students. Moreover, the instructor wanted
students to learn how to use a technology designed to increase empathetic capacity. Students worked with a partner
to interview and videotape -- if permission was granted -- a family that represents a facet of the diverse community
in which the university student would be teaching. The elementary school students of the families chosen were in a
program for limited English proficiency (LEP). Finally, the college students wrote a reflection about the process
and the end project they created. Digital video still pictures from one project are included. In this sample the
students’ primary footage was damaged but they recovered very well by summarizing what they had learned in a
one-minute digital video.
These videos were shown to the Math Methods students. The goal for the Math Methods class was to help
pre-service teachers understand the historical perspective of diversity in mathematics class and the classroom in
general while increasing the students’ abilities to use online telecommunication tools. For example, students found
the PT3 section on the online course site (WebCT) and read at least three of the literature selections about
student/teacher interactions. The pre-service teachers then chose one of the selections and wrote a reflection about
the reading that considered the following perspectives: the student’s problem or situation and how the teacher
reacted to the child’s problem. These reflections were posted on WebCT under the appropriate selection so other
pre-service teachers could read and respond to their thoughts. As an alternative final, Math Methods students
conducted mock interviews; one pre-service teacher acted the role of the teacher, and another played the role of the
multicultural parent in a teacher/parent conference.
With pre-service teachers present in each of these classes, the instructors hoped the university students would learn from the opportunity to communicate with and learn about the needs of multicultural students and their
parents. The instructors also had the chance to use the multiple technologies of electronic mail, online course
management, and digital video for teaching about the diversity of families.
Results
The DDNMW project included a pre- and post-survey to measure students’ perceptions about LEP students in their classrooms; these data were collected through WebCT.
On both the pre- and post-survey, 100% of the students expected to have LEP students in their classrooms. In the beginning, only 65% felt prepared to teach LEP students, while the other 35% felt that if they cared enough about their students they would be able to reach them academically.
Improvement in recognizing strategies that are helpful for LEP students’ academic progress was evident in
the post surveys. All of these scores were improved from the pre surveys. These scores are displayed in the table
below.
The survey also showed that university students are aware that:
• math is not a universal language and LEP students need extra support
• students outside the mainstream culture of white America do not have the same opportunities to use
computers
• it is important for teachers to include contributions from all cultures even if there are no diverse students in
the class
• not all Asian students are naturally good in mathematics.
In addition to pre and post survey information collected from Computers in Education and in the Math
Methods classes through WebCT, evidence of student learning could be seen in comments written in reflections.
Here are some samples:
From Computers in Education
I believe that the students and parents could grow and learn together in a way that I have never experienced.
From Mathematics Methods for Elementary and Middle School Teachers
[What I used to think about parent-teacher conferences is that they were] TO INFORM PARENTS ABOUT
STUDENT PROGRESS AND POSSIBLY DISCIPLINE PROBLEMS, [but now I realize] these conferences can
also be valuable tools in learning more about the students and their families, cultures, and lifestyles…
[What I used to think about parent-teacher conferences is] THE SAME, but now I realize that each conference is
unique and there must be a good deal of preparation by the teacher before each conference in order for the
conference to go well.
[What I used to think about parent-teacher conferences is that they were A REASON TO BE DEFENSIVE], but
now I realize that the teacher and the parents make a great team and should be comfortable working together to
better understand the child’s successes and strengths as well as challenges.
These students also made suggestions for professional development schools with high numbers of students
from families where English is not the first language:
• Have the teacher and student go to the back of room to personalize lesson
• Ask PTA to sponsor an English as a Second Language night, where communication problems are
addressed
Conclusion
The reasons for teaching future teachers more about communication with diverse groups are admirable,
plentiful, and sometimes difficult to explain to the university students. However, linking their work with a useful
purpose helped us meet the goal of increasing this type of communication.
Problems that we encountered are the same as those other collaborative projects have faced: the impracticality of university students in different classes meeting face-to-face required an online solution (Greer and Hamill, 2003). However, the ability to collaborate will enhance pre-service teachers’ skill in meeting the needs of diverse families.
While this study is not generalizable, it does contain many positive starting points for other teacher
educators to consider.
References
Book, C. (1996). Professional development schools. In Sikula, J., Buttery, T.J. and
Guyton, E. ( Eds)., Handbook of research on teacher education, 2nd ed. (pp. 194-210). New York: Simon
and Schuster International.
Boyle-Baise, M. (2002). Multicultural service learning:Educating teachers in diverse communities. New York:
Teachers College Press.
Cameron, M., Forsyth, A., Green, W.A., Lu, H., McGirr, P., Owens, P.E., and Stoltz, R. (2001). Learning
through service. College Teaching, 49(3), 105-114. [Retrieved February 26, 2002, from EBSCOhost database].
Carmean Jr., J.E. (2003, November 20). Our advance. The Daily Times, p.A6.
Gates, D. (2005, May 9). Little Mexico. The Daily Times, pp. A1, A4 .
Greer, C.H. and Hamill, L.B. (2003). Using technology to enhance collaboration between special education and
general education majors. TechTrends, 47, (3), 26-29.
Sadker D.P. and Sadker, M.J. (2003). Teachers, Schools, and Society, 7th ed. New York: The McGraw-Hill
Corporation.
User Support Design to Provide a Chance to Learn
Momoko Nakatani
Masaru Miyamoto
Shunichi Yonemura
Human Interaction Project
NTT Cyber Solutions Laboratories
1-1 Hikarinooka, Yokosuka-shi, Kanagawa,
239-0847, Japan
Introduction
As a result of the increase in the popularity of Internet services, computers have been adopted by many people. Many of these novices, such as stay-at-home moms, often enjoy net-surfing and exchanging e-mail with their friends. However, they tend to use only a few simple functions and draw on other family members to perform more complicated tasks, such as changing settings or troubleshooting. Computer manufacturers and Internet service providers (ISPs) have tried to make detailed manuals suitable for novices, but such manuals are seldom used. This characteristic of novices was pointed out by Nojima in 1992 for a particular piece of network software. He gave an account of why novices failed to actively learn a new function: the presence of experts around them made it unnecessary for them to learn.
The problem with novice users is, however, growing because of their number and the emergence of complicated technical devices. It is extremely difficult to find experts who can fix novices’ troubles on a continual basis. ISP call-centers are forced to deal with hundreds of calls from such novices every day.
For example, when a novice wants to change her e-mail address, or to set up an Internet connection on a new PC, she may not be able to do it without help because the original setting was done by some other person. Also, if the novice is viewing a web page in offline mode and clicks a hyperlink in it, she will not understand the alert, “This page cannot be displayed offline”, and will immediately ask others for help. These troubles occur because novices don’t have any experience of solving problems by themselves. If they did, they would have gradually acquired knowledge about the functions and mechanisms of the computer. For example, if they acquired knowledge such as ‘accessing a hyperlink requires that the browser be in online mode’, they would be able to understand the ‘offline’ alert.
If novices had a better understanding of computers, they would not have to waste time asking others to
solve trivial troubles or bothering the experts. Moreover, they would be able to use more of the advanced functions
provided by modern computers.
Therefore, the purpose of this study is to support novices in troubleshooting and changing settings by
themselves.
Trouble-Based-Learning
Now, let us consider why novices keep relying on others rather than helping themselves. Carroll argued that learning stagnation occurs when the novice prioritizes the immediate task over the acquisition of knowledge (Carroll, 1987). The computer is seen as just a tool to achieve the immediate task, and the novice fails to recognize that additional knowledge would be useful in solving any problems that may occur. Tsunoda, on the other hand, noted that novices’ failure to learn is due not only to the novices themselves, but also to their environment (Tsunoda et al., 1990), which is similar to the assertion of Nojima (1992). He interviewed some word-processor users about common usage patterns and where they learned about the functions. He found that people are not always active learners; the incidence of accidental learning is significant. He suggested that if we provide novices with an environment in which to learn, their passivity might change.
This study builds on prior work and provides novices with an environment in which they can learn from a trouble when they access the call-center of the ISP; we determine whether they change their attitude and adopt a more active learning stance. The users are encouraged to learn from a trouble through introspection and by acquiring deeper knowledge; we call this “trouble-based-learning”. PC study courses are based on a prepared curriculum. Our approach is based on learning in response to troubles, so a curriculum may need to be created for each trouble. Trouble-based-learning is expected to enhance the user’s experiential and introspective learning. Moreover, users may have more motivation to learn after experiencing some trouble compared to the usual curriculum-based PC lessons. In the next section, we will introduce the current call-center approach, and how we can realize trouble-based
learning in a call-center.
[Figure: current call-center support. The operator is the brain; the user is the eye, ear, and hand of the operator.]
To establish trouble-based-learning in the call-center, we extend the call-center support to give the user
additional explanation about the system and the procedure. The additional explanation can be given orally or
through e-mail or fax. We visualize the procedure and the mechanism of the Internet, which we believe will trigger
self-reflection with regard to the trouble.
We conducted two initial experiments as will be introduced in the following section.
Experiment 1
Method
This experiment observed the changes in how novices tackled an Internet connection problem with and without trouble-based-learning. Its purpose was to investigate practical problems in trouble-based learning. We divided 20 women (ages 20-50) into two equal groups: TS (test subjects) and CS (control subjects). The situation was verbally explained to the subjects: “Assume that this is your own house. A family member has altered the web browser settings so that browsing is no longer possible.”
The experiment proceeded as follows:
1. Subjects were told to find and rectify the setting error. An electronic instruction manual was provided for the subject to access for self-troubleshooting. The subject was allowed to call the call-center if needed.
2. The call-center operator solved the problem over the phone. For each CS, the operator simply told them “what to do” to solve the problem. Each TS, on the other hand, was additionally told “how” to tackle similar problems at the end of the call.
Each subject performed 4 trials and the problem in each trial was slightly different; in two of the trials the
trouble was with an IP-phone, and in the other two the trouble was that a webpage could not be seen. After all the
trials, we carried out interviews to collect some subjective evaluations. We also compared how long each subject
tried to fix the problem before calling the call-center (trial time).
Result
TS, Satisfied (8):
  Because the problem was solved smoothly. (4)
  Because the operator told me “why”. (2)
  Because the operator told me slowly and carefully. (2)
TS, Unsatisfied (2):
  Because I’m tired of listening to the repetitions of the explanations. (1)
  Because I couldn’t fully understand what the operator told me. (1)
CS, Satisfied (7):
  Because the problem was solved smoothly. (5)
  Because the operator told me slowly and carefully. (1)
  NA (1)
CS, Unsatisfied (3):
  Because I just followed the instruction by the operator and couldn’t understand the reason for the operation. (2)
  Because the communication with the operator wasn’t long enough. (1)
Table 1. Answers to the question asking whether the subject was satisfied with the support or not, and why.
Table 1 shows the answers to the question “Were you satisfied with the support provided or not?”. The numbers in parentheses indicate the number of subjects. It shows that most of the subjects in CS and TS were satisfied with the support provided. We can say that most users feel a certain amount of satisfaction if the problem is solved. Some of the subjects were concerned about the “kindness offered” or the “speed” of the solution. A distinctive trend for the TS was that “understanding” was one of the reasons for satisfaction. This satisfaction included excitement and happiness due to discovery, such as “(subject A) The explanation of the mechanism made me feel an affinity with the system though I wasn’t interested in it before”, or “(subject B) I ignored the structure of the Internet up to now, but I learned about it for the first time. I feel that I have gained a little more knowledge!”.
Based on the interview responses, we focused on the change in attitude of the subjects who stated that their satisfaction was due to “understanding”.
In the first trial, subject A immediately called the call-center when she faced a trivial trouble; she couldn’t find an icon indicated in the manual in the chapter “DNS setting”. However, the error was not in the DNS setting, but in the proxy setting. In the second trial, she couldn’t log in to the modem and immediately called the call-center again. She was again unable to execute a particular operation indicated in the manual; the error she had to rectify involved the wiring of the LAN cable. In the 3rd and 4th trials, however, she tried to tackle the problem even when she couldn’t follow the particular page in the manual; she turned to other pages in the manual and tried other operations she had not tried before. In the last trial, she even searched for the “windows control panel”, which is not described in the manual. Although she couldn’t achieve the goal by herself in either trial, she tried for more than 10 minutes to tackle the problem, whereas she had tried for only 2.5 minutes in the first trial and 8 minutes in the second.
In the first trial, subject B first checked the cable wiring and then tried to log in to the modem. After a moment, she called the call-center because she wasn’t able to log in to the modem. In the second trial, she couldn’t find an icon indicated in the manual, as in subject A’s first trial. In the 3rd and 4th trials, however, she changed her strategy for finding the error. She searched widely in the manual first to figure out where the problem was and what to try. As a result, she successfully found the appropriate page in the manual and was able to solve the problem by herself. In the 4th trial, she chose the wrong page in the manual and got stuck executing what was written there. However, although she noticed that she could not follow the manual, she didn’t give up; she searched for another page in the manual, found the appropriate one, and succeeded in rectifying the error by herself. Subjects A and B advanced their skills, and their attitudes seem to have changed through the trials. This trend was duplicated by other subjects who stated that “understanding” yielded satisfaction.
The subjects who were unsatisfied, on the other hand, didn’t show such a change. For example, subject C, who stated “I am tired of listening to the explanation”, called the call-center even faster in the 4th trial than in the 1st trial. This result shows that even if we provide the same information, the attitudes of the novices might not be the same. The key to their attitude may be the feeling of satisfaction that comes from understanding.
Figure 2. Transition of the trial time
Next, to elucidate the overall trends in CS and TS, we measured the length of time spent before they called the call-center. Figure 2 shows the proportion of the length of each trial to the length of the first trial. As we can see from the figure, the trial times of the TS members varied across trials. The subjects who felt happiness due to an increase in understanding had long trial times, while the trial times of most of the subjects who were not satisfied with the support decreased. On the other hand, all CS members showed a uniform decrease in trial time. The motivation of the CS members to conduct troubleshooting seemed to decrease as the experiment progressed.
The CS members experienced the current call-center support, which makes the user operate without understanding the meaning of the operation, and we can say that this support would lead to a stronger reliance on others. It should be noted that it is not clear whether the TS users really understood the meaning of the operator’s explanation or not, but the “feeling of understanding” may lead the user to the next step.
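A minimal sketch of the normalization used for Figure 2 follows: each subject's trial times are divided by her first trial time. The times below are illustrative only, loosely echoing the pattern described for subject A; they are not the experiment's data.

# Hypothetical sketch: express each trial time as a proportion of the
# subject's first trial time, as plotted in Figure 2. Times are invented.
trial_times_minutes = {
    "subject_A": [2.5, 8.0, 10.0, 12.0],
    "subject_C": [5.0, 4.0, 3.0, 2.0],
}

for subject, times in trial_times_minutes.items():
    normalized = [t / times[0] for t in times]
    print(subject, [round(x, 2) for x in normalized])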
Experiment 2
Operating a computer without knowing what one is doing is said to induce cognitive anxiety (Kaiho, 1991). We assume that to correct the novice’s attitude, we must provide an environment in which the user can eliminate these feelings of anxiety. The subjects who stated that their satisfaction was due to an increase in “understanding” in the first experiment may have cast aside their anxieties, which led to a change in attitude.
The second experiment focused on how trouble-based learning impacted the user’s anxiety and motivation in tackling problems.
Method
Another 20 women (ages 42-57) were divided into two equal groups: TS and CS. We adopted the IP-TV phone as the material and, to make the experimental conditions uniform, selected subjects who were novice computer users and who had never used an IP-TV phone before. Four trials, each with a slightly different problem, were conducted by each subject. The experiment proceeded as follows:
1. Setup phase: The call-center helped the subject set up an IP-TV phone over another phone. Each TS was told about the system mechanism together with the meaning of each step of the procedure at the end of the call. Each CS, on the other hand, received no additional explanation.
2. Troubleshooting phase: The subject was told to find and rectify an operation error (voice output was muted) made by a family member. An electronic instruction manual was provided for the subject to access for self-troubleshooting. The subject was allowed to call the call-center if needed.
3. Second troubleshooting phase: Basically the same setting as the first troubleshooting phase, except that the trouble was that both speech and video output were blocked.
To investigate motivation, we asked 44 questions that fell into 3 groups: computer anxiety, confidence in using IP-TV phones, and interest in IP-TV phones, at the beginning and at the end of the experiment. Also, a question was asked to determine whether the subjects felt difficulty in using IP-TV phones or not.
Results
[Figure 3: proportions of TS and CS members whose responses increased, remained constant, or decreased on each question group (computer anxiety, confidence in using IP-TV phones, and interest in IP-TV phones)]
The results show that for most subjects in both groups, computer anxiety fell, confidence in using IP-TV phones increased, and interest in IP-TV phones also increased (Figure 3). This suggests that, for most of the subjects, repeated troubleshooting increases the user’s motivation regardless of any additional explanation that may be provided.
This result seems to be inconsistent with the previous experiment, but a more detailed examination of the results yielded an interesting suggestion.
(1) Computer anxiety
Answers to 1 item differed significantly (p < 0.01): ambiguous computer anxiety (e.g., “I stay away from computers as I am afraid of them”) was reduced for CS members. For TS members, the 3 items that differed significantly indicated that more concrete anxieties (e.g., “I feel difficulty in understanding the technical aspect”) involving technical hurdles were diminished.
(2) Confidence in using an IP-TV phone
For CS members, 4 items differed significantly before and after the experiment; all the items suggest an unwarranted increase in confidence (e.g., “I can use an IP-TV phone without any help”). For TS, on the other hand, 6 items differed significantly (p < 0.05 for 5 items, p < 0.01 for 1 item), all indicating a valid increase in confidence in a limited technical situation (e.g., “I can use an IP-TV phone if I have a manual”).
(3) Interest in IP-TV phones
For CS members, one item, "I want to be able to set up an IP-TV phone by myself", increased significantly (p<0.05) over the experiment. For TS members, one item, "I want to develop a new way to use an IP-TV phone", increased significantly (p<0.05).
Looking at the three results mentioned above, we can extract a strikingly similar tendency: unlike the CS group, the TS group experienced a reduction in concrete anxiety about technical matters and an increase in confidence in specific technical situations. CS members, on the other hand, experienced a reduction in vague anxiety and an increase in confidence, but only in an ambiguous way.
We assume that if a vague anxiety (such as "I am afraid of computers") can be changed into a more concrete anxiety ("I want to be able to set the browser") that can be resolved, the subject gains more confidence in her own ability. We believe that this would lead to enhanced motivation and, moreover, to further exploration.
In addition to the above results, the TS members felt less difficulty in using IP-TV phones (p<0.01), unlike the CS members. This reduction in the feeling of difficulty, which is related to understanding of the system, may help to effect the above-mentioned change in the feelings of TS members.
However, we think that these suggestions should be confirmed in subsequent experiments. The feeling of anxiety is not so simple, so we assume that we will have to observe each subject in more detail over a longer span and precisely identify the actual changes in their feelings.
References
Carroll, J. M., & Rosson, M. B. (1987). Paradox of the active user. In J. M. Carroll (Ed.), Interfacing thought: Cognitive aspects of human-computer interaction. Cambridge, MA: MIT Press.
Kaiho, H., Harada, E., & Kurosu, M. (2000). In Cognitive interface (7th ed.) (in Japanese). Shin-Yo-sha.
Nojima, H. (1992). The role of "others" in using computer networks (in Japanese). Advances in Japanese Cognitive Science, 5, 49-71.
Tsunoda, S., & Miyake, Y. (1990). Learning stagnation using word processors. Proceedings of the 6th Human Interface Symposium (in Japanese), 79-84.
The Interplay Between Instructor Beliefs about “Best Practices” in Teaching
and Actual Practices in Online Learning: A Case Study in Higher Education
Bessie Nkonge
North Carolina A&T State University
Introduction
The focus of this paper is to describe the relationship between instructor beliefs about “best practices” in
teaching and their actual practices in the online learning environment. This report is a subset of a larger study
conducted in an institution of higher education to explore online instructor pedagogical perceptions, beliefs and
practices.
Literature on cognition and behavior suggests that people tend to behave in ways that are consistent with or
support their beliefs. From this research, it may be assumed that instructors do not engage in classroom practices
that contradict their beliefs about how best to teach. However, the nature of online learning may challenge them to
modify and adopt new teaching philosophies that are more aligned with that environment. Furthermore, the
discrepancy between beliefs and actual practices may be attributed to the instructors’ ability to use the technology
effectively, or the perceived capacity of the technology-mediated environment to support some of the instructor
beliefs about “best practices”.
Research Questions
The following questions guided this study:
1. What are faculty beliefs about “best practices” in teaching?
2. What are the patterns of pedagogical practice among faculty in Web-based instruction?
3. What is the degree of congruency between faculty beliefs about teaching, and their actual teaching
practices in Web-based instruction?
believe to be “best practices” in teaching (Becker & Ravitz, 1999).
In a study on technology adoption, in which exemplary practices in the field were considered, Lan (2001)
stated:
Although accessing the Web is relatively easy, learning to harness its full potential is not so simple.
Integrating technology into the classroom requires a clear vision and identifiable goals. For faculty to share the
vision and goals, they must perceive the vision and goals to be relevant to their discipline and profession, valuable to
their practice, and reasonable to pursue. In addition, incentives such as technical and pedagogical support must be
present to sustain the faculty culture of innovation. (p. 393)
In higher education institutions with limited resources, the rapid expansion of Web-based teaching may
have created a pedagogical challenge with more questions than answers. To teach in the evolving environment,
instructors are struggling with issues of designing Web-based courses for different learners; redefining their role in
the design, development, and implementation of online instruction as well as identifying the most effective teaching
and learning strategies using different technologies. The nature of online instruction forces instructors to undergo
personal transformations. Research indicates that instructors undergo changes in their philosophy and approach to
teaching as a result of their participation in delivering Web-based courses (Brown, Cremer, & Frank, as cited in
Jaffee, 2003). Teaching online requires that instructors reflect more on their practice. In their quest to become more
effective online instructors, some of the instructors’ beliefs and/or practices may change. Besides the rigor of
redesigning their courses for the online environment, instructors have to rethink some, if not all, of the teaching
strategies that they employed in the traditional classroom. Both teacher and student roles have to be carefully
reexamined and refined for the virtual environment. The transition may take time, as some may struggle with the
traditional 'sage-on-the-stage' style of teaching, while at the same time trying to master the use of new technologies.
The challenge, therefore, is how to incorporate active learning tasks including discussions, assignment exercises,
and group projects that draw students into the learning process.
As a lens for evaluating learning in general, higher education institutions have adopted the “Seven
Principles of Good Practice in Undergraduate Education” originally published in the AAHE Bulletin (Chickering &
Gamson, 1987). New communication and information technologies have been developed since, and are reflected in
the article “Implementing the Seven Principles: Technology as Lever” by Chickering and Ehrmann (1996), who
revisited the “Seven Principles” and highlighted some of the most appropriate ways to use computing technologies
to advance higher education.
As is the case with the traditional classroom, the online classroom imposes constraints on certain teaching
practices such as frequency of communication, assignments, online chats, and grading. The instructors’ pedagogical
beliefs influence the learning outcomes to the degree that they are incorporated into the teaching practices.
Methodology
Following Yin's (2003) case study methodology, data relating to instructor pedagogical beliefs about "best practices" and to actual practices were gathered and analyzed.
The research was conducted in a medium-sized institution of higher education and involved eight
instructors teaching fully online classes. The instructors’ backgrounds were diverse in terms of online teaching
experiences and academic disciplines. Data were gathered through semi-structured interviews, class ‘observations’
and analysis of course artifacts such as syllabi, lecture notes, assignments and discussion threads. The researcher
compiled reflective notes, which were used to support the effort of triangulation by either corroborating or refuting
the evidence from different data sources, thereby strengthening the study. The use of multiple sources of evidence
promoted the development of converging lines of inquiry (Yin, 2003). The data gathered provided support for the
research questions guiding this study.
encouraged to submit quality work, and be granted multiple opportunities to practice to acquire subject matter
mastery.
Assigning authentic projects – Learners become more engaged in the learning process if they are working on meaningful projects. Furthermore, when interesting projects are assigned, learners put more effort and energy into succeeding in the course. Assignments should include hands-on activities, simulations, and problem solving tasks.
Flexibility and Structure – Some instructors believe that flexibility is an essential option in distance learning and should, as far as possible, be incorporated in the learning process. Structure should be provided for learners who need it, and is also necessary to advance learning from one unit to the next. Learners should be given opportunities to be creative and to design their own projects or other assignments as long as the work is tied to course objectives.
Prompt feedback – As a subset of communication, the instructor should provide timely feedback and show
interest in learners’ progress. The strategy reduces anxiety about the course and allows learners to gauge their skills
at each step.
Teamwork – Because collaboration is viewed as a critical skill for today's workplace, online learning should involve group projects in which collaboration and cooperation are encouraged and rewarded.
Relationship Between Beliefs and Actual Practices
Of the eight participants, six demonstrated a high degree of congruency between their beliefs about "best practices" and their actual practices. In considering the
theoretical framework of constructivism as the underpinning theory for effective teaching and learning, it appears
that all the participants incorporated constructivist practices in their classes but in different ways. The general tenet
of constructivism revolves around “knowledge construction” through such things as active engagement of the
learners in producing meaningful artifacts that have relevance to them, problem solving, and support for divergent
thinking. The learner is also the focus of the educational process.
A closer scrutiny of the levels of congruency across all cases revealed that one participant (Brenda), was
perhaps the most congruent in her beliefs and practices. Although she did not define herself as a constructivist, she
believed in student-centered individualized instruction, assigning work that required reflective thinking, and
granting students multiple opportunities to produce acceptable work through practice, all of which confirm a
constructivist teaching style. She viewed technology as “enhancing” and supporting the ideal practices that she had
always believed were best suited for teaching in her discipline. As she summed it up: "I do consider myself lucky in
that I was able to line up what I do with a medium that has allowed me to do it, I think, better, more effectively.”
Another participant, Morgan, who labeled herself as both constructivist and behaviorist, and viewed her
practices as largely dependent on the environment in which she works, was more like Brenda in terms of the
congruency between her beliefs and actual practices in teaching. She was realistic about the composition of a typical
class where diversity in learners is the norm rather than the exception. To that end, she adopted teaching practices
that were both constructivist and behaviorist depending on the approach that best suited individual learners. Her
actual classroom practices were very much in line with what she set out to accomplish, allowing students to choose
projects that interested them, encouraging interactivity and cooperation, urging students to question their views in
threaded discussions, and providing structure and close guidance to those who needed it. Morgan viewed teaching as
very complex with no easy answers. The lens through which she assessed her beliefs and practices led her to
conduct her classes in ways that were aligned with her fundamental beliefs about the importance of understanding
the context where learning would take place. Although Morgan stated that she did not abandon her beliefs, she was
well aware that at times, her beliefs and practices were in conflict primarily because, occasionally, she had to alter
her practices in order to accommodate her students’ needs. For example, she felt that there are too many unrealistic
standards in the education curriculum that stifle rather than inspire excellence. Her students, who are mainly full-
time teachers, were not given time off from work to attend classes. They were always pressed for time, and had little
room to practice what they learned. Instructors also do not often have the time to reflect on how they are teaching.
As Morgan stated, “You can have your own set of beliefs (philosophy), but the world in which you operate makes a
great difference in the way you actually practice.” It would therefore be unreasonable to judge this incongruity
negatively.
A common thread among Allen, Lauren, and Sally is that they viewed themselves as having undergone a transition from behaviorism to constructivism. As with all the other participants, they incorporated constructivist practices such as hands-on activities in which artifacts were produced, and involved students in problem solving and in knowledge exploration beyond the textbook. The emphasis on structure by some participants, particularly Lauren and Sally, appeared to contradict their constructivist philosophy, but in essence it did not. Structure was necessary to move the class along, and it gave some learners a roadmap of how to proceed in the course. These individuals had no inclination to venture beyond the syllabus or take advantage of the flexible options available to them to accomplish
course objectives. Other constructivists such as Morgan, Walter, Victor and Sam enforced structure when necessary
for similar reasons.
Sam and Walter both believed that delivering quality education was the guiding factor in their online
courses. Although neither one of them defined themselves as constructivists, their teaching practices proved
otherwise. It appears that they incorporated more extensive constructivist activities in their classes than did Allen,
Lauren or Sally who fervently described themselves as constructivists. They created a truly challenging and
engaging environment for their students where many student-centered activities were incorporated. These included:
(a) field trips, (b) simulations, (c) on-site work experiences, (d) hands-on activities to create artifacts, (e) guest
speakers in the virtual chatroom, (f) multimedia presentations, and (g) collaborative projects. The activities
accentuated Sam and Walter’s commitment to students’ learning through exposure to a variety of experiences and
resources for building knowledge. Apart from some difficulties relating to downloading information or navigating
the course due to design flaws, the guiding principle of delivering quality education was upheld in Sam and Walter’s
classes.
The research also revealed that instructors do not always practice what they say. An analysis of course
artifacts showed that incongruity existed between the stated beliefs and classroom practices. In most instances, the
instructors were not aware of the discrepancies between their beliefs and actual practices. Some instructors,
however, consciously modified their actual practices to adapt more readily to the learners’ needs, although their
basic beliefs dictated otherwise. Their classroom practices were not only influenced by their beliefs, but also by the
unique qualities relating to their disciplines, the learners, as well as the overall “ecology” of the online learning
environment.
Lauren, perhaps, appears more incongruent in her beliefs and practices than either Sally or Allen. She
spoke passionately about the importance of group interactivity but in reality, did not assign any collaborative work.
She also spoke about her belief in understanding students’ prior knowledge in order “to get them from where they
are to where you want them to be in the content.” However, there were no instances of background knowledge
probes prior to introducing new materials or discussion topics (see Tables 1 and 2 for samples of congruency and
incongruity between beliefs and practices).
Allen did not participate in the online discussions as he purported to do. In addition, he failed to honor some of his own protocols, such as providing feedback and direction for the discussions. The learners never engaged in mutual discourse at any time, as they only responded directly to instructor-posted topics. The formal dialogue intended to encourage learner-learner interaction and knowledge exchange never materialized. Moreover, there was an absence of intervention by the instructor to correct the problem.
Some of the inconsistencies between beliefs and actual practices can be explained by instructors' limited awareness of the capabilities of the technologies available. For example, Sam believed in assigning group projects and collaboration among students, yet he failed to utilize the 'Groups Management' feature in the course management system, which would have facilitated the process. Thus, a major component of his teaching was not as successful as he had wanted it to be.
Victor stands out as the only participant who neither rated the congruency between his beliefs and practices highly nor defined himself as a constructivist. However, following in the footsteps of the other participants, he did adopt constructivist practices such as problem solving, engaging students in authentic projects, and personalized assignments. It was evident that there was a conflict between what Victor wanted to accomplish and what he was ultimately able to accomplish in his classes. He attributed this to the following: (a) limitations in his own technical skills, (b) the time he was willing to devote to designing ideal courses, and (c) the current capabilities of available technology. One of his disillusionments with online learning is the inability to replicate real-life experiences, especially in technical disciplines. Based on these subjective observations, Victor concluded that some of his perceptions of "best practices" were currently not fully actualized in practice. By his own assessment, the level of congruency between his beliefs and practices was therefore realistically only 75%.
Table 1: Sample Elements of Congruency Between Instructor Beliefs and Practices
Student-centered – "I think a good teacher - in my area, … would need to be prepared to meet student where he or she is…"
Problem solving – "I tell them where the breakdown has occurred and invite them to go ahead and find what is wrong . . ." (Brenda)
Active learning – "In your opinion, of the educational philosophies presented in this chapter, which one best represents your educational philosophy and why?" (threaded discussion topic – Allen)
Encourage communication – "I was hoping other students would have replied to Candice's post by now. I chimed in only because I want to make sure y'all have the information you need to get your work done on time. …I encourage you to …build personal relationship with each other. However, don't forget that you need to post 20 'meaningful/quality' posts…" (Lauren)
Encourage creativity – "Pick one visual situation with good potential for interpretation…It should however, appeal to you in either a positive or negative way (so it can lead you to write at least 200 words about it)." (Morgan)
Prompt feedback – "For those who sent and posted information early, your evaluations have been posted in the gradebook. For the remainder of the class, posting for Unit 3 will be as usual on the Tuesday following the close of the Unit" (Sally)
Quality work – "It should be understood that this course requires extensive work and input from each student… It is expected that the requirements for this course must be done to graduate level quality." (Walter)
Table 2 (continued): Sample Elements of Incongruity Between Instructor Beliefs and Practices
… materials. Sam's own reference to "high standards" was not related to the course design, but the discrepancy between the design and the content that he wanted to convey was well worth noting.
Role as a teacher/communication – Unlike many of the participants, Sally did not view her role as that of a facilitator, but as a dominant, controlling authority on the conduct of her online classes. Although she diligently worked on the quality of online discussions, she was only a passive participant in them, with only an occasional intervention to recognize outstanding work. Students essentially took over the control and direction of the discussions, thus nullifying the importance of instructor involvement in all aspects of the class activities.
Blank slate theory – Lauren stated, "I don't believe in that blank slate theory. Students come in with experiences. You need to connect with those experiences to get them from where they are to where you want them to be in the content." However, there was no evidence of conducting background knowledge probes prior to introducing new materials or discussion topics.
The underlying justification given by the instructors for the degree of congruency between their beliefs
about “best practices” and actual practices was that their beliefs essentially guided their teaching behaviors in the
classroom. Some of the instructors stated that they needed to model their beliefs by aligning them closely with their
practices. Others described undergoing an evolution in their beliefs and/or practices as they matured as instructors.
For most, their beliefs shaped their practices but for others their practices shaped their beliefs. Some instructors
viewed technology as essentially supporting changes in their teaching practices that were already aligned with their
beliefs. They viewed technology as an asset, offering learners greater opportunities for communication,
collaboration, cooperation, and increased time on task. Some of the instructors who preferred teaching in the
traditional classroom recognized the advantages afforded by the online technologies including the innovative and
meaningful alternatives that supported their teaching practices.
Conclusion
Beliefs remain the best way to predict instructor classroom practices and may lead to discovery of
discrepancies between what instructors say they do and what they actually do. Instructor dissatisfaction with online
learning environments could well be attributed to the lack of perceived congruency between beliefs and practices.
Since instructors wish to accomplish no less in the distance learning courses than in the traditional courses, it is
important for institutions to understand whether pedagogical ideals are being realized in the technology-driven
learning environment. The necessary steps can then be taken to bridge the gap between teaching ideals and actual
practices.
Research in this area is useful for bringing about desirable changes in instructional practices or beliefs, and for promoting more cohesion between beliefs about "best practices" and actual practices where desired. The subject of pedagogical beliefs and their relationship to teaching practices in the online environment has not been thoroughly addressed in the context of higher education.
References
Barab, S. A., Hay, K. E., & Duffy, T. M. (1998). Grounded constructions and how technology can help. TECH
TRENDS, March, pp. 15-23.
Berg, G. A. (2000, June). Early patterns of faculty compensation for developing and teaching distance learning
courses. Journal of Asynchronous Learning Networks, 4(1).
Becker, H. L., & Ravitz, J. (1999). The influence of computer and Internet use on teachers' pedagogical practices and perceptions. Journal of Research on Computing in Education, 31(4), 356-385.
Chickering, A. W., & Gamson, Z. (1987). Seven principles of good practice in undergraduate education. AAHE Bulletin, 39, 3-7.
Chickering, A. W., & Ehrmann, S. C. (1996). Implementing the seven principles: Technology as lever. AAHE Bulletin, October, pp. 3-6.
Clark, C. M., & Peterson, P. L. (1985). Teachers' thought processes. In M. C. Wittrock (Ed.), Handbook of research on teaching (pp. 225-296). New York: Macmillan.
Cuban, L. (1993). How teachers taught: Constancy and change in American classrooms: 1890-1990 (2nd ed.). New
York: Teachers College Press.
Czerniak, C. M., Lumpe, A. T., Haney, J. J., & Beck, J. (1999). Teachers’ beliefs about using technology in the
classroom. International Journal of Educational Technology, 1(2). Retrieved June 22, 2002 from
http://www.outreach.uiuc.edu/ijet/v1n2/czerniak/index.html
Ertmer, P., Addison, P., Lane, M., Ross, E., & Woods, D. (1999). Examining teachers' beliefs about the role of technology in the elementary classroom. Journal of Research on Computing in Education, 32(1), 54-72.
Fullan, M. (1991). The new meaning of educational change. New York: Teachers College Press.
Freitas, F. A., Myers, S. A., & Avtgis, T. A. (1998). Students’ perceptions of instructor immediacy in conventional
and distributed learning classrooms. Communication Education, 47(4), 366-372.
Feenberg, A. (1999, Winter). Distance learning: Promise or threat? Crosstalk.
Gess-Newsome, J. (1999). Teachers' knowledge and beliefs about subject matter and its impact on instruction.
Paper presented at the Annual Meeting of the National Association for Research in Science Teaching,
Boston, MA. (March).
Hara, N., & Kling, R. (2000). Students’ distress with a Web-based distance education course: An ethnographic study
of participants’ experiences. Retrieved January 30, 2003, from http://www.slis.indiana.edu/CSI/wp00-
01.html.
Jaffee, D. (2003). Virtual transformation: Web-based technology and pedagogical change. Retrieved June 3, 2003,
from http://www.unf.edu/~djaffee/virtualtran.htm.
Loucks-Horsley, A., & Steigelbauer, S. (1991). Using knowledge of change to guide staff development. In A. Lieberman & L. Miller (Eds.), Staff development for education in the '90s (pp. 15-36). New York: Teachers College Press.
Olcott, D., Jr., & Owston, R. D. (1997). The World Wide Web: A technology to enhance teaching and learning?
Educational Researcher, 26(2), 27-33.
Pajares, M. F. (1992). Teachers' beliefs and educational research: Cleaning up a messy construct. Review of
Educational Research, 62(3), 307-332.
Powers, S. M., & Mitchell, J. (1997). Student perceptions and performance in a virtual classroom environment
(Report No. NCRTL-RR- 918-506). East Lansing, MI: National Center for Research on Teacher Learning.
(ERIC Document Reproduction Service No. ED 409 005).
Saba, F. (1999). Planning for distance education: Too much focus on delivery systems? Distance Education Report,
3(4), 5.
Smith, P. L., & Dillon, C. L. (1999). Comparing distance learning and classroom learning: Conceptual
considerations. The American Journal of Distance Education, 13(2), 6-23.
Windschitl, M., & Sahl, K. (2002). Tracing teachers’ use of technology in a laptop computer school: The interplay
of teacher beliefs, social dynamics and institutional culture. American Educational Research Journal, 39(1),
165-205.
Yin, R. (2003). Case study research: Design and methods (3rd ed.). Beverly Hills, CA: Sage Publications.
The new global knowledge society and
the ICT implementation in K-12 schools
Eunhee J. O’Neill
University of Virginia
Internationally, many people recognize the concepts of e-learning, information and communication technology (ICT), and the global knowledge society. Scholars have discussed ICT implementation in education. However, few teachers, school administrators, or policy makers fully understand the relevance of these concepts in light of the global knowledge society in which we live, and which will continue to develop; as a result, they fail to realize the urgent need for ICT implementation in schools. Some of them dispute whether or not the use of ICT meets two main educational functions: satisfying people's intellectual desires and preparing individuals to participate in society. A review of relevant literature and an investigation of international comparative studies of the utilization of technology in both schools and society elucidate the need for a proactive approach to the use of technology in education. In addition, basic ideas regarding how core stakeholders can play a role in integrating ICT effectively are introduced.
Introduction
We know that education and research play an ever-increasing role in economic growth. The capacity for
renewal is crucial. In order to realize Sweden’s potential for growth, we need to enhance our ability to generate
knowledge and to translate it into sustainable growth and new jobs. (The Ministry of Industry, Employment and
Communications & The Ministry of Education, 2004)
The Swedish government released the white paper Innovative Sweden: A Strategy for Growth through Renewal in 2004. It is not difficult to see that Sweden emphasizes the relationship between education and the country's economy. It is also clear that, for Sweden, economic growth is based on the ability to produce knowledge. Sweden is but one example among many. The European Council (2000) set a 10-year target to become the most competitive and dynamic knowledge-based economy in the world and placed education firmly at the top of the political agenda in order to meet this challenge. At this point, we cannot help but question how knowledge can be the foundation of the economy and why the role of education is more important than it ever was before.
Traditionally, technology has been considered an important factor in a country's economic growth. With his five stages of economic growth, Rostow (1960) demonstrated that technology would play a role in helping a society to reach the stage of economic maturity. However, Rostow also emphasized that other factors, such as history, culture, and the active political process, interplay with each other to determine the specific content of the stages of growth for each society. For instance, when it comes to the economic success of South Korea, Sorensen (1994) discussed the historical, philosophical, cultural, and political elements relating educational power to the economic success of Korea. Thus, it is difficult to compare the economic development potential of different nations internationally through only a single aspect.
While acknowledging the importance of these various factors, this paper will not specifically discuss all the
factors that indicate the quality of a nation’s educational system, all of which significantly affect its economic
prosperity. Instead, this paper will specifically elucidate why we should focus on implementing technology in
education for the sake of a nation’s development by clarifying first, the characteristics of our society, both the
society of today and that of the future, and second, the meaning of E-learning, and finally, the relationship between
them. Making use of international comparative studies in terms of E-learning readiness and ICT implementation in
K-12 schools, this discussion will culminate in a suggestion of the importance of the roles of core stakeholders,
defined as schoolteachers, school administrators, and policy makers. Finally, this paper will address the idea that the
emphasis on education is not simply for the benefit of the economic growth of the nation, but for the self-
actualization of the individual, as well.
communication. ICT makes the fast and worldwide exchange of information possible, and has the capacity to revolutionize work processes, service delivery, etc." (www.flexibility.co.uk/helpful/glossary.htm).
Since communication is a core function of human life, most segments of society need ICT, which enables people to have prompt access to information and to easily share it with a wide range of people. This is one feature of today's society and economy, which is information-oriented and requires effective communication. Whether an economy will be prosperous or not depends on the efficient utilization of technologies that allow immediate and appropriate information communication. The core technology for these needs is the Internet. In addition, rapidly spreading wireless mobile PCs and cellular phones are accelerating connections to the Internet. These phenomena have changed the nature of the world's economy, bringing forth what will henceforth be called the "New Global Economy" (ADB, 2003).
In the new global economy, the characteristics of goods are different from those of traditional goods. The London School of Economics and Political Science (LSE) indicated that it is necessary, in defining such characteristics, to be aware of the peculiarities of information-based products and the different ways the virtual market and the physical market operate. "Commonly, knowledge-based digital products never run out and can be used repetitively" (2000, cited in Nanclares, 2001). Therefore, the sources of higher productivity increasingly rely on knowledge and information applied to production, shifting away from material goods. Firms' competitiveness and productivity are also dependent on the quality of information and the efficiency of attaining it. The organization of production is changing from a mass, standardized model toward flexible, customized production, and from vertically integrated large organizations to horizontal networks of economic units which participate in different stages of the manufacture of a specific product in interdependent relationships with other countries (ADB, 2003).
In the new global economy and society, which is characterized by knowledge- and information-based productivity, the quality of digital access as a medium for new product delivery is the key issue in achieving social prosperity. Education functions to provide a better quality of digital access, and this is making people pay attention to education's significance in our new society; this is important because the education sector is too often overlooked. The International Telecommunication Union (ITU) reported the New Digital Access Index, the World's First Global ICT Ranking, in 2002. The results of the index suggest that it is time to redefine ICT access potential. Michael Minges of the Market, Economics and Finance Unit at ITU says:
Until now, limited infrastructure has often been regarded as the main barrier to bridging the digital divide. Our
research, however, suggests that affordability and education are equally important factors. (Emphasis mine)
So far, it is clear that the productivity of the new society is based on knowledge and information, and the
prosperity of the society is mainly related to the potential of ICT access. In addition, the emphasis on education is
becoming more significant as one of the important factors in improving ICT access in a society. At this point,
educators should pay attention to how education can work in this relationship that exists between the new
knowledge-based society, technology (ICT), and schools (the education sector).
formal schooling and encompassing learning throughout the lifecycle, the World Bank (2003) asserted that opportunities for learning throughout one's lifetime are vital in order for countries to be competitive in the global knowledge economy, which relies on the use of ideas and the application of ICT. The European Commission (2003) also indicated that "e-learning is increasingly seen as a useful tool for achieving better access to lifelong learning and education for all."
E-learning is characterized by formal and informal education and by information-sharing that uses digital technology. It has substantially increased the opportunities for education (EIU, 2003). Individuals ranging from kindergarten students to older adults use E-learning for their formal or informal education. Also, the range of E-learning usages is almost limitless; that is, E-learning truly becomes integrated into people's lives. Thus, preparing students for lifelong learning is significant in terms of providing them with quality lives in the new society of the 21st century (OECD, 2000; Kozma, 2003).
Consider the example of a Korean family that uses the E-learning system as a core way of living with ICT and digital access. The father, who is 40 years old and has a master's degree from law school, consults customers through his Internet homepage. Because he is well known for his prompt and courteous answers to all questions, the website is always busy with customers. His proficient knowledge of the law attracts many customers, who willingly pay membership fees to use the web site.
His wife enjoys interior design and is learning how to make decorative frames at home. The on-line instructor demonstrates the methods through an Internet video conferencing system. When the wife wants to ask a question or to check whether she is following the lecture correctly, she simply clicks her web camera button and shows her work to the instructor in order to receive feedback. The reason that she chose this interior design web institute is that its site provides members with various services in addition to the lectures; for example, providing friends' clubs, holding face-to-face meetings region by region, offering video-lecture review services, and offering many classes throughout the week so that she can take a class anytime she chooses.
Her oldest son came home from his regular middle school earlier and sat down in front of his computer in
order to “attend” class via the Internet with other students from all over the world. Today, he plans to do a
presentation in an American high school class. In the last class, one of his international colleagues brought up some
cultural issues and asked that students in each country prepare an assignment which would address the cultural
characteristics of the students’ respective countries. After concluding his presentation, he is delighted to see his
friends from other countries clapping to honor his presentation. This young man always studies hard because he
wants to enroll in one of the famous American virtual universities in the future.
His younger, 10-year-old sister is busy outside. She is doing group activities with her classmates. The group members are scattered all over the region where ponds are present. Their homework is to investigate the quality of water in ponds around their neighborhoods and to collect pictures of the various kinds of fish and plants in the ponds. To do this, each member uses his or her own PDA (personal digital assistant) and records and beams the data to the others to communicate. After this stage is completed, they are to upload their final assignment to the class homepage on the Internet. The teacher will check their homework 3 hours later from home.
When dinnertime comes, the family sits together and starts talking about what they have done today. The daughter wants to be excused early because her pen-pal from New Zealand will connect to her PDA messenger very soon. The pen-pal was introduced to her through her private institute teacher. Her class is partnered with a class from a New Zealand school so that they can communicate by e-mail or on-line chatting for the purpose of practicing English. From the New Zealand school friends' perspective, the arrangement is beneficial because they want to know about their Korean friends' lifestyles, hobbies, and so on. Her mother also likes the institute's service, first because the price is not very expensive and second because the daughter is now speaking English very fluently.
After finishing dinner, the father says good-bye to his family, since he has been joining them via the Internet from Japan, where he is on a business trip. He turns off the camera on his laptop and his family turns off the big TV screen connected to the Internet.
The above story shows how E-learning embraces both informal and formal forms of education as a lifelong learning system, how information and communication technology (ICT) works for this system, and how knowledge and information function in the knowledge economy and the information society. Moreover, this story describes how an E-learning environment using ICT fulfills individuals' desires for learning as well as for working successfully within a society. This is the significant difference between past technology and the current ICT in schools and its current and future effects on society.
In other words, the new knowledge-based global society requires innovative technology such as ICT, and E-learning, which makes use of ICT, occurs both in school and out of school. To participate in this new type of learning, individuals need to acquire new ICT skills, and these skills allow individuals to access knowledge. That is, individuals' ICT skills satisfy their needs for "knowing" as well as provide opportunities for "occupation." As
shown in the story, the methods of E-learning vary from working alone to working with a group, with individuals in other countries, and so on (Moeng, 2004; Kozma et al., 1999). Eventually, both knowledge and skills work towards the
individual’s self-actualization—a realization of one’s full potential as a human being (Maslow, 1954). At the same
time, knowledge and skills also allow for the individual’s contribution to the prosperity of a society. One significant
reason why schools should actively implement ICT is that it allows schools to perform both sides of the academic
coin by equally studying academic knowledge and providing training for occupational skills – not only one or the
other.
Therefore, when it comes to ICT implementation in schools, teachers, school administrators, and policy
makers should keep in mind the important functions and roles of ICT in education. They also should understand the
particularities of the new society, and then actively foster the appropriate environments for the students’ learning
and participating in their future society. To do this it is necessary to check the current situation of ICT
implementation and investigate those exemplary countries successfully practicing its implementation.
E-learning Readiness
The EIU (2003) reported the E-learning Readiness Rankings by assessing four different categories: education, industry, government, and society. Each category has four components: connectivity (the quality and extent of Internet infrastructure), capability (a country's ability to deliver and consume E-learning, based on literacy rates and trends in training and education), content (the quality and pervasiveness of online learning materials), and culture (behaviors, beliefs, and institutions that support e-learning development within a country).
According to the report, E-learning readiness indicates a nation's ability to generate, use, and expand Internet-based learning – both informal and formal – at work, at school, in government, and throughout society. Thus, the E-learning Readiness Rankings show different countries' statuses in how they implement ICT in all sectors, including education. In addition, such rankings can suggest ways of encouraging governments to develop E-learning for the advantage of society and the economy (EIU, 2003).
<Table-1> E-learning readiness rankings (countries ranked 14th through 30th shown)
Country | Education score (rank) | Industry score (rank) | Government score (rank) | Society score (rank) | Overall score (rank)
France | 8.00 (9) | 6.81 (17, tie) | 9.13 (8, tie) | 6.80 (18) | 7.51 (14)
Austria | 7.75 (17) | 6.81 (17, tie) | 9.13 (8, tie) | 6.96 (14) | 7.49 (15)
Taiwan | 7.92 (13) | 7.52 (9) | 7.53 (25, tie) | 6.89 (17) | 7.47 (16)
Germany | 7.80 (16) | 6.48 (24) | 9.07 (11) | 7.44 (11) | 7.45 (17)
New Zealand | 7.83 (14) | 7.55 (8) | 7.53 (25, tie) | 6.38 (23) | 7.37 (18)
Hong Kong | 7.17 (20) | 7.06 (13, tie) | 8.47 (20) | 6.93 (15, tie) | 7.34 (19)
Belgium | 7.83 (14) | 6.26 (25, tie) | 8.67 (18) | 6.93 (15, tie) | 7.19 (20)
Italy | 6.79 (23) | 6.52 (22, tie) | 8.87 (13) | 6.68 (20) | 7.07 (21)
Spain | 6.96 (21) | 6.26 (25, tie) | 9.13 (8, tie) | 6.31 (25) | 6.98 (22)
Japan | 6.71 (24) | 6.52 (22, tie) | 6.60 (32) | 6.33 (24) | 6.53 (23)
Greece | 6.40 (26) | 5.87 (28, tie) | 8.80 (14, tie) | 5.66 (28) | 6.52 (24)
Malaysia | 6.25 (27) | 6.94 (15) | 7.07 (28, tie) | 5.19 (32) | 6.48 (25)
Israel | 6.92 (22) | 5.52 (31) | 6.67 (31) | 7.07 (13) | 6.34 (26)
Portugal | 6.42 (25) | 5.29 (32, tie) | 8.73 (16, tie) | 5.93 (27) | 6.33 (27)
Chile | 5.77 (30) | 5.29 (32, tie) | 7.80 (24) | 6.51 (22) | 6.13 (28)
Czech Republic | 5.28 (32) | 6.65 (20) | 6.40 (33, tie) | 5.58 (29) | 6.11 (29)
Hungary | 5.42 (31) | 6.58 (21) | 6.40 (33, tie) | 5.50 (30) | 6.09 (30)
(Scores are out of 10 and ranks out of 60; category weights for the overall score are education 20%, industry 40%, government 20%, society 20%.)
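The report's overall scores are consistent with a simple weighted average of the four category scores using the weights listed with Table-1 (education 20%, industry 40%, government 20%, society 20%). The short sketch below is illustrative only; the EIU report does not publish its aggregation procedure, and the function and variable names are chosen here for illustration.

```python
# Illustrative check (not from the EIU report): the overall E-learning readiness
# score appears to be a weighted average of the four category scores.
WEIGHTS = {"education": 0.20, "industry": 0.40, "government": 0.20, "society": 0.20}

def overall_score(category_scores):
    """Weighted average of the four category scores (each out of 10)."""
    return sum(WEIGHTS[c] * category_scores[c] for c in WEIGHTS)

# Two rows taken from Table-1.
france  = {"education": 8.00, "industry": 6.81, "government": 9.13, "society": 6.80}
austria = {"education": 7.75, "industry": 6.81, "government": 9.13, "society": 6.96}

print(round(overall_score(france), 2))   # 7.51 - matches the published overall score
print(round(overall_score(austria), 2))  # 7.49 - matches the published overall score
```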
<Table-1> shows overall rankings for 30 of the 60 nations assessed. <Graph-1> through <Graph-5> demonstrate the relationship between the four categories, which are used both for rating countries within each category and for computing the overall ranking. The graphs compare the nations ranked 1st through 8th in E-learning readiness. The purpose of providing these graphs is not to come to certain conclusions, but to understand different countries' strengths and weaknesses in their E-learning environments. In addition, the Y-axis numbers simply refer to relative scores used to compare the rankings. The descriptions of the categories, extracted from the report and summarized below, are helpful in understanding each nation's emphasis in its E-learning system and in deciphering what could be the most effective factors in preparing an E-learning environment.
<Graph-1> Relative overall E-learning readiness scores of the eight top-ranked countries (Denmark, Sweden, Canada, Finland, South Korea, Singapore, the US, and the UK).
<Graph-2> All categories-weak or strong category of each country
Education
• … wealthy and poor communities
• Whether universities commonly offer Internet-based courses and degree programs
• The strength of the educational system as a whole: compulsory schooling years; percentage of GDP spent on education; teachers' payment; and how they are regarded by the community and government
Industry
<Graph-4> Relative rankings by industry
According to EIU's report, the ranking is based upon the following criteria:
• Internet access and usage within each economy’s major sectors: tertiary (services), secondary (manufacturing),
primary (agriculture and mining) and governmental
• Whether the Internet is exploited across each of these sectors, among small and large organizations alike, to
reach customers, enhance internal processes and train staff.
• Among the questions we asked: how do companies regard online degrees when selecting employees? Are
employees enthusiastic about Internet-based training programs?
• The E-learning industry itself, assessing the ease with which an E-learning company can set up and provide
services within a country’s regulatory regime
Government
(Graph: relative rankings by government)
According to EIU's report, the ranking is based upon the following criteria:
• Its attitude towards the Internet and E-learning within its own agencies, within the public …
Society
(Graph: relative rankings by society)
According to EIU's report, the ranking is based upon the following criteria:
• The population's access to and use of the Internet …
Sweden (Europe) “Under the leadership of a working group comprising representatives of the Ministry of Industry,
Employment and Communications, the Ministry of Education and Science and the Ministry of Foreign Affairs, and
in cooperation with other relevant ministries, Sweden promotes the long-term growth of a knowledge-based
economy” (http://www.sweden.gov.se/content/1/c6/03/25/51/29e722a9.pdf).
Canada (North America) “Canadians are people who innovate, people who create ideas, and people who
implement those ideas. Innovation and learning are crucial to a high standard of living for Canadians. The National
Summit on Innovation and Learning engages with the private sector, the volunteer sector, educational institutions,
unions, and other levels of government and individuals…. Clearly, the Summit will be a milestone event in this
long-term strategy, where Canada will be moving from verification and refinement of the challenges facing the
North American nation to building the foundation for a national action plan that will guide their growth for the next
decade” (http://innovation.gc.ca/gol/innovation/site.nsf/en/in02168.html).
Korea (Asia) “The Korean government possesses a substantial capacity for establishing a knowledge and
information society. Such a society will enhance people’s quality of life through the effectiveness of its economic
and social activities which are to be organized under the umbrella of the country’s future information-oriented
system…. Understanding that information-oriented education is the foundation for building a knowledge and
information society, government, local self-governing bodies, related organizations, public education, information
associations and the private sector will systematically take and perform the necessary roles in order to accomplish
educational reform and the development of human resources.” (MEHRD & KERIS, 2003).
As shown above, these three nations consistently place a strong emphasis on the cooperation between
different entities within the national leadership including the governmental sectors related to science, culture,
economy, and education. Because of the importance of forging a new information-based society, each of these
countries also places an important focus upon cooperating with the private sector in order to achieve its goals. In
addition, these countries address long-term strategies for achieving long-term prosperity and strong national futures.
Considering that these three countries indicate such factors as “knowledge-based economy,” “knowledge and
information society,” “idea,” “learning,” “innovation,” and “educational reform” in their respective blueprints for
achieving national prosperity, in addition to the central role of the national leadership, it is clearly important to
prepare the new knowledge-based society with innovative educational systems. At this point, it is essential to
discuss how educational stakeholders should react to this change in order to make provisions for the new society. By
internationally comparing different countries’ ICT implementation strategies, IEA (Kozma, 2003) determined a
number of implications for schoolteachers, school administrators, and policy makers.
model, students not only collaborate with peers in their class to search for information, they also conduct research
and solve problems. This model requires tasks that are more complex for both teachers and students. Also, teachers
need to acquire new pedagogical skills and students should attain ICT skills, problem-solving skills, and
collaborative skills. Lastly, the outside collaboration model is conducted by providing students with access to
outside experts or to students and teachers from schools within one’s own nation or outside of the country. Teachers
may also use this model to foster inter- or cross-cultural understanding.
Conclusion
The nature of society is changing rapidly towards a knowledge-based economy, in which knowledge itself
is a product and a primary means for bettering human life. In such a knowledge-based society, both individual
success and a nation’s prosperity are determined by the capability of utilizing ICT (Information and Communication
Technology). In addition, an entire society is increasingly controlled by people who work with ICT and for people who rely on ICT. Such a society has more diverse and practically boundless sources for utilizing the knowledge and communication skills necessary to become prosperous. Due to the proliferation of social change, educational systems are also being transformed into E-learning systems which use ICT, allowing people to achieve and produce more knowledge in both breadth and depth. When it comes to educational functions (helping people to attain knowledge and preparing them with the skills to adjust to the larger society), E-learning systems utilizing ICT in education are ideal, since E-learning fulfills both the educational goal of knowledge achievement and that of skill
acquisition. Therefore, it becomes clear that E-learning systems are essential in the new knowledge-based society
both for individuals’ success and for social growth. This is why the impact of education is, and needs to be, greatly
emphasized today more than ever.
Being aware of the changing natures of society and education, all educational stakeholders (schoolteachers,
school administrators, and policy makers) should take active roles to implement ICT in schools, which is critical in
order to operate E-learning, the new trend in education. The SITES Module 2 study (Kozma, 2003) suggests how
these stakeholders can play effective roles in implementing ICT in schools. Teachers should try to employ ICT in
the curriculum with instructionally well-designed teaching models so that students can learn problem-solving,
investigation, collaboration, and product development with ICT skills. School administrators should provide
teachers with a clear vision for the utilization of ICT and should support the teachers’ needs to implement ICT.
Lastly, policy makers should plan educational goals using ICT, apply them to schools, support teachers’ and
administrators’ professional development, invest funding into achieving the aforementioned goals, and then assess
the process and outcomes of ICT implementation. By analyzing the investment and outcomes according to the
educational goals, policy makers will see the effectiveness of their investment so that they will continue to rationally
strategize their on-going investment. Over the course of the entire process, it is important to keep a congruent and coherent connection of goals between vertically and horizontally bridged departments. Through the information from the E-learning Readiness ranking report and international comparisons of leading countries' fundamental strategies in utilizing ICT, this paper also found a direction toward what could be the most desirable strategy for implementing ICT in schools: governmental leadership that is highly organized and associated with other organizations.
However, I chose not to investigate the differences between ranking results and each country’s
particularities such as cultural, political, social and historical background. Although the two reports mention that the
measurements and analyses from comparative studies considered the differences in many variables between
countries, I did not have access to this data and I assume that there would likely be substantial errors when taking
into account a foreign country’s culture, history, and society. Thus, further research on closer comparisons of ICT
implementation regarding each country’s historical, social, cultural, and political background will be beneficial in
adapting more effective and specific methodologies. Moreover, I assume that there would be different strategies in
implementing ICT in developing countries’ respective societies and in participating in the global society. While I
only considered leading countries’ strategies and their school reform, another type of research— one dealing with
both developed and developing countries—would suggest more specific strategies according to nations’ different
characteristics so that we might prevent all countries from trying to imitate the most effective countries’ strategies,
which would likely be inappropriate for different environments. At the very least, understanding the relationship
between a society's background and its effect on the society's policies would be interesting in and of itself.
References
Asian Development Bank, (2003). Key Indicators: Education for Global Participation
Bluton, C. (1999). New Direction in ICT-use in Education, Paris: UNESCO, from
http://www.unesco.org/webworld/com_inf_reports/wcir_99/wcir_en_all.pdf
Clark, D. (Nov 9, 2004). E-learning in Search of a Better Definition, from http://bdld.blogspot.com/2004/11/E-
learning-in-search-of-better.html
Copper, S. (2004). Does education matter? The Journal of Economic Education, 35(1), 98-100.
Economist Intelligence Unit (2003). The 2003 E-learning readiness rankings, written in co-operation with IBM
European Commission (2003). eLearning: Designing Tomorrow’s Education, A Mid-Term Report
International Telecommunication Union, (2003). ITU Digital Access Index: World's First Global ICT Ranking,
Education and Affordability Key to Boosting New Technology Adoption
Kozma, R., (2000). ICT and Educational Reform in Developed and Developing Countries, SRI International, from
http://web.udg.es/tiec/orals/c17.pdf
Kozma, R., ed. (2003). Technology, Innovation, and Educational Change, A Global Perspective, A Report of the
Second Information Technology in Education Study (SITES) Module 2, A Project of the International
Association for the Evaluation of Educational Achievement (IEA) International Society for Technology in
Education (ISTE)
Kozma, R., Pelgrum, W., Owston, R., Vogt, Y., & McGhee, R. (2000). Qualitative Studies of Innovative
Pedagogical Practices Using Technology: Research Design for the Second Information Technology in
Education Study (SITES) Module 2, Menlo Park, CA: SRI International
Maslow, Abraham H., (1954). Motivation and Personality, 2nd Ed.
Maslow's Hierarchy of Needs. from http://www.cultsock.ndirect.co.uk/MUHome/cshtml/popups/maslowh.html
Moeng, B. (2004). IBM Tackles Learning in the Workplace. Message posted to
http://allafrica.com/stories/200411081387.html
Ministry of Education & Human Resources Development (MEHRD) & Korea Education and Research Information
Service (KERIS), (2000~2004). White Paper: Adapting Education to the Information Age. from
http://english.keris.or.kr/etc/whitepaper.jsp
Nanclares, N. H., (2001). The So Called New Economy and the ICT: Concept and Measurement, Oviedo
University, The Brazilian Electronic Journal of Economy, Vol.4, No.1 from:
http://www.beje.decon.ufpe.br/v4n1/nanclares.htm
Organisation for Economic Co-operation and Development, (1999). OECD Science, Technology and
Industry Scoreboard 1999 Benchmarking Knowledge-based Economies, from:
http://www.hi.is/~joner/eaps/wh_oecd9.htm
Rock, A. & Stewart J., (2003). Ministers’ Message: Canadians Speak on Innovation and Learning. Message
posted to: http://innovation.gc.ca/gol/innovation/site.nsf/en/in02168.html
The World Bank Group, (2003). Lifelong Learning in the Global Knowledge Economy: Challenges for Developing
Countries – Ch.1: The Knowledge Economy and the Changing Needs of the Labor Market & Ch.3: Governing the
Lifelong Learning System, from:
http://www1.worldbank.org/education/lifelong_learning/lifelong_learning_GKE.asp
The Ministry of Industry, Employment and Communication & Ministry of Education, (2004). Innovative Sweden: A
Strategy for Growth through Renewal. from
http://www.sweden.gov.se/content/1/c6/03/25/51/29e722a9.pdf
Pre-service Teachers: Development of Perceptions and Skills Pertaining to
Technology Integration
Anne T. Ottenbreit-Leftwich
Cindy S. York
Jennifer C. Richardson
Timothy J. Newby
Purdue University
Abstract
Researchers continually explore more effective means for preparing pre-service teachers to integrate
technology in their classrooms (Jones, Cunningham, & Stewart, 2005). This study investigates pre-service teacher
perceptions of technology integration, as well as the development of their computer and technology integration
skills. Through quantitative and qualitative data sources, this study provides suggestions for the design of a more
effective preparation curriculum allowing for pre-service teachers to be better prepared for implementing
technology in their future classrooms. The results show 96% of the students who participated in the study reported a
positive change in technology beliefs due to participation in the course, as well as an increase in technology skills.
In addition, the results indicated students were apprehensive, but not reluctant to integrate technology into K-12
classrooms. Their open-ended responses showed their largest concerns related to a lack of resources and time to
use technology in the classroom. The implications from this study will be integrated into the course to address
concerns and meet the needs of more students.
Purpose
The research team examined an introductory technology integration course for pre-service teachers in an
attempt to create a more effective, efficient, and meaningful course for future teachers. The following research
questions were investigated:
1. What are students' perceptions of technology integration in the classroom and how does a pre-service
technology integration course contribute to this understanding?
2. What are students’ concerns related to technology integration as they consider entering the classroom?
3. What recommendations would students make to instructors of a pre-service technology integration
course that can help make the course more meaningful?
Theoretical Framework
Since the advent of the computer and the growth in computer ownership, technology integration in
schools has become a major interest within K-12 education research. Some researchers still indicate there is no
evidence of increased achievement due to the use of technology and computers in K-12 education (Cuban, 2001);
however, with the increased use of technology in our society, it is necessary to educate our students by teaching them
21st-century skills (NCES, 2001; U.S. Department of Education, 2004b). Others argue technology increases
achievement in certain areas (Christmann & Badgett, 2003). Computers are becoming commonplace in schools
within the United States, with an average student-to-computer ratio of 5:1 in 2000-2001 (NCES, 2001). Some
research shows these computers are not currently being used to their full capacity (Cuban, 2001; Becker & Ravitz,
2001) emphasizing more teacher-centered tasks rather than student-centered learning experiences (Harris, 2005;
U.S. Department of Education, 2004a). Technology should be used to enhance learning by using it as an integrated
tool in the curriculum to assist students in obtaining, synthesizing, analyzing, and presenting information (U.S.
Department of Education, 2004a). Despite a lack of research evidence, in 1998, a survey conducted by Trotter
showed 74% of the public and 93% of educators agreed that computers had improved the quality of education,
teaching, and learning. In addition, the Department of Education’s 2004 National Education Technology Plan states
“teachers are transforming what can be done in schools by using technology to access primary sources, exposing
students to a variety of perspectives and enhancing students’ overall learning experience through multimedia,
simulations and interactive software”, in addition to tracking student achievement and adjusting instruction to better
meet individual needs (p. 5). One of the action plans established by the Department of Education is that all states,
districts, and individual schools “improve the preparation of new teachers in the use of technology” and to “ensure
that every teacher knows how to use data to personalize instruction” which can be accomplished with technology
(p.15).
While the claims of the Department of Education are debatable, researchers have indicated certain
conditions and teacher characteristics can provide situations where technology integration is more inclusive and
seamless. These can include adequate teacher technical skills, adequate computer access, and a belief that supports
meaningful learning in a more constructivist manner (Becker & Ravitz, 2001; York, Ottenbreit-Leftwich, & Ertmer,
2005). Educational technology has been defined by the Association for Educational Communications and
Technology (AECT) as “the study and ethical practice of facilitating learning and improving performance by
creating, using, and managing appropriate technological processes and resources” (The Meanings of Educational
Technology, 2004, p.3).
With the integration of technology, the role of a typical classroom begins to change and support a more
student-centered approach (Duhaney 2001; Newby, Stepich, Lehman, & Russell 2000). According to Cook and
Cook (1998), when learning experiences include more student-centered instruction, students are more likely to retain the
information. However, when using technology in these student-centered learning environments, it is important to
build in enough scaffolding and resources to guide students, especially those with special needs (Brush & Saye,
2000; Pedersen & Liu, 2003).
The lack of technology implementation could be due in part to teacher education programs, which are not
adequately preparing future teachers to use technology in the classroom (Doering, Hughes, & Huffman, 2003;
Hughes, 2003; Kleiman, 2004; Stetson & Bagwell, 1999; Strudler & Wetzel, 1999). According to Doering, Hughes,
and Huffman (2003), there are very few colleges of education that prepare their graduates to use technology in their
teaching.
The United States Department of Education explains the importance of teacher technology training in the
2004 National Education Technology Plan and The Secretary’s Fourth Annual Report on Teacher Quality. Within
these documents the authors call for action from schools, districts, and most prominently, higher education programs
(U.S. Department of Education, 2004a; U.S. Department of Education, 2004b). Many organizations have attempted
to aid universities in the development of more comprehensive programs addressing technology integration. Perhaps
the largest funding source is the Preparing Tomorrow's Teachers to Use Technology (PT3) program, which supplied
large grants to individual universities attempting to implement technology integration standards into their teacher
education programs. From 1999-2003, PT3 awarded 275 million dollars to 441 teacher training institutions (Brush
et al., 2003). The Society for Information Technology and Teacher Education (SITE) developed standards to guide
pre-service programs toward the crucial elements needed to build pre-service teachers' ability to integrate
technology into their future classrooms. The standards address pre-
service teachers’ abilities to gain technology skills, plan and design technology-enhanced learning environments,
implement technology into the curriculum and activities, assess and evaluate students and learning activities,
continue professional growth with the assistance of technology, and understand the social, ethical, legal, and human
issues associated with technology (ISTE, 2000).
In addition, in 1998, SITE prepared a position paper, providing basic principles of technology integration
and suggested actions for Universities to consider as they design pre-service technology integration components.
The basic principles state technology should be (1) integrated throughout the entirety of the teacher education
program, (2) taught in authentic situations and, (3) shown through technology-enhanced learning environments
within the actual program. The authors proposed programs accomplish these goals by sharing results from quality
technology integration teacher programs, collaborating with K-12 schools with exemplary technology use, working
with a national center for technology and teacher education, training faculty to use technology, providing quality
models of technology-using K-12 teachers, and contributing funds to the development of quality teacher education
technology preparation materials (Thompson, Bull, & Willis, 1998).
Across programs and schools, pre-service teacher technology integration courses have been approached and
formatted in different ways (Marra, 2004); whether through large lectures, small computer lab classrooms, or
internships, many teacher education programs incorporate some method of technology integration instruction
(Snider, 2003). According to a study conducted to discover the current trend of educational technology courses in
pre-service teacher education (Tan, del Valle, & Pereira, 2004), pre-service teachers completed, on average,
approximately one two-credit-hour course in educational technology. In addition, a large number (38.4%) of the
240 surveyed institutions did not require an educational technology course for pre-service teachers, while 53.8% of
the NCATE-accredited teacher education programs required only one educational technology course (Tan, del
Valle, & Pereira, 2004). However, one potential explanation could be that technology is integrated throughout the
teacher education program seamlessly and therefore does not require a separate educational technology course.
Bucci (2003) indicated many programs are using independent courses to facilitate technology integration
instruction. Through this method, many instructors have used lecturing techniques, discussions, and other strategies
to disseminate technology integration information. The single course approach is common in many universities and
colleges in teacher preparation focusing on technology integration. For example, teacher education programs at
Western Michigan University, Purdue University, Arizona State University, and Iowa State University focus
primarily on the single course approach to technology integration training, although course infusion may also play a
role. Other aspects of teacher education programs add to the success of pre-service teachers’ abilities to integrate
technology into the classroom such as open lab sessions, access to quality resources, student-centered learning, and
modeling techniques using technology within university courses (Stetson & Bagwell 1999).
Regardless of approach, technology integration courses are not always as effective in training teachers to
use technology as they could be (Moursund & Bielefeldt, 1999). Typical problems associated with using technology
in teacher education programs usually include "limited use of technology in teacher education courses, an emphasis
on teaching about technology rather than teaching with technology, lack of faculty modeling, insufficient funding
and faculty professional development opportunities, and lack of emphasis on technology in students' field
experiences" (Schaffer & Richardson, 2004, p. 423-424). One of the most problematic results from many teacher
education programs is that pre-service teachers receive a focus mainly on technology skills, as opposed to
knowledge of how, why, and when to integrate technology. “Although beginning teachers report wanting to use
computers, and have gained adequate technical skills, they typically lack knowledge about how to integrate
computers within the more routine tasks of teaching and managing their classrooms” (Ertmer, Conklin,
Lewandowski, Osika, Selo, & Wignall, 2003, pp. 95-96).
Researchers are continually exploring more effective means in preparing pre-service teachers to integrate
technology into their classrooms (Evans & Gunter, 2004; Snider, 2003). Though all courses are established
differently, they all have the same end goal: to effectively prepare teachers to integrate technology in the classroom.
Therefore, this study hopes to determine a more effective preparation curriculum where pre-service teachers will be
better prepared to implement technology in their future classrooms. The implications from this study will be
integrated in the course to address more concerns and meet the needs of more students.
Study Background
The study was conducted at a large Midwestern university where education students are required to take a
two-credit introductory educational technology course. The course consists of a large lecture component for 50
minutes each week, and a small accompanying laboratory (15-29 students per lab, 18 labs total) component for 110
minutes each week. The students within the course (n=429) are primarily education majors (91%), including
elementary (39%), secondary (52%), and special education (3%) majors (See Table 1). Within the College of
Education, students are separated into Blocks (I-VI) and each education major must complete these Blocks in order.
The Blocks include coupled courses in the teacher education program, and incorporate dual-purpose field
experiences throughout those block courses. Within each Block, students participate in a new field experience, titled
Theory-into-Practice, where pre-service teachers work in classrooms, starting with observations and assisting the
teacher (Block I, II, & V), to teaching small lessons (Block III & IV), and eventually leading into their student-teaching
experience (Block VI). The introductory educational technology course is labeled as an independent course, which
allows students to take the course wherever they have space in their schedule. However, most counselors
recommend taking the course freshman year. A majority of the sample consisted of freshmen (56%) and sophomores
(29%), with a small number of juniors (11%) and seniors (4%). Because very few had entered Block I, a majority of
our sample lacked field experience in a K-12 classroom. The research team was comprised of two doctoral students
and one assistant professor in the educational technology program at a large Midwestern university. All three had
background in K-12 education, and were instructors in the introductory educational technology course. The team
worked together to establish the database and collect the data via a secure server maintained by the College of
Education. One of the doctoral students and the assistant professor worked together to analyze the data and write up
the report.
Table 1 Majors.
Major Frequency Percent
Art 4 2.5
Agriculture 5 3.1
Consumer Family Sciences 6 3.7
Technology Education 3 1.8
English 14 8.6
Early Childhood 3 1.8
Elementary 61 37.4
Math 12 7.4
Other 10 6.1
Physical Education 15 9.2
Science 5 3.1
Special Education 5 3.1
Spanish 1 .6
Social Studies 15 9.2
Undecided 4 2.5
Method
This exploratory study used a mixed-methods design to examine what students think about
technology integration after completing an introductory course on technology integration for pre-service teachers.
The study also investigated how to improve the course to better meet the needs and interests of future students, while
adequately preparing them for their role as future teachers. Participants were solicited from a large lecture course as
volunteers for evaluating the course. One hundred and sixty-three out of 429 students (38%) volunteered to participate in
the study as a form of extra credit, which was only given if they completed all three measures. The students were assured
the data would not be analyzed until after the semester was over and would be in aggregate form. Evaluation tools were
provided via a secure web-based survey.
Data Sources
The research team used three evaluation tools for data collection. The first evaluation tool was based on the
Stages of Concern survey (SoC), one of three diagnostic tools of the Concerns-Based Adoption Model (Hall, 1978).
For each question, students rated themselves ranging from one (not at all true of me) to seven (very true of me now).
If students found the question to be irrelevant, they answered with a zero. The SoC is a 35-item survey measuring
individual views and opinions of an innovation (Hall, 1978), in this case integrating technology into the classroom.
Specifically, the SoC measures participants’ concerns about an innovation based on stages ranging from stage 0
(awareness stage, meaning the student has little to no concern about the actual topic) to stage 6 (refocusing stage,
meaning the student is focused on how to improve the innovation and participates more in the revision of the
program). The research team also included seven demographic items (name, age, gender, race, current year,
projected grade for teaching, and projected teaching subject area) with an additional area for comments. The SoC
data were analyzed by combining all participants' data and identifying the highest level of concern. The
highest level was distinguished for each individual student, and each stage was tallied to view the number of
students ranking it as their highest stage of concern.
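The tallying procedure described above can be sketched in a few lines of code. The snippet below is only an illustration of the scoring logic (find each student's highest-scoring stage, then count how many students peak at each stage); the item-to-stage grouping and the ratings are hypothetical, and the authors' actual analysis procedure is not reported beyond the description above.

```python
from collections import Counter

# Hypothetical structure: each student's record maps a SoC stage (0-6) to the
# ratings (0-7) that student gave the items belonging to that stage.
def highest_stage(stage_scores):
    """Return the stage (0-6) with the highest total rating for one student."""
    totals = {stage: sum(ratings) for stage, ratings in stage_scores.items()}
    return max(totals, key=totals.get)

def tally_highest_stages(students):
    """Count how many students peak at each Stage of Concern."""
    return Counter(highest_stage(s) for s in students)

# Toy example with two students and three items per stage (illustrative only).
students = [
    {0: [7, 6, 7], 1: [5, 4, 5], 2: [3, 3, 2]},   # peaks at stage 0 (awareness)
    {0: [2, 3, 2], 1: [6, 6, 5], 2: [4, 4, 3]},   # peaks at stage 1
]
print(tally_highest_stages(students))  # Counter({0: 1, 1: 1})
```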
The second evaluation tool used was a computer skills survey intended to measure whether students
perceived a change in their computer skills. Students were asked to report their perceived skills prior to the course
and at the end of the course, which is a limitation of the instrument since they were asked to rate both of these
perceptions at the end of the course. Questions included self-rated perceptions of certain tasks they are able to
accomplish in software programs such as word processing, spreadsheet, and html-editor programs, all programs that
are covered within the pre-service course.
The third evaluation tool contained five essay questions, which are included in the appendix, regarding
suggestions for changes and improvements to the course, as well as students' beliefs of technology integration.
Unfortunately, only 95% of the student volunteers answered these questions (n=155/163). Specific questions
included probes about current beliefs on technology integration and how the course has contributed to those views.
The responses were also coded into categories based on a loose pattern analysis, with more than one possible code
per student due to multiple points in each response.
Results/Discussion
Overall, this study attempted to investigate how a pre-service technology integration course affects student
perceptions of technology integration through three main research questions. From the first question, the team
explored students’ perceptions of technology integration in the classroom and how a pre-service course contributes
to these beliefs, which was answered through two of the essays. When students were asked to state their current
opinions on integrating technology into the K-12 classroom, overall, student responses spoke positively (96%) of
integrating technology into K-12 classrooms, while only 6 students indicated that technology would not be used in
their classrooms and/or that they saw no value in using technology. Table 2 shows the categorized comments of students
based on their current technology integration beliefs and percent agreement. The second essay asked students how
the pre-service course had affected their current view on technology integration. A majority of the students (n=125)
responded that their views had been positively affected by the course, while others made references to the
instructional design concepts within the course (n=15), and that they already knew technology was important prior
to the course (n=11). Therefore, the course seems to have positively changed students’ perceptions of technology
integration.
The second research question examined the concerns students have related to technology integration in the
K-12 classroom. The research team used the Stages of Concern test, along with one essay question to answer this
question. Within the SoC test, student responses were calculated and the highest score was classified as their highest
concern stage. A large number of students (n=87) indicated they were at stage zero (awareness stage). Since the
students are still within their pre-service education, they are classified as non-users and stage zero signifies they are
just becoming aware of technology integration. When the scores were averaged for all stages, it showed high scores
in stages zero, one, and two, with zero being the highest average score. According to Hall (1978), this implies they
are non-users who want more information about technology integration, and have intense personal concerns about
technology integration and its consequences for them. These concerns show apprehension, but not reluctance to
integrating technology into K-12 classrooms (Hall, 1978). The essay question supporting the SoC findings asked
students what their concerns were related to technology integration once they entered the classroom. Interestingly enough, students
ranked resources (36%) as their highest concern, with time (23%) being the next highest concern. Most concerns
were realistic, which is a good sign for this group composed primarily of freshmen and sophomores who have yet to
enter into field placements. They already have realistic concerns for the teaching profession. Other concerns
included: ability to choose the correct technology for their curriculum (19%), student issues, including access
problems and prior skills (18%), the school and district’s perceptions of technology integration (10%), and the
inability to keep up with the new technologies available (6%).
The final research question investigated how the instructors of the pre-service technology integration
course could make the course more meaningful and applicable. Two open-ended questions specifically asked for
suggestions on how to improve the lecture and lab, providing direct complaints, concerns, ideas, and solutions from
the students. Suggestions for lecture improvement included more modeling of different types of technology
integration (29%), more varied topic areas (17%), more examples in varying subjects/grades (14%), and involving
individual students more (12%). Suggestions for lab improvement included more explanations on difficult programs
and/or projects (17%), the pace for laboratory needed to be faster/slower (10%), more relevant examples and
homework (8%), and more concise directions (6%). Another common concern throughout both questions was the
lack of cohesion between lecture and lab. Many students (25%) echoed one student's comment: “I feel that the labs were not connected AT
ALL to the lecture.” Other recommendations were extracted from the SoC test, the skills survey, and the other
essays. In order to meet the needs of the students, the course needs to address the concerns of integrating technology
expressed by students in the SoC and essay questions. The skills survey showed every student improved on every
sub-skill within the four skill programs focal to the course labs (Microsoft Word, Excel, PowerPoint, and
FrontPage). When a t-test was performed comparing each prior-to-course and current sub-skill rating, all results were
significant at the .001 level. For instance, some of these items measured whether students felt they were able to
“create a table with 3 rows and 4 columns” and “add and edit images to a web page.”
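The skills comparison reported above (a t-test on each sub-skill, comparing retrospective prior-to-course ratings with end-of-course ratings) might be computed along the following lines. This is a hedged sketch with invented ratings; because both ratings come from the same student, a paired (dependent-samples) t-test is assumed here, although the paper does not state which form of the test was used.

```python
from scipy import stats

# Hypothetical self-ratings (1-5) for one sub-skill, e.g.
# "create a table with 3 rows and 4 columns"; both ratings were
# collected at the end of the course, one of them retrospectively.
prior_ratings   = [2, 1, 3, 2, 2, 1, 3, 2]
current_ratings = [4, 3, 5, 4, 3, 3, 5, 4]

# Paired t-test, assuming each pair of ratings comes from the same student.
t_stat, p_value = stats.ttest_rel(current_ratings, prior_ratings)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```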
Implications
Based on the three different evaluation tools, the researchers’ intent was to gather information in order to
explore more effective means in preparing pre-service teachers to integrate technology into their classrooms.
Through the design of a more effective preparation course, pre-service teachers will be better prepared to
implement technology into their future curriculum and classrooms. Because the teacher education program
schedules a large lecture with smaller accompanying labs, instructors need to ensure students are active participants
within the large lecture. One method being implemented in the current semester as a result of this study is to
organize the students in the lecture in seated sections based on their labs. Therefore, lab instructors will attend the
large lectures and manage small discussions within the lecture. We also plan to address the common criticism of
lack of continuity between the lecture and lab. Another strategy being implemented in the current semester involves
online lectures being provided for students that focus on the course readings and basics, allowing more time for
discussion in lecture related to integration and real-world experiences. In addition, lab instructors are integrating the
lecture into the labs by spending the first part of each lab discussing course activities, projects, and homework and
how they relate to topics discussed in the lecture.
Most importantly, all technology integration courses should address the concerns of future teachers. We
recommend addressing as many concerns as possible through lecture discussions, presentations, or lab activities.
Since the lack of technology resources and lack of time to dedicate to technology integration were both major
concerns for students, the course should focus on strategies to overcome these potential problems. Other concerns,
such as the ability to choose the correct technology for their curriculum, student issues (including access problems
and prior skills), and the inability to keep up with the new technologies available, are easily addressed through case
studies, vignettes, and discussions in the lecture and labs, along with discussions of potential strategies to overcome
these particular concerns.
Broader implications include informing instructors of other pre-service technology courses in terms of
student concerns and the need to address them. Implications for instructors of large lecture courses in any content
area are also included in terms of instructional strategies that can make such courses more cohesive while allowing
students to see the relevance of the content to the real world.
Additional research is needed in the area of preparing pre-service, as well as in-service, teachers to
integrate technology in effective, efficient, and meaningful ways. Future plans include examining student concerns
and beliefs, as well as better instructional strategies to improve learning not only for the content taught in this
course, but for large lecture courses in general.
References
Becker, H. J., & Ravitz, J. L. (2001). Computer use by teachers: Are Cuban’s predictions correct? Paper presented at
the 2001 Annual Meeting of the American Educational Research Association, Seattle, WA.
Brush, T., & Saye, J. (2000). Implementation and evaluation of a student centered learning unit: A case study.
Educational Technology Research and Development, 38(3), 79-100.
Brush, T., Glazewski, K., Rutowski, K., Berg, K., Stromfors, C., Van-Nest, M. H., Stock, L., & Sutton, J. (2003).
“Integrating technology in a field-based teacher training program: The PT3@ASU project.” Educational
Technology Research and Development. 51(1), 57-72.
Bucci, T.T. (2003). The technology teaching lab: Meeting the ISTE challenge. Action in Teacher Education, 24(4),
1-9.
Christmann, E. P. & Badgett, J. L. (2003). A meta-analytic comparison of the effects of computer-assisted
instruction on elementary students' academic achievement. Information Technology in Childhood
Education, 91-104.
Cook, J. S., & Cook, L. L. (1998). How technology enhances the quality of student-centered learning. Quality
Progress 31(7), 59-63.
Cuban, L. (2001). Oversold and underused: Reforming schools through technology, 1980-2000. Cambridge MA:
Harvard University Press.
Doering, A., Hughes, J., & Huffman, D. (2003). Preservice teachers: Are we thinking with technology? Journal of
Research on Technology in Education, 35(3), 342-61.
Duhaney, D. C. (2001). Teacher education: Preparing teachers to integrate technology. International Journal of
Instructional Media, 28(1), 23-30.
Ertmer, P. A., Conklin, D., Lewandowski, J., Osika, E., Selo, M, & Wignall, E. (2003). Increasing preservice
teachers' capacity for technology integration through use of electronic models. Teacher Education
Quarterly, 30 (1), 95-112.
Ertmer, P. A., Ottenbreit-Leftwich, A. T., & York, C. S. (2005). Exemplary technology use: Teachers' perception of
critical factors. Paper presented at the annual meeting of the Association for Educational Communications
and Technology, Orlando, FL.
Evans, B. P., & Gunter, G. A. (2004). A catalyst for change: Influencing preservice teacher technology proficiency.
Journal of Educational Media and Library Sciences, 41(3), 325 – 336.
Harris, J. (2005). Our agenda for technology integration: It's time to choose. Contemporary Issues in Technology
and Teacher Education, 5(2). Retrieved September 26, 2005, from http://www.citejournal.org/vol5/iss2/
editorial/article1.cfm
Hughes, J. E. (2003). Toward a model of teachers' technology-learning. Action in Teacher Education, 24(4), 10-17.
ISTE. (2000). ISTE national educational technology standards (NETS) and performance indicators for teachers.
Retrieved August 8, 2005, from http://cnets.iste.org/teachers/pdf/page09.pdf
Kleiman, G.M. (2004). Myths and realities about technology in k-12 schools: Five years later. Contemporary Issues
in Technology and Teacher Education, 4(2), 248-253.
Long, D., & Riegle, R. (2002). Teacher education: The key to effective school reform. Bergin & Garvey. Westport,
CT.
Marra, R. (2004). An online course to help teachers “use technology to enhance learning”: Successes and
limitations. Journal of Technology and Teacher Education, 12(3), 411-429.
Moursund, D., & Bielefeldt, T. (1999). Will new teachers be prepared to teach in a digital age? A national survey
on information technology in teacher education. Santa Monica, CA: Milken Exchange on Education
Technology.
Newby, Stepich, Lehman & Russell. (2000). Instructional technology for teaching and learning. Upper Saddle
River, NJ: Prentice-Hall, Inc.
Pedersen, S., & Liu, M. (2003). Teachers’ beliefs about issues in the implementation of a student-centered learning
environment. Educational Technology Research and Development, 51(2), 57-76.
Robinson, R., & Molenda, M. (2004, July 1). The meanings of educational technology. Retrieved September 26,
2005, from http://www.indiana.edu/~molpage/Meanings%20of%20ET_4.0.pdf
Schaffer, S. P. & Richardson, J. C. (2004). Supporting technology integration within a teacher education system.
Journal of Educational Computing Research, 31(4), 423-435.
Schrum, L., Skeele, R., & Grant, M. (2003). One college of education's effort to infuse technology: A systemic
approach to revisioning teaching and learning. Journal of Research on Technology in Education, 35(2), 256
-271.
Snider, S. L. (2002). Exploring technology integration in a field-based teacher education program: Implementing
efforts and findings. Journal of Research on Technology in Education, 34(3), 230-249.
Stetson, R., & Bagwell, T. (1999). Technology and teacher preparation: An oxymoron? Journal of Technology and
Teacher Education, 7(2), 145-52.
Strudler, N. & Wetzel, K. (1999). Lessons from exemplary colleges of education: Factors affecting technology
integration in preservice programs. Educational Technology Research and Development, 47(4), 63–81.
Tan, A., del Valle, R., & Pereira, M. (2004). Required educational technology courses in NCATE accredited
preservice teacher programs. Presented at IST Conference in Bloomington, IN.
Thompson, A., Bull, G., & Willis, J. (1998). SITE position paper: Statement of basic principles and suggested
actions ('ames white paper'). Retrieved September 6, 2005, from http://www.aace.org/site/
SITEstatement.htm
Trotter, A. (1999). Preparing teachers for the digital age. Education Week, 19: 37-43.
U.S. Department of Education, Office of Educational Technology. (2004a). Toward a new golden age in American
education: How the internet, the law and today’s students are revolutionizing expectations. Washington,
D.C.
U.S. Department of Education, Office of Educational Technology. (2004b). The secretary’s fourth annual report: On
teacher quality a highly qualified teacher in every classroom. Washington, D.C.
Evaluation of a Pre-Service Educational Technology Program
Within a U.S. University on the Border
Cheng-Chang Pan
Michael Sullivan
Juan Hinojosa
The University of Texas at Brownsville/Texas Southmost College
Abstract
The presentation is intended to delineate and disseminate measurement and evaluation information with a
focus on computer self-efficacy of Hispanic pre-service teachers enrolled in an undergraduate educational
technology course during the school year of 2004 and 2005 at a southern state university on the border of the
United States and Mexico. The influence of these student teachers’ prior experience with and access to the Internet
on their Self-Efficacy for Technology Integration is also studied.
Hispanics’ Growth
According to a CNN news report on October 18, 2004, Hispanic groups represented almost 14% of the U.S.
population in 2003, which is a four percent increase from 1990. The U.S. Census Bureau also reported a 13%
growth in the Hispanic population from 2000 to 2003. CNN also reported that the projected Hispanic share of the
U.S. population in 2050 is 24%. Given this growth, the present study is timely.
The setting for this study has a population that is 92% Hispanic. We felt obliged to further investigate the
possible implications for pre-service teachers' Self-Efficacy for Technology Integration. For an effective learning
environment, Griggs and Dunn (1996) found the majority of Hispanic-American learners have a tendency for the
following eight preferences:
1. An aesthetically well-designed and cool environment;
2. Conformity;
3. Peer-oriented learning;
4. Kinesthetic instructional resources;
5. A high degree of structure;
6. Late morning and afternoon peak energy levels;
7. Variety as opposed to routines;
8. A field-dependent cognitive style.
Significance
By conducting this study, we have made a big stride toward understanding our students and their
behavior patterns and traits. By collecting and analyzing the students’ learning styles, the instructors were better able
to respond to needs of students. Instructors were also able to more effectively resolve complex technology
integration issues such as cooperative learning with technology. Students’ Self-Efficacy for Technology Integration
at large is deemed a determinant of teaching effectiveness and excellence. By having an established baseline for
both students’ confidence level and comfort zone with respect to incorporating technology into curriculum, the
instructors could use the results to develop a gear-up, or remedial, course prior to or after the mandatory
technology course.
Research Questions
1. To what degree does Self-Efficacy for Technology Integration change from Time One to Time Two
during the year?
2. To what degree does Behavior Pattern interact with Semester on Self-Efficacy for Technology
Integration?
3. What demographic variables can moderate Self-Efficacy for Technology Integration?
Table 1 Reliability Testing of the Self Efficacy for Technology Integration Instrument in Alpha Value by Semester
Fall Semester of 2004 Spring Semester of 2005 Summer Semester of 2005
Subscale Time 1 Time 2 Time 1 Time 2 Time 1 Time 2
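Table 1 reports reliability as alpha values by subscale and administration (the values themselves are not reproduced in this excerpt). Assuming the alpha in question is Cronbach's alpha, a common way to compute it from item-level responses is sketched below; the item matrix is invented, and the authors' own computation is not shown in the paper.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    n_items = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Invented responses to a five-item subscale (rows = respondents).
subscale = [
    [4, 4, 5, 3, 4],
    [3, 3, 4, 3, 3],
    [5, 4, 5, 4, 5],
    [2, 3, 2, 2, 3],
    [4, 5, 4, 4, 4],
]
print(f"alpha = {cronbach_alpha(subscale):.2f}")
```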
Results
Question One
To what degree does Self-Efficacy for Technology Integration change from Time One to Time Two during
the year?
A t-test for independent samples was conducted for the Fall 2004 dataset (n = 73). A t-test for dependent
samples was used to analyze datasets from Spring 2005 (n = 61) and Summer 2005 (n = 30). There is a statistically
significant difference in the mean Self-Efficacy for Technology Integration scores between Time One and Time
Two in each semester (see Table 2).
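As a rough illustration of the comparisons just described (an independent-samples t-test for the Fall 2004 dataset and dependent-samples t-tests for Spring and Summer 2005), a minimal SciPy sketch follows. The score arrays are placeholders rather than the study data, and SciPy is used only for illustration; the authors analyzed their data in SPSS.

```python
from scipy import stats

# Placeholder Self-Efficacy for Technology Integration scores (not the study data).
fall_time1   = [3.1, 3.4, 2.9, 3.6, 3.0]   # Time One respondents (not matched)
fall_time2   = [3.9, 4.2, 3.8, 4.4, 4.0]   # Time Two respondents (not matched)
spring_time1 = [3.0, 3.2, 2.8, 3.5, 3.1]   # same respondents measured twice
spring_time2 = [3.8, 4.1, 3.6, 4.3, 3.9]

# Fall 2004: independent-samples t-test (Time One and Time Two respondents not matched).
t_ind, p_ind = stats.ttest_ind(fall_time1, fall_time2)

# Spring 2005: dependent-samples (paired) t-test on matched respondents.
t_dep, p_dep = stats.ttest_rel(spring_time1, spring_time2)

print(f"Fall:   t = {t_ind:.2f}, p = {p_ind:.3f}")
print(f"Spring: t = {t_dep:.2f}, p = {p_dep:.3f}")
```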
Question Two
To what degree does Behavior Pattern interact with Semester on Self-Efficacy for Technology Integration?
Due to the sparse distribution of participants’ behavior patterns, we collapsed the behavior patterns variable
into a dichotomy: Aggressive Dependent (AD) and Non-Aggressive Dependent (NAD).
On Time One, the AD group represented 57% of the total respondents while the NAD was 43%. Given that
the assumption of equal variances was met (p = .53), a t-test for independent samples was conducted. The mean
Self-Efficacy for Technology Integration score in the AD group does not exceed the mean Self-Efficacy for
Technology Integration score in the NAD group during the year to a statistically significant degree, t(170) = .44, p >
.05. Given that the assumption of equal variances was met (p = .29), a two way analysis of variance was used. No
statistically significant interaction effect between Behavior Pattern and Semester was found, F(2, 166) = .71, p = .50.
No statistically significant difference between the group means for AD and NAD was found, F(1, 166) = .02, p =
.90.
On Time Two, AD and NAD represented 59% and 41% respectively. Given that the assumption of equal
variances was met (p = .13), a t-test for independent samples was conducted. The mean Self-Efficacy for
Technology Integration score in the NAD group does not exceed the mean Self-Efficacy for Technology Integration
score in the AD group during the year to a statistically significant degree, t(162) = -1.16, p > .05. Given that the
assumption of equal variances was met (p = .20), a two way analysis of variance was used. No statistically
significant interaction effect between Behavior Pattern and Semester was found, F(2, 158) = .52, p = .60. No
statistically significant difference between the group means for AD and NAD was found, F(1, 158) = .48, p = .49.
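The two-way analysis of variance described above, with Behavior Pattern and Semester as factors and Self-Efficacy for Technology Integration as the outcome, could be run along the following lines. This is a sketch with invented data using statsmodels; it is not the authors' SPSS analysis, and the column names are assumptions.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Invented records: behavior pattern (AD/NAD), semester, and a SETI score.
df = pd.DataFrame({
    "pattern":  ["AD", "AD", "NAD", "NAD", "AD", "NAD",
                 "AD", "NAD", "AD", "NAD", "AD", "NAD"],
    "semester": ["Fall", "Fall", "Fall", "Fall", "Spring", "Spring",
                 "Spring", "Spring", "Summer", "Summer", "Summer", "Summer"],
    "seti":     [3.4, 3.6, 3.5, 3.3, 3.8, 3.7, 3.9, 3.6, 4.0, 4.1, 3.9, 4.2],
})

# Two-way ANOVA: main effects of pattern and semester plus their interaction.
model = ols("seti ~ C(pattern) * C(semester)", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)
```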
Question Three
What demographic variables can moderate Self-Efficacy for Technology Integration?
After recoding, Sex (Male vs. Female), Work (Full-Timers vs. Non-Full-Timers), Prior Experience with the
Computer (Up to Six Years of Experience vs. Over Six Years of Experience), and Access to the Internet (Yes vs. No)
were each treated as dichotomous variables.
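The recoding just described might look something like the following pandas sketch. The column names, the 35-hour full-time cut-off, and the sample values are assumptions introduced only for illustration; the paper specifies the resulting dichotomies but not the underlying raw variables.

```python
import pandas as pd

# Hypothetical raw demographic responses.
raw = pd.DataFrame({
    "sex": ["Female", "Male", "Female"],
    "work_hours": [40, 15, 0],
    "years_pc_experience": [4, 8, 10],
    "internet_access": ["Yes", "No", "Yes"],
})

# Recode each demographic as a dichotomous variable, as described in the paper.
recoded = pd.DataFrame({
    "sex": raw["sex"],                                  # Male vs. Female
    "work": (raw["work_hours"] >= 35).map(
        {True: "Full-timer", False: "Non-full-timer"}),
    "pc_experience": (raw["years_pc_experience"] > 6).map(
        {True: "Over six years", False: "Up to six years"}),
    "internet_access": raw["internet_access"],          # Yes vs. No
})
print(recoded)
```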
On Time One, given that the assumption of equal variances was met (p = .19), a two way ANOVA was
used. No statistically significant interaction effect between Sex and Semester was found, F(2, 163) = 2.91, p = .06.
No statistically significant difference between the group means for Male and Female was found, F(1, 163) = 1.03, p
= .31.
Given that the assumption of equal variances was met (p = .15), a two way ANOVA was used. No
statistically significant interaction effect between Work and Semester was found, F(2, 166) = .06, p = .94. No
statistically significant difference between the group means for Full-timers and Non-Full-timers was found, F(1,
166) = .002, p = .96.
Given that the assumption of equal variances was met (p = .20), a two way ANOVA was used. No
statistically significant interaction effect between PC Experience and Semester was found, F(2, 163) = .96, p = .39.
No statistically significant difference among the group means for Semester was found, F(2, 163) = 1.83, p = .16.
However, a statistically significant difference between the group means for PC Experience was found suggesting
that our data are unlikely, assuming that the null hypothesis is true, F(1, 163) = 8.52, p < .01. We therefore reject the
null hypothesis in favor of the alternative which states that a difference exists between the PC Experience means in
the population (R² = .05).
Given that the assumptions of equal variances (p = .20) and equal n's were not both met, results of a two-way
ANOVA with Internet Access and Semester as the two factors are not reported. However, a t-test for independent
samples was used to examine the mean difference in Self-Efficacy for Technology Integration between students
with Internet Access and those without Internet Access. A statistically significant difference between the group
means for Internet Access was found, t(167) = 3.38, p < .01.
On Time Two, given that the assumption of equal variances was met (p = .19), a two way ANOVA was
used. No statistically significant interaction effect between Sex and Semester was found, F(2, 157) = .62, p = .54. No
statistically significant difference between the group means for Male and Female was found, F(1, 157) = .08, p =
.78. Given that the assumption of equal variances was met (p = .24), a two way ANOVA was used. No
statistically significant interaction effect between Work and Semester was found, F(2, 157) = .43, p = .65. No
statistically significant difference between the group means for Full-Timers and Non-Full-Timers was found, F(1,
157) = 3.11, p = .80.
Given that the assumptions of equal variances (p < .001) and equal n's were not met, results of a two-way
ANOVA with PC Experience and Semester as the two factors are not reported. However, a t-test for independent
samples was used to examine the mean difference in Self-Efficacy for Technology Integration between students
with more than six years of PC experience and those with no more than six years of PC experience. A statistically
significant difference between the group means for PC Experience was found, t(161) = 4.52, p < .001.
Given that the assumption of equal variances was met (p = .24), a two way ANOVA was used. No
statistically significant interaction effect between Internet Access and Semester was found, F(2, 157) = .11, p = .90.
No statistically significant difference between the group means for students with Internet access and those without
Internet access was found, F(1, 157) = 3.11, p = .80.
Conclusions
This year-long quantitative inquiry was intended to investigate the effectiveness of a Hispanic-dominated
pre-service computer literacy program in a border university. The effectiveness was operationalized and determined
primarily by increased self-efficacy for incorporating computer technology into curriculum upon the completion of
the computer literacy course. A questionnaire comprising three measures (the Self-Efficacy for Technology
Integration Instrument, the Long and Dziuban Learning Style Inventory, and Student Demographics) was
administered on two occasions in each of the three semesters during 2004 and 2005. Data were compiled in an Excel file and
then imported to SPSS v.13 for further analysis.
These student teachers’ confidence in incorporating appropriate technologies into curriculum changed from
Time One to Time Two in the Fall Semester of 2004, the Spring Semester of 2005, and the Summer I Semester of
2005 to a significant degree. This suggests that the mandated computer literacy course was effective. Another
possible indicator of the effectiveness of the course is students' end-of-class grade; however, no relationship
between the end-of-class grade and the overall self-efficacy scores was found. Because there is no significant
difference in total self-efficacy scores among semesters either at the beginning or at the end of each semester,
the student teachers in each semester seemed to start the course with a similar confidence level in terms of
Self-Efficacy for Technology Integration. Upon completion of the course, they appeared to have acquired a similar
confidence level in instructional use of technologies at the class level. A longitudinal study is needed to make a
confident statement in this area.
Based on the test results in the Question Two section, student learning styles did not seem to affect their
overall self-efficacy either at the beginning or at the end of each semester. An aggressive-dependent learner's
confidence in integrating technology into the curriculum did not differ from that of a non-aggressive-dependent learner.
Further analysis should focus on the subscale level of Self-Efficacy for Technology Integration and any potential
impact of the four auxiliary traits of Behavior Pattern.
Students’ demographics: Sex, Work, Prior Experience with the Computer, and Access to the Internet were
considered in an investigation of their moderating effects on overall self-efficacy of this mostly Hispanic student
group. For the purpose of this study, the demographics were turned into dichotomous variables. Results suggested
that students with more than six years of experience using a computer seemed to feel more confident than those
with no more than six years of experience, both when they started the course and when they completed the course.
As computers gain acceptance in the lower Rio Grande Valley, this observation may not hold true in the long run.
Perhaps attention should be placed on these students' socioeconomic background.
Because data were collected from a single university, caution applies when generalizing these
results to similar settings.
Acknowledgement
This paper is dedicated to the memory of Dr. Thomas Eugene Bayston, Jr., who provided great inspiration
for this study.
References
Bayston, T. E., Jr. (2002). A study of reactive behavior patterns and online technological self-efficacy. Unpublished
doctoral dissertation, University of Central Florida, Orlando.
Dziuban, C. D., Moskal, P. D., & Dziuban, E. K. (2000). Reactive behavior patterns go online. The Journal of
Staff, Program, & Organizational Development, 17(3), 171-182.
Griggs, S., & Dunn, R. (1996). Hispanic-American students and learning style. ERIC Digest. Urbana, IL: ERIC
Clearinghouse on Elementary and Early Childhood Education. (ERIC Document Reproduction Service No.
ED393607)
Kysilka, M. L., & Geary, M. (2003). Student personality profiles and success in a virtual high school for at-risk
students. Proceedings of Hawaii International Conference on Education, USA. Retrieved October 1, 2005,
from http://www.hiceducation.org/Edu_Proceedings/Marcella L. Kysilka.pdf
Ouellette, R. (2000). Learning styles in adult education. Retrieved October 1, 2005, from
http://polaris.umuc.edu/~rouellet/learnstyle/learnstyle.htm
Pan, C. (2003). System use of WebCT in the light of the technology acceptance model: A student perspective.
Unpublished doctoral dissertation, University of Central Florida, Orlando.
Pan, C., Tsai, M., Tsai, P., Tao, Y., & Cornell, R. A. (2003). Technology's impact: Symbiotic or asymbiotic impact
on differing cultures? Educational Media International, 40(3 & 4), 319-330.
Sibley, P. H. R., & Kimball, C. (2003). Technology planning: the Good, the bad and the ugly. Retrieved October 1,
2005, from http://www.microsoft.com/education/PlanTMM.aspx
Wang, L., Ertmer, P. A., & Newby, T. J. (2004). Increasing preservice teachers’ self-efficacy beliefs for technology
integration. Journal of Research on Technology in Education, 36(3), 231-250.
Willis, E. M., & Raines, P. (2001). Technology in secondary teacher education. T. H. E. Journal (Technological
Horizons In Education), 29(2), 54, 56, 58, 60, 63-64. Retrieved October 1, 2005, from
http://www.thejournal.com/magazine/vault/A3638.cfm
Examining Barriers Middle School Teachers Encountered in Technology-
Enhanced Problem-Based Learning
Abstract
This study focused on the barriers that middle school teachers face when attempting to implement
technology-enhanced problem-based learning. We examined the perceptions of teachers, administrators, university
faculty, and technical support staff to determine the relative importance of barriers to the implementation of
technology-enhanced PBL. We determined the relative importance of the barriers to be, in order: lack of vision
sharing, knowledge and skills, motivation, capacity, tools, and expectations and rewards.
Figure 3: The performance pyramid (Wedman & Graham, 2004)
These factors are all influenced by the overall culture of the impacted organization. In addition to culture, the vision
(mission, goals) of the organization must be taken into account as well as any resources (workers, capital, time) that
are available to those affected within the organization.
This study was designed to examine the barriers teachers encounter when planning for and implementing
PBL in the middle school classroom using a performance support systems approach. Specifically the research
questions were:
• What are the gaps in performance between expert and typical PBL teachers?
• What barriers do teachers encounter when planning and implementing PBL in the middle school classroom
and what is their relative importance?
Method
Overview
We collected four types of data during fall 2004 to determine the gaps between expert and typical PBL
teachers, and to identify and determine the relative importance of the barriers teachers encounter while
implementing PBL processes. The data sources included a survey of teachers’ perceptions of the barriers
encountered while planning for and implementing PBL (See Appendix A), face-to-face interviews of eight teachers
implementing PBL, observations in four classrooms during PBL lessons, and researchers’ reflective journals.
Procedure
To answer our first research question - What are the gaps in performance between expert and typical PBL
teachers – we observed 13 class hours led by 5 teachers we labeled “expert” or “typical.” Expert teachers were
identified by three criteria: 1) having conducted 3-4 previous PBL units, 2) having attended at least one professional
conference, and 3) being acknowledged as experts by both school administrators (a superintendent, a principal, a project manager) and PBL
support faculty. Teachers who were not labeled “expert teachers” were labeled typical teachers for the purposes of
observation. We used an observation checklist during each observation. The classroom observation checklist was
developed based on a list of PBL best practices synthesized from a review of the literature and interviews with PBL
experts. Two PBL experts reviewed the checklist. This checklist was used to help identify some of the differences,
and hence the gap, between expert PBL teachers and typical PBL teachers. The checklist included six categories
related to implementing PBL: 1) Pedagogical beliefs (student-centered learning), 2) technology use for higher-order
thinking, 3) planning and organizing techniques, 4) classroom management skills, 5) collaboration, and 6)
professional development. We asked each teacher we observed about what type of and amount of professional
development in which they engaged (e.g., PBL workshop, presentation at a professional conference). Each
observation was completed by two researchers, who then came to a consensus on the traits and practices on the
checklist that were observed.
To answer our second research question - What barriers do teachers encounter when planning and
implementing PBL in the middle school classroom and what is their relative importance? – we used a survey,
interviews with stakeholders (eight teachers, two school administrators, one project manager, two technical support
staff, two university PBL faculty members), and our reflective journals. We based the survey on Wedman and
Graham’s (2004) performance pyramid and used it to identify barriers teachers encounter during the PBL process.
The pyramid includes factors such as knowledge and skills, capacity, motivation, environment and tools,
expectations and feedback, and rewards and incentives. The survey included nine forced-choice questions and one
open-ended question. For the forced-choice questions, teachers were asked to indicate whether they agreed,
disagreed, or were unsure if certain supportive factors were present during their PBL efforts (e.g., “Expert PBL
support is available in a timely and helpful manner in our school”; “I have received explicit expectations regarding
the implementation of problem-based learning (PBL) in my school”). For the open-ended question, teachers were
asked to list any specific barriers that they perceived as being personal barriers to PBL implementation.
Interviews were conducted with a variety of stakeholders to examine the perceived barriers to
implementing PBL from different viewpoints. Interviewees were asked about the current and ideal status of both the
organizational support and the PBL practices of teachers.
After completing each observation, the researchers also recorded practices in a reflective journal that may
not have been captured by the observation checklist. These data were used to triangulate the other data sources. We
determined the relative importance of the barriers according to each data source, and then we combined those
per-source rankings to determine the final relative importance.
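The paper does not specify how the per-source rankings were combined into a final ordering. One simple possibility, shown below purely as a hypothetical sketch, is to average each barrier's rank across the data sources and sort by that average; the ranks in the example are invented.

```python
# Hypothetical ranks (1 = most important) assigned to each barrier by each data source.
ranks_by_source = {
    "survey":     {"vision sharing": 2, "feedback and expectations": 1,
                   "knowledge and skills": 3, "motivation": 4,
                   "rewards and incentives": 5, "tools and environment": 6},
    "interviews": {"vision sharing": 1, "feedback and expectations": 2,
                   "knowledge and skills": 3, "motivation": 5,
                   "rewards and incentives": 4, "tools and environment": 6},
    "journals":   {"vision sharing": 1, "feedback and expectations": 3,
                   "knowledge and skills": 2, "motivation": 4,
                   "rewards and incentives": 6, "tools and environment": 5},
}

barriers = ranks_by_source["survey"].keys()
avg_rank = {b: sum(src[b] for src in ranks_by_source.values()) / len(ranks_by_source)
            for b in barriers}

# Final ordering: lowest average rank = most important barrier.
for barrier, rank in sorted(avg_rank.items(), key=lambda kv: kv[1]):
    print(f"{rank:.2f}  {barrier}")
```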
Results
What are the gaps in performance between expert and typical PBL teachers?
Analysis of the observational data indicated that there were large gaps between the performance of typical
and expert PBL teachers on many of the best practices included on our checklist, especially on those practices that
are important to the successful implementation of PBL, and not necessarily to the implementation of other
pedagogical strategies. Some of the biggest gaps between expert and typical PBL teachers were in the areas of
having students self-evaluate and reflect on the problem-solving process, providing students with self-monitoring
guidelines, and collaborating with other teachers. These practices are desirable to maximize the potential of a PBL
unit to promote learning because students engaged in PBL learn not only by engaging in the problem-solving
process, but also by actively reflecting on it (Hmelo-Silver, 2004). Students also gain from being involved in
interdisciplinary PBL units (Stepien, Gallagher, & Workman, 1993), and need to be provided with tools to help them
self-monitor (Brush & Saye, 2000).
What barriers do teachers encounter when planning and implementing PBL in the middle school classroom
and what is their relative importance?
Upon analysis of data, we determined the relative importance of the barriers to be in this order: vision-
sharing, feedback and expectations, knowledge and skills, motivation, rewards and incentives, and tools and
environment. In the paragraphs that follow, we report the data that support this relative importance.
Vision sharing. Interview data suggested that a lack of vision sharing was a major barrier to the effective
implementation of PBL. During the interviews we learned that administrators believed that the overall purpose of
technology-enhanced PBL was to increase student-centered learning through the use of technology, whereas PBL
support faculty involved with the project believed that the goal was to promote pedagogical change through the
implementation of a more student-centered approach to instruction. When we interviewed teachers, they indicated
confusion about the goals of the PBL initiative. When asked why the school expects all teachers to implement PBL,
one teacher said, “That’s where I have a little bit of a problem because I’m not sure what they are trying to
accomplish.” Another teacher stated, “I think they are trying to be innovative and do things other schools aren’t.”
Feedback and expectations. On the survey, only 5 of the 21 teachers agreed that they received regular and
helpful feedback on their implementation of PBL. Many of the teachers that we interviewed mentioned that they
never received feedback based on their implementation of PBL. One teacher asked, “Is there any formal evaluation
or feedback?” Another noted, “They haven’t really checked on us; I don’t really know how they would know if I did
it.”
Knowledge and skills. On the survey, 15 teachers indicated that they have the knowledge and skills to
effectively implement PBL in the classroom, while 6 indicated that they do not. Many teachers also indicated that
they lacked knowledge related to how to plan and implement PBL in their classrooms. One teacher said, “I still
don’t know what I’m doing or if I’m doing it right.” Another teacher explained, “A half day of workshop wasn’t
enough time to develop or even understand.”
Motivation. Fifteen out of 21 teachers indicated in the survey that they were motivated to implement PBL
in their classes, while 3 disagreed and 3 were unsure. When asked in the interview what motivates them to
implement PBL, 4 teachers indicated that it was the students’ engagement in PBL units. One of these teachers
mentioned, “I like to watch them get excited about what they are doing.” However, one teacher noted that the only
motivation that he had was that it was a requirement, explaining, “If my boss says to do it, I do it.”
Rewards and incentives. Nine teachers who completed the survey agreed that the school offered rewards
and incentives for the implementation of PBL, while 9 disagreed and 3 were unsure. Many teachers mentioned in the
interviews that there were rewards in that students gained ownership in and enjoyed learning more. One teacher
explained, “What is neat is that at the end of these PBL units they usually have more questions than answers…all of
a sudden the world is a much bigger place and PBL allows that to happen.” However, some teachers did not
perceive that there were any rewards and incentives. When asked what rewards there were with regards to PBL, one
said, “Nothing for me, except that it’s my job.”
Tools and environment. Teachers completing the survey were also asked to indicate their agreement with
the statement: “The physical environment and tools (hardware, software, network, local and school library, field trip
support etc.) of my school makes it easy for me to implement PBL.” Nineteen teachers agreed, while 1 disagreed
and 1 was unsure. We asked the teachers in the interview sample what other resources and help they needed in
implementing PBL. Some teachers wanted opportunities to team teach. Others stated that they could use more
preparation time.
implementation of technology-enhanced PBL. Others have identified vision sharing as essential to technology
integration and the implementation of new pedagogical techniques (Anderson & Dexter, 2000; Hunter, 2001; Jukes,
1996). “When the vision is not shared, teachers often view the plan as just another example of rhetoric rather than a
substantive commitment to a plan” (Jukes, 1996, p. 14).
Our study also indicated that feedback and expectations were a major barrier to the design and
implementation of PBL units. Many of the teachers that we interviewed mentioned that they never received
feedback based on their use of PBL. Schaffer and Richardson (2004) also found that insufficient feedback relative to
expectations was one of the major barriers to technology integration in the K-12 classroom. Indeed, teachers need
regular, corrective feedback, especially when they implement new teaching methods (Scheeler, Ruhl, & McAfee,
2004; Spencer & Logan, 2003).
The results of this study illustrated the importance of sharing vision, detailing expectations, and providing
feedback to support teachers as they implement technology-enhanced PBL. Many schools focus on acquisition of
technology (hardware and software) instead of sharing a vision of technology-enhanced PBL or providing any
feedback and expectations about teachers’ performance in implementing technology-enhanced PBL. Under these
conditions it is unreasonable to expect teachers’ performance to improve: many teachers are confused about their
roles and want to know both what the school expects from them in their classrooms and whether they are
performing in line with those expectations. A systemic support structure is therefore needed to help teachers make
meaningful uses of technology.
We recommend sharing the vision of technology-enhanced PBL with teachers (e.g., through school strategic
plans) and providing increased opportunities for collaboration among teachers, such as the development of joint
units, peer coaching, and mentoring. Teachers should be encouraged to develop joint units with other teachers and to
give one another consistent feedback. Collaboration with a peer is essential to teachers’ implementation of PBL.
References
Anderson, R. E., & Dexter, S. L. (2000). School technology leadership: Incidence and impact. Teaching, learning,
and computing: 1998 national survey, report #6. (ERIC Document Reproduction Service No. ED 449786)
Arnau, L., Kahrs, J., & Kruskamp, B. (2004). Peer coaching: Veteran high school teachers take the lead on learning.
NASSP Bulletin, 88, 26-41.
Becker, H. J., & Riel, M. M. (1999). Teacher professionalism, school work culture and the emergence of
constructivist-compatible pedagogies [PDF file]. Center for Research on Information Technology and
Organizations. Retrieved October 2, 2002, from http://www.crito.uci.edu/tlc
Berg, S., Benz, C. R., Lasley II, T. J., & Raisch, C. D. (1998). Exemplary technology
use in elementary classrooms. Journal of Research on Computing in Education, 31, 111-122.
Brinkerhoff, J., & Glazewski, K. (2004). Support of expert and novice teachers within a technology-enhanced
problem-based learning unit: A case study. International Journal of Learning Technology, 1(2), 219-230.
Brush, T. & Saye, J. (2000). Design, implementation, and evaluation of student-centered learning: A case study.
Educational Technology Research and Development, 48(2), 79-100.
Ertmer, P. A., Addison, P., Lane, M., Ross, E., & Woods, D. (1999). Examining teachers' beliefs about the role of
technology in the elementary classroom. Journal of Research on Computing in Education, 32, 54-71.
Ertmer, P. A., Lehman, J., Park, S. H., Cramer, J., & Grove, K. (2003). Barriers to
teachers’ adoption and use of technology in problem-based learning. Proceedings of the Association for the
Advancement of Computing in Education (AACE) Society for Information Technology and Teacher
Education (SITE) International Conference, 1761-1766.
Hmelo-Silver, C. E. (2004). Problem-based learning: What and how do students learn? Educational Psychology
Review, 16, 235-266.
Hunter, B. (2001). Against the odds: Professional development and innovation under less-than-ideal conditions
[Electronic version]. Journal of Technology and Teacher Education, 9(4), 473-496.
Jonassen, D., Howland J., Moore, J., & Marra, R. (2003). Learning to solve problems with technology: A
constructivist perspective (2nd ed.). Upper Saddle River, NJ: Merrill Prentice Hall.
Jukes, I. (1996, April). The essential steps of technology planning. School Administrator, 53, 8-14.
Land, S. M. (2000). Cognitive requirements for learning with open-ended learning environments. Educational
Technology Research and Development, 48(3), 61-78.
Monahan, T. C. (1996). Do contemporary incentives and rewards perpetuate outdated forms of professional
development? Journal of Staff Development, 17, 44-47.
Park, S. H., Cramer, J., & Ertmer, P. A. (2004, October). Implementation of a technology-enhanced problem-based
learning curriculum: A year-long case study. Proceedings of the Association for Educational
Communications and Technology (AECT) Conference.
Robinson, D. G., & Robinson, J. C. (1995). Performance consulting: Moving beyond
training. San Francisco: Berrett-Koehler Publishers, Inc.
Sage, S. M. (2000). A natural fit: Problem-based learning and technology standards.
Learning & Leading with Technology, 28(1), 6-12.
Schaffer, S. P., Richardson, J., & Park, S. H. (2003, October). Performance support
systems approach to identifying barriers to technology integration. Proceedings of the Association for
Educational Communications and Technology (AECT) Conference, 329-335.
Schaffer, S. P., & Richardson, J. C. (2004). Supporting technology integration within a teacher education system.
Journal of Educational Computing Research, 31, 423-435.
Scheeler, M. C., Ruhl, K. L., & McAfee, J. K. (2004). Providing performance feedback to teachers: A review
[Electronic version]. Teacher Education and Special Education, 27 (4), 396-407.
Spencer, S. S., & Logan, K. R. (2003). Bridging the gap: A school based staff development model that bridges the
gap from research to practice [Electronic version]. Teacher Education and Special Education, 26 (1), 51-
62.
Stepien, W. J., Gallagher, S. A., & Workman, D. (1993). PBL for traditional and interdisciplinary classrooms.
Journal for the Education of the Gifted, 16, 338-357.
Wedman, J. F., & Graham, S. W. (2004). Welcome to the performance pyramid.
Columbia, MO. Retrieved September 9, 2004, from http://tiger.coe.missouri.edu/~pyramid
Appendices
Appendix A. Survey
1. How many different PBL units have you implemented during your class time or reading time since fall 2000?
___ None ___1-2 units ___3-4 units __ more than 5
2. Have you taken a PBL workshop or Purdue classes offered at Crawfordsville since fall 2000? If so, when and
how many?
Directions: Decide if a statement is totally true, if you are unsure, or if a statement is completely untrue; then circle
the appropriate letter.
4. I have been given enough time to plan and implement PBL. T F U
5. The physical environment and tools (hardware, software, network, local and school library, field trip support
etc.) of my school makes it easy for me to implement PBL. T F U
6. There are rewards and incentives for PBL implementation in my school. T F U
7. I am motivated to implement PBL in my classes. T F U
8. I have the physical and mental capacity to plan, design, and manage PBL in my classroom. T F U
9. I have the knowledge and skills needed to implement PBL. T F U
3. Please describe any barriers you face in planning and implementing PBL (use the reverse side if necessary).
Foundations for Transforming Education
Charles M. Reigeluth
Indiana University
Public education in the United States is an array of highly complex systems whose results have proven
difficult to predict or control. Similarly, the process of transforming a school system is highly complex and
difficult to predict or control. Chaos theory and the sciences of complexity (Kellert, 1993; Wheatley, 1999)
were developed to help understand highly complex systems. They recognize that beneath the apparently
chaotic behavior of a complex system lie certain patterns that can help one to both understand and influence
the behavior of the system. This paper begins with a summary of some of the key features of chaos theory
and the sciences of complexity and then explores the ways that these theories can inform the systemic
transformation of K-12 education in the United States.
Co-evolution
For a system to be healthy, it must co-evolve with its environment: it changes in response to
changes in its environment, and its environment changes in response to its changes. Wheatley says, “We
inhabit a world that co-evolves as we interact with it. This world is impossible to pin down, constantly
changing ….” (Wheatley, 1999, p. 9). A K-12 educational system exists in a community and larger society
that are constantly evolving. But how are they evolving? Toffler (1980) has identified three major waves
of societal evolution. Each has been accompanied by major changes in our educational systems, and
collectively they provide us with examples of co-evolution between educational systems and their
environments. During the agrarian age, the one-room schoolhouse was the predominant paradigm of
education, with its focus on tutoring and apprenticeship. During the industrial age, the factory model of
schools became the predominant paradigm of education, with its focus on standardization and teacher-
centered learning. Now, as we evolve ever deeper into the information age, society is undergoing just as
dramatic a change as during the industrial revolution, and this is putting great pressure on our educational
systems to co-evolve in major ways.
As the pace of changes in our communities and society has been increasing, the need for co-
evolution in education has become ever more urgent. Banathy (1991) has pointed to a large co-
evolutionary imbalance between education and society, which places our society in ill-health and peril.
Schlechty (1990), Caine and Caine (1997) and others have pointed out that our educational systems are
doing a better job than ever at what they were designed to do, but that our society is increasingly calling on
them to do things they were not designed to do.
To identify how an educational system should co-evolve, one issue we must look at is how its
environment has changed. This includes changes in the community’s educational needs, in the tools it
offers to educators, and in other community (and societal) conditions that impact education, such as drugs,
violence, teen pregnancy, and latch-key children. However, an educational system is not just shaped by its
community; it also helps shape its community. Thus, another issue for identifying how an educational
system should co-evolve is the ways the community would like its educational system to change to better
shape the community. Those ways are heavily based on the values, beliefs, and visions of the community.
points:
“I observed the search for organizational equilibrium as a sure path to institutional death.” (p. 76).
“In venerating equilibrium, we have blinded ourselves to the processes that foster life.” (p. 77).
“To stay viable, open systems maintain a state of non-equilibrium…. They participate in an open
exchange with their world, using what is there for their own growth.” (p. 78).
“Prigogine’s work demonstrated that disequilibrium is the necessary condition for a system’s
growth.” (p. 79).
Hence, disequilibrium is one important condition for co-evolution. The other is positive
feedback. Systems may receive both negative and positive feedback. Negative feedback provides
information about deficiencies in attaining a system’s goals so that the system can adjust its processes to
overcome those deficiencies. In contrast, positive feedback provides information about opportunities for a
system to change the goals that it pursues. Thus, positive feedback is information from the environment
that helps a system to co-evolve with its environment. Often it takes the form of perturbances (or
disturbances) that cause disequilibrium in a system.
Perturbance
A perturbance is any change in a system’s environment that causes disequilibrium in a system.
For example, as our society in the United States has evolved into the information age, a new educational
need that has arisen is the need for life-long learning. Rapid change in the workplace and the new reality of
multiple careers during one’s life require people to be life-long learners. To help people become life-long
learners, schools must cultivate both the desire to learn (a love of learning) and the skills to learn (self-
directed learning). However, our typical industrial-age school systems do the opposite on both counts,
placing stress on the environment (co-evolutionary imbalance) and causing the environment to put pressure
(perturbance) on the educational system to undergo fundamental change, or transformation.
Transformation
Disequilibrium creates a state in which the system is ripe for transformation, which is
reorganization on a higher level of complexity. Transformation occurs through a process called
“emergence,” by which new processes and structures emerge to replace old ones in a system.
Transformation is in contrast to piecemeal change, which entails changing one part of a system without
changing other parts or the way the parts are organized (the structure of the system). According to Duffy,
Rogerson and Blick (2000), transformation of an educational system requires simultaneous changes in the
core work processes (teaching and learning), the social architecture of the system (culture and
communications), and the system’s relationships with its environment.
structures that emerge in a system undergoing transformation. Fractals are similar to what Dawkins called
“memes,” which are ideas or cultural beliefs that are “the social counterpoints to genes in the physical
organism” and have the power to organize a system in a specific way (Caine & Caine, 1997, p. 33). One
example of a strange attractor, or meme, in education is empowerment/ownership, which entails providing
both the freedom to make decisions and support for making and acting on those decisions. On the district
level this takes the form of the school board and superintendent empowering each building principal to
experiment with and adopt new approaches to better meet students’ needs and to make other important
decisions (hiring, budgeting, etc.). On the building level the principal empowers each teacher to
experiment with and adopt new approaches to better meet students’ needs and to participate in school
policymaking and decision making. On the classroom level the teacher empowers each student to make
decisions about how to best meet her or his needs. This form of leadership at all levels entails providing
guidance and support to cultivate the ability to make good decisions and act effectively on them.
A second example of a strange attractor is customization/differentiation (or diversity). On the
district level, each school has the freedom to be different from other schools. On the school level each
teacher has the freedom to be different from other teachers. And on the classroom level each student has
the freedom to be different from other students (with respect to both what to learn and how to learn it). A
third example is shared decision making/collaboration. On the district level the school board and
superintendent involve community members, teachers, and staff in policymaking and decision making. On
the school level the principal involves parents, teachers, and staff in policymaking and decision making.
And on the classroom level the teacher involves the child and parents in decisions and activities to promote
the child’s learning and development.
To become an effective strange attractor for the transformation of a school system, the core ideas
and values (or beliefs) must become fairly widespread cultural norms among the stakeholders most
involved with making the changes. Once that status is reached, very little planning needs to be done for the
transformation to take place. Appropriate behaviors and structures will emerge spontaneously through a
process called self-organization.
Self-Organization
Self-organizing systems are adaptive; they evolve themselves; they are agile (McCarthy, 2003).
They require two major characteristics: openness and self-reference (Wheatley, 1999). To be open with its
environment, a system must actively seek information from its environment and make it widely available
within the system.
The intent of this new information is to keep the system off-balance, alert to how it might need to
change. An open organization doesn’t look for information that makes it feel good, that verifies its past
and validates its present. It is deliberately looking for information that might threaten its stability, knock it
off balance, and open it to growth. (Wheatley, 1999, p. 83)
But the system must go beyond seeking and circulating information from its environment; it must
also partner with its environment. As Wheatley (1999) notes: “Because it partners with its environment,
the system develops increasing autonomy from the environment and also develops new capacities that
make it increasingly resourceful.” (p. 84).
A second characteristic of self-organizing systems is the ability to self-reference on the core ideas,
values, or beliefs that give the organization an identity. In this way, “When the environment shifts and the
system notices that it needs to change, it always changes in such a way that it remains consistent with itself.
… Change is never random; the system will not take off in bizarre new directions.” (Wheatley, 1999, p.
85).
A third characteristic is freedom for people to make their own decisions about changes. Jantsch
(1980) has noted the paradoxical but profound systems dynamic: “The more freedom in self-organization,
the more order” (p. 40, as cited by Wheatley, 1999, p. 87). As long as the freedom is guided by sufficient
self-reference, it will allow changes to occur before a crisis point is reached in the system, thereby creating
greater stability and order. Paradoxically, the system is “less controlling, but more orderly” by being self-
organizing (Wheatley, 1999, p. 87). Typically, co-evolution occurs through self-organization, but complex
system dynamics have a powerful influence on self-organization and any resulting systemic transformation.
Dynamic Complexity
According to Peter Senge, social systems have detail complexity and dynamic complexity. The
nature of dynamic complexity is revealed by Senge (1990):
When the same action has dramatically different effects in the short run and the long, there is
dynamic complexity. When an action has one set of consequences locally and a very different set of
consequences in another part of the system, there is dynamic complexity. When obvious interventions
produce nonobvious consequences, there is dynamic complexity. (p. 71)
System dynamics are the web of causal relationships that influence the behavior of a system at all
its various levels. They help us to understand how a change in one part of an educational system is likely
to impact the other parts and the outputs of the system, and to understand how a change in one part of an
educational system is likely to be impacted by the other parts of the system. Dynamic complexity is
captured to some extent by Senge’s “11 laws of the fifth discipline” and his “system archetypes.” The
laws include such general dynamics as:
• The harder you push, the harder the system pushes back.
• The easy way out usually leads back in.
• The cure can be worse than the disease.
• Faster is slower.
• Cause and effect are not closely related in time and space.
• Small changes can produce big results—but the areas of highest leverage are often the least
obvious.
Senge’s (1990) system archetypes include:
“Limits to growth” in which an amplifying process that is put in motion to create a certain result
has a secondary effect (a balancing process) that counters the desired result.
“Shifting the burden” in which the underlying problem is difficult to address, so people address
the symptoms with easier “fixes,” leaving the underlying problem to grow worse unnoticed until it is much
more difficult, if not impossible, to fix.
“Tragedy of the commons” in which a commonly available but limited resource is used to the
extent that it becomes more difficult to obtain, which causes intensification of efforts until the resource is
significantly or entirely depleted.
“Growth and underinvestment” in which growth approaches a limit that can be raised with
additional investment, but if the investment is neither rapid nor aggressive enough, growth will stall and
the investment will appear unnecessary.
“Fixes that fail” in which a fix that is effective in the short run has unforeseen long-term effects
that reduce its effectiveness and require more of the same fix.
Senge’s laws and archetypes identify high-level or general system dynamics, but it is important to
also identify the complex system dynamics at play in a particular educational system. Those dynamics are
complex causal relationships that govern patterns of behavior, explain why piecemeal solutions are failing,
and predict what kinds of solutions may offer higher leverage in transforming a system to better meet
students’ needs.
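To make one of these archetypes concrete, the sketch below (not from Senge or the author; all quantities invented) expresses "limits to growth" as a simple difference-equation model in Python: a reinforcing process drives growth, while a balancing process strengthens as the state approaches a constraint, so pushing harder on the reinforcing loop yields diminishing returns.

    # Illustrative sketch of the "limits to growth" archetype as a difference equation.
    # All numbers are invented (e.g., classrooms adopting a new practice).
    growth_rate = 0.30   # strength of the amplifying (reinforcing) process
    capacity = 100.0     # constraint that feeds the balancing process
    state = 5.0          # current extent of the change

    for step in range(1, 21):
        reinforcing = growth_rate * state             # amplifying process
        balancing = reinforcing * (state / capacity)  # counteracting process grows near the limit
        state += reinforcing - balancing              # net growth slows as the limit is approached
        if step % 5 == 0:
            print(f"step {step:2d}: state = {state:6.1f}")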
as the Internet or other powerful technologies to support learning). Second, there must also be sufficient
enablers of transformation, which are created by factors inside the system, such as “participatory”
(Schlechty, 1990) or “transformational” leadership (Duffy et al., 2000) (as opposed to the industrial-age
command-and-control form of leadership – or more appropriately, management), and sufficient levels of
trust within and among stakeholder groups, such as the teachers association, administration, school board,
and parents.
System dynamics.
System dynamics are complex sets of causes and effects that are largely probabilistic (a “cause”
increases the chances that an “effect” will take place) and highly interactive (the extent of influence of a
“cause” on an “effect” is strongly influenced by other factors, including other causes). Regarding causes,
system dynamics provide us with an understanding of aspects of the current system that will likely
influence the viability and durability of any given change. For example, we come to learn that high-stakes
tests that focus on lower levels of learning in Bloom’s taxonomy (Bloom, Krathwohl, & Masia, 1956) are
likely to reduce the viability and durability of attempts by teachers to develop higher-order thinking skills,
because such efforts will necessarily reduce the amount of time the teachers spend on the lower-level
content, causing a decline in the high-stakes test scores. Regarding the effects of any given change, system
dynamics provide us with the ability to predict what effects the change is likely to have on the outcomes of
the transformed educational system, such as levels of student learning. For example, as the Saturn School
of Tomorrow found (Bennett & King, 1991), allowing students to do what they want when they want can
cause a reduction in “time on task” to learn the important skills and understandings, resulting in a reduction
in learning.
standard of attainment, so that they may continue to work on a standard until it has been met. The current
report card, with its list of courses and comparative grades, could be replaced by an “inventory of
attainments” that are checked off as they are reached by each student. This one change could exert
leverage on other parts of the system, most notably the way teaching and learning occur in the classroom,
that might be more powerful than the forces that the rest of the system would place on student assessment
to change back. Furthermore, if appropriate strange attractors have been developed (e.g., enough
stakeholders have evolved their mental models to encompass the belief that student assessment should be
designed to inform learning rather than to compare students with each other), those strange attractors will
create a powerful force in support of such a compatible leverage point and against those aspects of the
current system that would otherwise be working to change the assessment system back to what it was.
Conclusion
An understanding of chaos theory and the sciences of complexity is crucial to systemic
transformation of our educational systems to better meet the rapidly changing needs of our children and
communities. Helpful concepts include co-evolution, disequilibrium, positive feedback, perturbance,
transformation, fractals, strange attractors, self-organization, and dynamic complexity. These concepts can
help us to understand (a) when a system is ready for transformation, and (b) the system dynamics that are
likely to influence individual changes we try to make and the effects of those changes. Furthermore, chaos
theory and the sciences of complexity can help us to understand and improve the transformation process as
a complex system that educational systems use to transform themselves. Strange attractors and leverage
points are particularly important to help our educational systems to correct the dangerous evolutionary
imbalance that currently exists.
References
Banathy, B. H. (1991). Systems design of education: A journey to create the future. Englewood Cliffs, N.J.:
Educational Technology Publications.
Banathy, B. H. (1996). Designing social systems in a changing world. New York: Plenum Press.
Bennett, D. A., & King, D. T. (1991). The Saturn School of Tomorrow. Educational Leadership, 48(8), 41.
Bloom, B. S., Krathwohl, D. R., & Masia, B. B. (Eds.). (1956). Taxonomy of educational objectives, the
classification of educational goals. Handbook I: Cognitive domain. New York: David McKay.
Caine, R. N., & Caine, G. (1997). Education on the edge of possibility. Alexandria, Va.: ASCD.
Duffy, F. M., Rogerson, L. G., & Blick, C. (2000). Redesigning America's schools: A systems approach to
improvement. Norwood, Mass.: Christopher-Gordon Publishers.
Jantsch, E. (1980). The Self-Organizing Universe. Oxford: Pergamon.
Kellert, S. H. (1993). In the wake of chaos: Unpredictable order in dynamical systems. Chicago: University
of Chicago Press.
McCarthy, M. P. (2003). Agile business for fragile times: Strategies for enhancing competitive resiliency
and stakeholder trust. New York: McGraw-Hill.
Reigeluth, C. M. (1993). Principles of educational systems design. International Journal of Educational
Research, 19(2), 117-131.
Schlechty, P. C. (1990). Schools for the twenty-first century: Leadership imperatives for educational
reform (1st ed.). San Francisco: Jossey-Bass Publishers.
Senge, P. M. (1990). The fifth discipline: The art and practice of the learning organization (1st ed.). New
York: Doubleday.
Toffler, A. (1980). The Third Wave. New York: Bantam Books.
Tyack, D. B., & Cuban, L. (1995). Tinkering toward utopia: A century of public school reform. Cambridge,
Mass.: Harvard University Press.
Wheatley, M. J. (1999). Leadership and the new science: Discovering order in a chaotic world. San
Francisco: Berrett-Koehler Publishers.
Using ASP for Web Survey Data Collection and an Empirical
Assessment of the Web Data
Xuejun Shen
Stanford University
Theoretical Background
With the development of technology, the WWW has increasingly been used as a medium to
collect survey data for research in the social sciences, including the field of education. As a type of
electronic survey, the Web-based survey shares the advantages of saving cost and transmission time over
traditional paper-and-pencil surveys (Mavis & Brocato, 1998). In addition, the Web-based survey allows a
wide variety of graphics, sound, and response options, as well as automatic data entry into the database
(Shannon et al., 2002).
On the other hand, electronic surveys may result in a limited population and sample because of the
computer proficiency and access to Web facilities required on the participants’ part to complete the
survey (Scantron Corporation, n.d.). There have been concerns particularly about the overrepresentation of
males and young people using the Web (Yun & Trumbo, 2000). Furthermore, the advanced Web
knowledge and skills involved in developing a Web-based survey have been cited as an obstacle to
applying the technology to survey studies (Shannon et al., 2002).
There have been reports of higher as well as lower response rates for electronic surveys than
paper-and-pencil surveys. Multiple contacts, personalization, mixed mode, and incentives have been
reported to be effective in raising response rates (Schaefer & Dillman, 1998; Mehta & Sivadas, 1995).
Nonresponse rate has often been used as an indicator in comparing the quality of Web data and
paper-and-pencil data (King & Miles, 1995; Sproull, 1986). In addition, Stanton (1998) used variability as a
measure of data quality. Although Stanton acknowledged that the measure of variability is equivocal, he
justified his approach on the grounds that (a) “restriction of range in item responses can suppress
item intercorrelations” (p. 713) and (b) unmotivated respondents tend to select the same choices across
survey items.
Methods
Sampling
The survey data were collected to explore Idaho secondary school principals’ perceptions on high-
stakes accountability. The sample of 100 was randomly selected from the Idaho State Public Secondary
School Principals Contact List available at the Web site of the Idaho State Department of Education
(ISDOE). The sample’s email and mailing addresses at work were available on the list.
Instrumentation
The survey instrument specifically addressed three dimensions of high-stakes accountability (HSA)
regarding the uses of standardized test scores to make decisions about students and schools and the effects
of high-stakes testing on instruction, resulting in three scales: HSA vs. Students, HSA vs. Instruction, and
HSA vs. Schools. The instrument consisted of nine questions asking about participants’
demographics, including gender, age, educational level, years of experience in teaching, educational
administration, and the principalship, school size, school type, and school location, and 15 items asking
participants to rate statements on a 1-4 Likert scale. The coefficients of inter-item consistency—
Cronbach’s alphas—were estimated at above .60 for each scale.
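As a minimal sketch of how such inter-item consistency coefficients can be computed, the Python fragment below applies the standard Cronbach's alpha formula to an invented respondents-by-items matrix; the actual item responses are not reported in the paper.

    # Minimal sketch: Cronbach's alpha for one scale. The 1-4 Likert scores below
    # are invented for illustration; the paper reports only the resulting alphas.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: respondents x items matrix of scores."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)          # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)      # variance of the total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    scores = np.array([
        [3, 3, 4, 3],
        [2, 2, 2, 3],
        [4, 3, 4, 4],
        [1, 2, 1, 2],
        [3, 4, 3, 3],
        [2, 2, 3, 2],
    ])
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")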
The Web-based survey was adopted as the predominant medium mainly in consideration of
participants’ educational levels (master’s degree or above) and their access to computers and the Internet at
their workplaces. All three pilot respondents indicated a preference for the Web-based survey over the
traditional paper-and-pencil method, and completed the survey online.
in the letter that principals might complete the surveys in whatever way they preferred: by the Web, mail or
fax. We collected 36 survey responses through the Web during the first two weeks after the initial request
was sent via email; and 16 responses by mail, 13 by fax, and 10 by the Web after the follow-up letter was
sent (these 39 responses are referred to as “follow-up data” hereafter). For testing Hypothesis 1, statistics
were run on both the full set of data collected from 75 principals, and the subset of follow-up data collected
when the subjects were offered multiple response methods. Only the full set of data were used for testing
Hypotheses 2-4 given its larger sample size. We combined the faxed and mailed responses as “paper-and-
pencil data” versus those received through the Web as the “Web data.” A total of 75 survey responses were
collected out of the 100 sampled Idaho principals, making the response rate 75%. Figure 1 illustrates the
frequencies of survey responses by response mode and date after the follow-up letters were sent to the
nonresponding subjects.
Results
Hypothesis 1 states that survey data collected through the WWW and the data collected via the
paper-and-pencil medium did not differ in respondent demographics. For both the full dataset and the
follow-up data, the Pearson chi-square tests revealed no significant differences between the Web data and
the paper-and-pencil data on any of the nine demographic variables at the .05 level. Tables 1-3 cross-tabulate the
frequencies of Web and paper-and-pencil surveys against the key demographic variables of gender, age
group, and school location for the full dataset as well as the subset of follow-up data.
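For readers who wish to reproduce the form of these tests, the sketch below runs a Pearson chi-square test on the follow-up gender counts shown in Table 1 using scipy; the exact statistic and p value printed in the table may reflect different rounding or continuity-correction choices.

    # Sketch of the chi-square comparison reported in Table 1 (follow-up subset):
    # Web responses, 9 male / 1 female; paper-and-pencil responses, 23 male / 3 female.
    from scipy.stats import chi2_contingency

    observed = [[9, 1],    # Web follow-up: male, female
                [23, 3]]   # paper-and-pencil follow-up: male, female

    chi2, p, df, expected = chi2_contingency(observed, correction=False)
    print(f"chi2({df}) = {chi2:.2f}, p = {p:.3f}")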
[Figure 1. Frequencies of survey responses by response mode (letter, fax, Web) and date, from late April through
May 2002. Annotations in the figure note that 36 survey responses were collected via the Web from 4/9 to
4/23/2002, two weeks after the initial email request was sent, and that a follow-up letter offering multiple response
modes was sent 4/23.]
Table 1 Frequencies, Percentages, and Pearson Chi-Square Tests Comparing Survey Collecting Methods
Relative to Respondent Gender for the Full (N=75) Dataset and Follow-up (N=39) Subset
Web, Follow-up: Male 9 / 90.0%; Female 1 / 10.0%; Total 10 / 100%; X2(df=1) = .02, p = .695
Web, Full: Male 36 / 78.3%; Female 10 / 21.7%; Total 46 / 100%; X2(df=1) = 1.31, p = .252
Paper-Pencil, Follow-up: Male 23 / 88.5%; Female 3 / 11.5%; Total 26 / 100%
Paper-Pencil, Full:
Table 2 Frequencies, Percentages, and Pearson Chi-Square Tests Comparing Survey Collecting Methods
Relative to Respondent Age for the Full (N=75) Dataset and Follow-up (N=39) Subset
Web, Follow-up: 30-40: 2 / 5.3%; 40-50: 5 / 13.2%; Above 50: 3 / 7.8%; Total 10 / 100%; X2(df=3) = 1.26, p = .739
Web, Full: 30-40: 4 / 8.7%; 40-50: 22 / 47.8%; Above 50: 20 / 43.5%; Total 46 / 100%; X2(df=3) = 1.40, p = .704
Paper-Pencil, Follow-up: 30-40: 5 / 17.2%; 40-50: 12 / 41.4%; Above 50: 12 / 41.4%; Total 29 / 100%
Paper-Pencil, Full:
Table 3 Frequencies, Percentages, and Pearson Chi-Square Tests Comparing Survey Collecting Methods
Relative to School Location for the Full (N=75) Dataset and Follow-up (N=39) Subset
Web, Follow-up: Urban/Urban Adjacent 0 / 0.0%; Suburban 1 / 10.0%; Rural 9 / 90.0%; Total 10 / 100%; X2(df=2) = 1.57, p = .457
Web, Full: Urban/Urban Adjacent 3 / 6.5%; Suburban 10 / 21.7%; Rural 33 / 71.7%; Total 46 / 100%; X2(df=2) = 2.38, p = .304
Paper-Pencil, Follow-up: Urban/Urban Adjacent 4 / 13.8%; Suburban 3 / 10.3%; Rural 22 / 75.9%; Total 29 / 100%
Paper-Pencil, Full:
Hypothesis 2 states: “The missing data rate did not differ across survey data collection medium.”
All 24 demographic and perception items of the survey were included in computing the nonresponse rate.
Although the Web data had a smaller missing response rate per person (M = 6.1%, SD = .01) than the
paper-and-pencil data (M = 7.0%, SD = .03), the difference is not statistically significant, t = 1.68, df =
35.90, p = .101, equal variances not assumed.
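The comparison above can be approximated from the reported summary statistics with a Welch (unequal-variance) t test; because the published means and standard deviations are rounded, the sketch below will not reproduce the reported t = 1.68 and df = 35.90 exactly.

    # Welch's t test from summary statistics (paper-and-pencil group listed first so
    # the sign of t matches the direction reported in the text).
    from scipy.stats import ttest_ind_from_stats

    t, p = ttest_ind_from_stats(
        mean1=0.070, std1=0.03, nobs1=29,   # paper-and-pencil missing-response rate
        mean2=0.061, std2=0.01, nobs2=46,   # Web missing-response rate
        equal_var=False,                    # equal variances not assumed
    )
    print(f"t = {t:.2f}, p = {p:.3f}")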
Hypothesis 3 proposes: “Overall variability did not differ across survey data collection medium.”
Levene’s test for equality of variances shows that responses to the 15 perception items in general have
equal variances between the Internet and paper-and-pencil data, except those for two items on determining
college admission and scholarships based on standardized test results. For these two items, the Levene’s test
values were F = 5.76 and F = 2.16, respectively, p < .05. These two items had higher standard deviations for
Internet responses than for paper-and-pencil responses.
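A Levene test of this kind can be run as sketched below; the item responses shown are invented purely to illustrate the call, since the paper reports only the resulting F values.

    # Sketch of Levene's test for equality of variances on a single survey item.
    from scipy.stats import levene

    web_item = [1, 2, 4, 3, 1, 4, 2, 3, 4, 1]      # hypothetical Web responses (1-4 Likert)
    paper_item = [2, 3, 3, 2, 3, 2, 3, 3, 2, 3]    # hypothetical paper-and-pencil responses

    stat, p = levene(web_item, paper_item)
    print(f"Levene F = {stat:.2f}, p = {p:.3f}")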
Hypothesis 4 states: “The mean response values did not differ across survey data collection
medium.” Most of the survey items were aggregated into three scales, as stated, for examination of
response values. As shown in Table 4, the alpha reliabilities of all three scales were acceptable (at or above
.60) for the survey responses collected either through the Web or the paper-and-pencil methods.
Table 4 Scale Titles, Item Numbers and Alpha Reliabilities for Full Survey Responses Collected Through
Web Versus Paper-and-Pencil Methods (N=75)
Alpha Reliabilities
Scale Titles Number of Items Web Data Paper-and-Pencil Data
HSA vs. Students 4 .62 .78
HSA vs. Instruction 6 .60 .63
HSA vs. Schools 5 .89 .84
Table 5 displays means and standard deviations of the full set of Web data and the paper-and-
pencil data, and the results of the t tests comparing their means. As revealed by the t tests, there were no
significant differences between the Internet data and the paper-and-pencil data on any of the three scales at the .05 level.
Table 5 Comparison of the Means of the Scales for Full Survey Data Collected through Web Versus
Paper-and-Pencil Methods (N=75)
Scale Titles Web Data Paper-and-Pencil Data t test Results
HSA vs. Students M = 2.66, SD = .56 M = 2.92, SD = .50 t(df=71) = -2.08
HSA vs. Instruction M = 2.73, SD = .40 M = 2.73, SD = .37 t(df=70) = .05
HSA vs. Schools M = 3.11, SD = .60 M = 3.10, SD = .51 t(df=73) = .11
Note: All the t tests were for independent samples with unequal variances.
Discussion
The study results indicated that, compared with the survey responses collected through the
paper-and-pencil method, the Web survey data showed similar respondent demographics of gender and age,
missing data rate, overall variability, and mean response values. The study thus supports the conclusion that Web
survey responses were comparable to the paper-and-pencil data in data quality, values, and demographic
representativeness among school principals. This comparability allowed aggregation of the two sets of survey data
for further response analysis regardless of the medium through which the data were collected.
The number of respondents who responded via the Web was 1.6 times as large as the number of
surveys collected through the traditional paper-and-pencil methods. This ratio might not have been the same if the
respondents had been offered the choice of Web or traditional methods to complete the survey from the initial
request. However, we received more positive comments from the respondents on the Web survey method
than on the mail/fax method as we implemented the pilot and final surveys.
Reflecting on the survey experience, and in addition to the possible factor of participants’ interest in
the survey content, the researchers attribute the high response rate to the close follow-ups and the
offer of multiple methods (the WWW, mail, or fax) for completing and submitting the survey. The access
control of the Web survey was vital to the validity and credibility of data collection, and assigning
exclusive passwords to participants enabled the researchers to send follow-ups specifically to those who
had not responded. The last follow-up, through the phone, allowed individualized contact with participants
and inquiry about their preferred medium for completing the survey. The progression of media employed
to reach participants, from e-mail and the Web to mail and finally the phone, exemplified the notion suggested by
Schaefer and Dillman (1998) that “researchers can begin with an e-mail approach and use progressively more
expensive methods for nonrespondents until an acceptable response level is reached” (p. 3).
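The survey itself was implemented in ASP; the fragment below re-expresses the access-control idea in Python only as an illustration. Each participant receives an exclusive password (token), a response is stored only when the token is valid and unused, and the unused tokens identify who should receive the next follow-up. All identifiers, tokens, and answers are invented.

    # Illustrative re-expression (in Python, not the study's ASP code) of password-based
    # access control for a Web survey. All names and tokens are invented.
    tokens = {"a1b2": "principal_01", "c3d4": "principal_02", "e5f6": "principal_03"}
    responses = {}   # participant id -> submitted answers

    def submit(token, answers):
        """Record a submission only if the token is valid and not already used."""
        participant = tokens.get(token)
        if participant is None or participant in responses:
            return False
        responses[participant] = answers
        return True

    submit("a1b2", {"q1": "3-4 units", "q5": "T"})

    # Participants with unused tokens are the targets of the next follow-up contact.
    nonrespondents = [p for p in tokens.values() if p not in responses]
    print(nonrespondents)   # ['principal_02', 'principal_03']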
A lesson that could be drawn from this survey data collection experience was to take
into account possible filtering systems installed on participants’ computers when designing the Web survey,
especially when participants are likely to complete the survey through computer network facilities at their
workplaces.
References
King, W. C., & Miles, E. W. (1995). A quasi-experimental assessment of the effect of computerizing
noncognitive paper-and-pencil measurements: A test of measurement equivalence. Journal of
Applied Psychology, 80, 643-651.
Mavis, B. E., & Brocato, J. J. (1998). Postal surveys versus electronic mail surveys: The tortoise and the
hare revisited. Evaluation and the Health Professions, 21(3), 395-408.
Mehta, R., & Sivadas, E. (1995). Comparing response rates and response content in mail versus electronic
mail surveys. Journal of the Market Research Society, 37(4), 429-439.
Scantron Corporation. (n.d.). What are some advantages of electronic surveys? Retrieved January 22, 2002,
from http://www.elisten.com/FAQ/advantages.html
Schaefer, D. R., & Dillman, D. A. (1998). Development of a standard e-mail methodology: Results of an
experiment. Paper presented at the American Association for Public Opinion Research, St. Louis.
Retrieved November 5, 2001, from http://survey.sesrc.wsu.edu/dillman/papers/E-Mailppr.pdf
Shannon, D. M., Johnson, T. E., Searcy, S., & Lott, A. (2002). Using electronic surveys: Advice from
survey professionals. Practical Assessment, Research & Evaluation, 8(1). Retrieved March 2,
2003, from http://edresearch.org/pare/getvn.asp?v=8&n=1
Sproull, L. S. (1986). Using electronic mail for data collection in organizational research. Academy of
Management Journal, 29(1), 156-169.
Stanton, J. M. (1998). An empirical assessment of data collection using the Internet. Personnel Psychology,
51(3), 709-717.
Yun, G. W., & Trumbo, C. W. (2000). Comparative response to a survey executed by post, e-mail, & Web
form. Journal of Computer Mediated Communication, 6(1). Retrieved December 22, 2001, from
http://www.ascusc.org/jcmc/vol6/issue1/yun.html
Taking Statistics Doesn’t Have to Be Scary: Keeping the Heart Rate
Down
HeeGyoung Song
John R. Slate
University of Missouri-Kansas City
Abstract.
In this paper, the authors examine how a well-developed graduate-level statistics course that
uses online technologies scaffolds students in M.A. and Ph.D. programs to become less
anxious and more motivated toward developing statistical and technological skills and
knowledge. The emphases on which these authors focus are the following instructor
characteristics: 1) Ability to communicate statistics at a level that students can understand; 2) Desire to
provide the students with quality online learning materials; 3) Multiple teaching strategies; 4)
Interpersonal skills for interaction with students; 5) Ability to use technology; and, 6) Dedication to
providing online and offline feedback to students. These six factors result in empowering students to learn
and in reducing their statistics anxiety while preparing for three exams and six assignments with the
instructor’s guidance.
Introduction
In this paper, these authors demonstrate how a teacher’s well-developed online curriculum using
technology, interpersonal skills with students in a Statistical Methods I class, and quality learning
materials can influence students’ motivation (Stipek, 2002), reduce their statistics anxiety (Onwuegbuzie &
Wilson, 2003; Widmer & Chavez, 1986), and empower them to participate and interact (Dewey, 1997, p.
340). A thoughtfully designed class structures the process of learning new technology while fostering
“social interaction” (Vygotsky, 1981, p. 145; Wertsch, 1981, p. 190) and knowledge of statistical content.
As a student, the first author was deeply impressed by the teacher’s skills in helping students construct
“knowledge” (Barab & Kirshner, 2001, p. 5) through their “activity” (Leontiev, 1974-75, p. 10) as a unit of
life mediated by mental reflection in Vygotskian formulations (Robbins, 2003, pp. 55-58) and
“reflexivity” (p. 76) rather than “taking objects in from outside” (Bereiter, 2002, p. 20) when using
technologies as mediation tools.
The teacher’s curriculum included online course documents with information updated on a weekly
basis, assignments, and guidelines to help students complete their assignments successfully, to provide
conditions that decrease the anxiety associated with performing tasks, and to increase motivation associated
with both mastery learning skills and performance on given tasks. Also, the online curriculum
provided three sets of exam preparation exercises to help the students better know which concepts they
needed to review. These materials served to reduce exam anxiety and assisted students in reaching their
goals in this class and further in applying the concepts and ideas to realities (Beins, 1985; Lutsky, 1986).
The three exams given measured the objectives outlined in this course’s online curriculum and contents.
Learning in this Statistical Methods I class was designed to take place with advanced, skilled learners
(Vygotsky, 1978) and to help less skilled learners expand their actual developmental level into the zone of
proximal development (Vygotsky, 1978, 1987), both in the classroom and in the computer lab, with the
teacher playing the role of coach and facilitator (Bruning, Schraw, & Ronning, 1999, p. 195), always
present to guide and scaffold the students and their learning as they completed their assignments and came
to understand what they needed to do in their tasks.
Theoretical Framework
This research is based on Vygotsky’s dialectical constructivism in sociocultural theory (Bruning et
al., 1999, pp.196-198) and further is grounded in “Vygotsky’s cultural-historical theory, which differs from
sociocultural theory in the United States” (Robbins, 2003, p. XIII) in an effort to understand how the
students learn in a technology-mediated statistics class. Onwuegbuzie and Wilson’s (2003) comprehensive
literature review related to statistics anxiety was used to analyze how the teacher designed this
online curriculum to reduce statistics anxiety and empower the students to engage in doing tasks with
advanced learners and the teacher’s help. Based on Vygotsky’s (1978) views of “language and its importance
as a social and cognitive tool,” statistics students face the same level of anxiety as foreign language
students. They feel frightened by the unfamiliar content and by the new language that must be used to learn
and demonstrate mastery of this content.
In this Statistical Methods I class the teacher used technology as mediation tools to aid in
communication and interaction with the students to reduce the four general components of statistics
anxiety: 1) instrument anxiety; 2) content anxiety; 3) interpersonal anxiety; and, 4) failure anxiety
(Onwuegbuzie, 2003, p. 201). This use of technology also helped to scaffold student learning and enhance
student motivation in a graduate-level statistics course. As Engestrom (2003) wrote, “human activity is
endlessly multifaceted, mobile, and rich in variations of content and form” (p. 20); accordingly, the activities that
the teacher designed took into account the “multiple intelligences” (Gardner, 1983) that impact student learning
and statistics anxiety (Onwuegbuzie & Wilson, 2003, p. 198). These activities also provided interaction at
a level that the students could understand, in meaningful learning environments.
These activities, learning processes, and online curriculum using technology show how the
internal aspects are first influenced by the external, logically following the model of “Vygotsky’s dual-
dialectical vision” (Robbins, 2003, p. 5). Also, from a biological and/or cognitive perspective on the
interaction between learning and cognitive development, these authors examined the class
activities, online curriculum, and use of technology that include the “social interaction” (Lightbown & Spada,
2000, p. 23) that distinguishes Piaget’s view of language and cognitive development from Vygotsky’s
view. Furthermore, the understanding of human cognitive development in social interaction with skilled
advanced learners’ guidance is explained by “the zone of proximal development that defines those
functions that have not yet matured but are in the process of maturation, functions that will mature
tomorrow but are currently in an embryonic state” (Vygotsky, 1978, p. 86; Robbins, 2003, pp. 28-48).
Technology used to aid the learners’ learning and access to information and the online curriculum
which was designed to address different “learning styles” (Onwuegbuzie, 1998) and cultural factors (Gay,
2000) in multicultural education (Banks & Banks, 2004) may influence student motivation for learning,
reduce statistics anxiety, and finally, scaffold the students to further construct “knowledge” (Berger &
Luckmann, 1967, p. 87).
Methodology
In this study, the first author used qualitative methodology with a case study method (Stake, 1995,
p. xi; Eisenhardt, 2002, p. 9; Patton, 2002, p. 297; Yin, 2003, pp. 9-13; Miles & Huberman, 1994, p. 25) to
structure data and to analyze data collected through survey questionnaires, interviews, and classroom
observation. From an emic perspective in this study, the first author, who is a student and researcher,
interpreted and analyzed the data, and through these methods the author sought to demonstrate validity and
reliability (Maxwell, 2002, p. 48; Merriam, 1998, p. 199). These interpretations were then shared with the
second author, the instructor of the course. From Vygotsky’s cultural-historical theory to activity theory
(Engestrom) and situativity (Barab) in a cognitive perspective, these authors attempted to understand better
how statistics anxiety can be reduced by the teacher’s attitudes and innovative instructional design using
technology. Also, these authors investigated how online curriculum and the appropriate timely online and
offline feedback may impact student motivation.
Participants in this study came from a pool of 50 students in either M.A. or Ph.D. degree
programs. Most of the students in the Statistical Methods I class were required to take this graduate-
level statistics course for their major emphasis or as background for developing their thesis or dissertation. The
first author, as a researcher and doctoral student in both Education and Urban Leadership and Policy Study
at the University of Missouri-Kansas City in the U.S., enrolled in this course to understand what statistics
is, how data are collected, analyzed, and interpreted, and how research, conducted using quantitative
methodological tools, could be applied in the real world.
Students and the faculty member met in a classroom that was well equipped with technological
tools, such as 20 laptop computers, access to the Internet and to the University’s Blackboard system, and, at the
front, two large projection screens on which the teacher projected well-designed PowerPoint slides, course
documents, and upcoming dates. Also, the classroom was a roomy place with spiral-type stairs and individual
chairs, a good lighting system, and an ideal temperature adjusted to reduce the students’ anxiety and
stress.
The class met once a week for 2 hours and 45 minutes, for a total of 16 meetings during the
Fall 2004 semester. At each meeting the teacher used a whiteboard to let students know what
to do, turned on the computer to access the university’s Blackboard system, opened the online syllabus to
address what to do, and visited course documents to explain what to learn about.
to allow students to obtain free quality materials that would support them at their current level of
understanding and beyond (Miltiadou & Savenye, 2003). Because this class did not have required
textbooks, unlike most traditional classes, students needed to use technological tools to access
resources linked from the course Web sites and to print out PowerPoint slides and handouts that included what
the students should know. In announcements through the university’s Blackboard site for this course,
the teacher informed the students that some Web sites had disappeared and others had
been updated. Also, when students noticed such changes before the teacher did, they mentioned them to the
teacher in class, and the whole class was later informed through Blackboard.
Data were collected through survey questionnaires and interviews with the students and with the
faculty member (Maxwell, 1996). The goal of data collection was to analyze and interpret the instructor’s
six characteristics from both the students’ perspective and the teacher’s perspective on the class and online
experiences, and to find how these six characteristics resulted in empowering students’ motivation to
learn and in reducing their anxieties while preparing for three exams and six assignments with the instructor’s
guidance.
Data Analysis
Students were asked six open-ended questions on a survey questionnaire (see Appendix A).
Summary statements for each of the questions will be followed with several representative samples of
student comments.
Question 1: In what ways did your anxieties for this course increase or decrease during the semester?
In responding to this question, 17 of the 37 participants in this study indicated that
their anxiety decreased because of the instructor’s willingness to address students’ questions or concerns,
quality materials designed to help students understand, the instructor’s high but reasonable expectations of
the students, a caring learning environment, and positive results on the exams. Ten students answered that their
anxiety increased; they reported feeling increased anxiety when the difficulty of topics increased, during tests and
while preparing for exams, when dealing with new terminology, and when homework assignments were delayed.
Eight students answered that test and/or homework anxiety existed but that these anxieties were reduced by the
instructor’s attitudes. The others (n=2) pointed out time demands outside class and the large amount of
terminology that impacted their level of anxiety.
Anxiety decreased (n=17)
The instructor was open to questions and willing to assist
The teacher made the material easy to understand
Anxiety decreased as the content was presented in a detailed and repeatable format
My anxiety decreased because I became more and more comfortable using SPSS and taking Dr.
Slate’s exams
As grades increase, anxieties have decreased
Anxiety increased (n=10)
Anxiety increased as the difficulty of topics increased
During the tests, anxiety increased
Whenever I met new & very unfamiliar terminologies, the level of anxieties increased
I feel more anxiety when I prepared for exam
Test and/or homework anxiety but reduced these with instructor’s attitudes (n=8)
My anxiety increased on the nights of the exams because I was worried I would not remember all
the symbols…but my anxiety were lower during regular class night because Dr. Slate helped me feel that I
could understand statistics.
Anxiety increased due to my minimal computer skills. Anxiety decreased by instructor’s clear
explanation of material and his testing using what he taught and review the material to prepare for the test.
Anxieties increased when an exam was present and when assignments were due. Other than these times my
anxiety was normally low.
Others (n=2)
Time demands outside class
Lots of terminology that was cumulative in nature
In summary, students initially felt uncomfortable and experienced anxiety. Gradually, the
instructor's willingness to help and the well-structured, quality online materials addressed the students'
concerns about statistics itself, their low self-esteem, the exams, and the assignments, and thereby reduced
the students' anxiety.
Question 2: In what ways did this Stat 1 class impact your motivation for academic achievement?
Twenty-six students said that the statistics class impacted their motivation for academic success.
Specific factors they mentioned were grades, preparation of research skills for the future, the instructor's
prompt assessment and thorough feedback, and increased self-confidence to take risks. Three students said
that the class did not have much impact on their motivation for academic success, and one student
answered that it lowered his or her motivation for academic success.
Statistics class impacted the students’ motivation for academic success (n=26)
The grade in the statistics class is what motivated me
Getting a good grade on the first test and on the first SPSS really motivated me to try harder and not give
up
Even though this class was difficult and required a lot of study time I would like to learn more because it
will affect my research skills in the future
Dr. Slate’s prompt assessment and thorough feedback motivated me to study and achieve
I feel that I am comfortable with a subject now that I wasn’t before. I am more willing to take risks
Statistics class did not much impact the students’ motivation for academic success (n=3)
Not much. I like to do well in all classes
Not much. I have a lot of material already
Not much. I only wanted to pass
Statistics class lowered the students’ motivation for academic success (n=1)
My motivation for academic achievement was lowered because of the need to balance course work, family
and full-time employment.
In summary, most students reported that this Statistics 1 class influenced their academic
achievement and their motivation for individual intellectual inquiry.
Question 3: In what ways did the use of technology impact your class performance, including three
exams and six assignments?
Twenty-nine students answered that the use of technology helped and improved their performance
in preparing for the three exams and six assignments: they could easily follow the guidelines the instructor
provided on Blackboard, focus on the content rather than on keeping up with notes, and feel confident
using the computer and running SPSS. Five students answered that the use of technology negatively
impacted their performance, and one student said that it did not impact his or her performance.
The use of technology helped and improved my performance in preparing three exams and six
assignments (n=29)
I appreciated the way Dr. Slate explained all steps in doing SPSS carefully and thoroughly. Also I
appreciated having the PowerPoint presentations on the Web so I could print them out and take notes on
them. It helped me have all study materials another web as well as the assignments. Helped to keep me
organized
Very easy to use. I could concentrate on the content and not on keeping up with notes I printed them off
Blackboard.
I have more confidence using it and the computer.
It helped me improve my class performance and ability for statistical analysis.
The use of technology negatively impacted my performance (n=5)
I wasn’t very good with the SPSS program and feel it negatively impacted my performance.
It was a real “push” to come to the lab to use SPSS.
Many times I couldn’t remember what things were nominal, ordinal, etc (basic concepts are confusing to
run SPSS).
I am extremely frustrated at the SPSS program and it was hard.
Honestly, I hate technology, but I hate math more.
The use of technology did not impact my performance (n=1)
I was very proficient technology and computers. I have or had a strong aversion to learning the SPSS
program—only because I know I will never use it in anyway in the future. I just wanted to be able to
interpret research.
In summary, the reported negative impacts indicate that difficulty with basic math concepts shaped
some students' attitudes toward using SPSS and technology, even though the majority of students said that
the use of technology helped and positively impacted their performance in preparing for the three exams
and six assignments.
Question 4: Were there any instructional strategies used in this statistics 1 class that you would encourage
other instructors to implement? Please describe.
In response to the fourth question, 31 students pointed to the instructor's online syllabus design as
something other teachers should adopt, citing features such as having all materials on the web,
well-constructed PowerPoint slides, additional online resources and website links, group discussion, and
the posting of all information needed for assignments in a very detailed and logical manner with real-life
examples. Three students mentioned the teacher's expectations and attitudes, such as simple but clear
expectations, the professor's availability for student questions and amazing attitude, and the instructor's
diligence in helping students understand their tasks. One student cited test reviews, and another attributed
the instructional strategies used in this class to the professor's natural gift for dealing with such
complicated material. One student answered, somewhat off topic, that she or he did not like PowerPoint
presentations.
Online syllabus design and benefits for students (n=31)
The use of online material for instruction and posting lecture materials online was very useful
Liked the Power Point presentation of notes so I could “listen” to the lecture and understand to do so
No fixed textbook. Reading links on the blackboard gave me a lot of new ideas
Power Point decks with the important concepts and lectures were provided to all of us as references. This
lessened the need for note taking. We could focus on additive notes and ever you’ve received a strong base
I benefited from the guided practice implemented in the on-campus computer lab
Power point, Blackboard, posting all information needed for all assignments in a step-by-step logical
manner and very detailed with examples from real life.
Teacher’s expectations and attitudes (n=3)
Yes, clear, crystal clear expectations
Simple explanations. Availability of professor for student questions and attitude of professor was amazing
Diligence of instructor make me to actually UNDERSTAND homework instead of just doing it
The instructional strategies (n=2)
Test reviews
The instructional strategies used in this class seemed to be more of a gift that the professor has for dealing
with such complicated material. I do not know if the effectiveness can be mimicked. The instructor
definitely has it
Others (n=1)
In general, I do not like Power Point presentations. They do not do an effective job of conveying
knowledge, when compared to group assignments or other active learning techniques.
In summary, the instructor's high but reasonable expectations, caring attitude, well-designed online
syllabus, and relevant online learning materials emerged as the most important factors that other teachers
are encouraged to apply.
Question 5: Do you think technology scaffolds what you learn in a positive way? Why or why not?
Nine students said that technology scaffolded what they learned in a positive way, and three
students said that it did not. Twenty-three students did not respond to this question, presumably because the
item was followed by several sub-questions.
No responses (n=23)
Technology scaffolded what I learn in a positive way (n=9)
Makes learners easier with better access to resources
Provides real world application
Easier to communicate and obtain instruction. I love having everything used in this easily accessible
Technology did not scaffold what I learn in a positive way (n=3)
I do not grow up with computers so I found assignments difficult
No
Others (n=2)
I learned a totally foreign software to me.
Yes and no. No in that when you are not sure how to use it. Things get more difficult and time consuming.
In summary, 23 students did not answer this question, which the first author attributes in part to the
structure of the item, which included a main question and three sub-question categories. Nine students
explicitly stated that technology supported what they learned in a positive way.
Question 5.a. What problems, if any, did you experience accessing Blackboard to obtain online
reading and learning materials and prepare for class activities and exams? Be as specific as possible.
For this question, 31 students said that they had no problems with Blackboard. Four students said
that they had problems only when the school's servers were down. The other two students commented on
what is needed to use Blackboard effectively, namely a digital internet connection and a personal computer
at home.
No problems with accessibility to Blackboard (n=31)
No problems with Blackboard
It is easier to use and faster than my dial up connection at home
No problem experienced
I love having everything used in class easily accessible
Problems with accessibility to Blackboard (n=4)
Only a couple of times when the school server was down
Not being able to get on when server was down
I have a slow modem at home--- long wait time
Others (n=2)
Some of the listing are labeled more clearly than others.
The most important pieces need to use blackboard successfully is a digital internet connection and a
personal computer at home.
In summary, apart from technological problems with the school's servers or their personal computers at
home, the students were able to access the Blackboard system easily and reach the resources.
Question 5.b. Do you prefer to have required textbooks in this class or online, free materials?
Thirty-one students preferred free online materials, one student preferred a required textbook, and
two students liked having both. The remaining students gave answers that were not directly relevant to the
question.
I prefer to have online free materials in this class (n=31)
Online free materials
Save trees
I prefer to have a required textbook (n=1)
I prefer to have textbooks because selecting materials among several online resources makes me confused.
I like both (n=2)
I like both. If the book is good I like to retain it as a reference. I like a hard copy and printing out 500-1,000
pages of online material isn’t really free since it takes time and toner.
I think online materials are good, however a book that will explain in better and more details will help a lot
in understanding at the material.
Others (n=3)
It does not matter
I think the required material should be the student SPSS software. Just need to make the purchase process
easier.
In summary, most of the students indicated that they preferred free online materials, although
several liked both.
Question 5.c. Please describe the “pros” and “cons” of using technologies, such as opening Power
Point slides, printouts, and hands-on calculation.
Twenty-one students described only the pros of using the technologies, noting that materials were
easily obtained and easy to follow. Eight students described both the "pros" and "cons," pointing out that
the technologies were easier and faster to use but also expressing concern about losing interaction with
other students. Five students described only the "cons," such as discomfort in using the technologies and
losing sight of why they were doing what they were doing. Other responses mentioned the need for prior
knowledge of PowerPoint.
The “pros” of using technologies (n=21)
Very helpful to have Power Point
Allow more time to listen to instruction
I appreciate and benefit from the visuals to help me organize the information in my head
It was very easy to use the technology
The “cons” of using technologies (n=5)
I think the goal of the “paperless classroom” hampered students’ ability to learn
It is difficult when you are relying on technology and some part of it fails
Forces people to use them if not comfortable and good
Both (n=8)
Pros—easy to organize and take notes on. I am so glad we had SPSS and did not have to use EXCELL or
do it by hand. Cons—If I did not sit close I couldn’t see
Pros—easy to access and less papers. Cons must have Power Point at home
Pros: time saver, cheaper, easily accessed. I do not have to carry books on my bike. Cons: I get lazy
Others (n=3)
A prior knowledge to how to work with Power Point is needed before coming to class
In summary, the important pros of using the technologies were ease of use, more time spent
listening to the instructor, and the ability to map out information with the visual tools the technologies
provided. The cons were related to students' own laziness, limited access to the technology at home, and
technical problems when the server was down.
Question 6: Do you think Blackboard is good as a tool to disseminate information in online or offline
courses? Why or why not?
Thirty-two students answered that Blackboard worked very well as a tool for disseminating
information in online or offline courses. Two students were neutral on this question. Three students'
responses were categorized as other, including no response and comparisons of the instructor with other
instructors.
The use of Blackboard as a tool to disseminate information in online or offline courses worked very
well (n=32)
It was great
I appreciate having easy access to grades and assignments online
Excellent way to communicate with class
Liked that grades and announcements were quickly available
It is a very effective way. I would encourage all the instructions to do so
Neutral (n=2)
It is a tool but by no means an all-encompassing one
Continue use as long as sources are not flooded again
Others (n=3)
Too many instructors are unskilled in its use and offer times expect timely posts from students, but not this
instructor. Others fail to stay up to date themselves on grades, etc.
In summary, the use of Blackboard as a tool to disseminate information in online or offline courses worked
very well and provided the students with excellent resources in a very effective way.
Four sub-questions were asked under the sixth question.
Question 6.a. How would you describe the instructor's relationship with the students?
For this question, 35 students stated that the relationship between the instructor and the students
was a very positive one. Only one student answered negatively about the relationship between the
instructor and the students.
The instructor’s relationship with the student is described in a very positive way (n=35)
Professional. Showing personal interest by learning names and asking conversational questions. Focus on
the lesson and start class with small talk concerning family, hobbies, and university issues
Wonderful and great. Very attentive, helpful, patient, enthusiastic, and energetic. Great instructor
Excellent. Receptive and open to changes, always willing to help, and knowledgeable.
The instructor is an asset to UMKC
Make sure all students are understanding the objectives of the course
Very thoughtful and caring
Professional with an approachable way
Excellent-friendly-respond in a positive and timely manner to questions or concerns
The instructor’s relationship with the students is described in a negative way (n=1)
The instructor was a bit over our heads at times. Very often the students were confused, but were unable to
think of how to ask their question. Otherwise, he was very friendly, helpful, and gave timely feedback on
assignments.
Others (n=1)
In summary, apart from one student who answered negatively and one who did not answer, 35
students viewed the instructor as very thoughtful, open, helpful, knowledgeable, and available in a timely
manner. Even the one student who described the relationship negatively stated that the instructor was very
helpful and friendly.
Question 6.b.1. What correspondence, if any, did you use in this course with the instructor? (e.g.,
email, phone, face-to-face?)
For this item, 22 students corresponded with the instructor through email and face-to-face; ten
through email only; two through email, face-to-face, and phone; one through email and phone; and two
face-to-face only.
I corresponded with the instructor through email and face-to-face (n=22)
I corresponded with the instructor through email (n=10)
I corresponded with the instructor through email and face-to-face, and phone (n=2)
I corresponded with the instructor through email and phone (n=1)
I corresponded with the instructor through face-to-face (n=2)
I corresponded with the instructor through phone (n=0)
In summary, most of the students corresponded with the instructor through more than one mode of
communication.
Question 6.b. 2. If yes, when and how often did you send emails to the instructor? Did this exchange
impact your learning?
Sixteen students answered that they sent emails to the instructor 5-10 times, ten students sent
emails to the instructor 2-3 times, five students marked that they sent emails to the instructor more than 10
times, two students sent emails to the instructor once, and four students indicated others.
I sent emails to the instructor 5-10 times (n=16)
When having trouble assignments, it impacted positively
It was very efficient way of communicating
Email exchange helped me communicate with the instructor before or after class
I can get responses about my questions as soon as possible
No impact to learning in sending emails
I sent emails to the instructor 2-3 times (n=10)
The instructor was helpful in explaining things and in telling me what I missed on assignments
It impacted my learning by getting my questions answered quickly
Encouraged me to keep trying and that the instructor wanted me to learn
It was very helpful
I sent emails to the instructor more than 10 times (n=5)
Weekly or at least weekly. It had a significant impact in my learning
24 times during semester. Helped with my learning immediately
Great way to facilitate communication
I sent email to the instructor once (n=2)
Only for enrollment issue
Only once it gave me a small amount of information
Others (n=4)
No email exchange
Nothing mentioned to email frequency
In summary, the more frequently students exchanged emails with the instructor, the more positive
were the comments that they made.
Question 6. c. Do you think the instructor communicated effectively with students? Please be specific.
Thirty-four students answered that the instructor communicated effectively with students, citing his
clear, concise, and nonjudgmental instruction in class, his timely replies to email, his fluency and
knowledge of ideas and concepts, his reasonable expectations, and his help whenever students needed it.
One student answered that the instructor did not communicate effectively. The other two students'
responses were categorized as other.
The instructor communicated effectively with students (n=34)
Yes, return of email was prompt
Yes, the instructor was very approachable during and after class
Yes, he is very precise in his communication which helped make things very clear
His help on homework in the lab was terrific
Direct, effective, compassionate and real
The instructor tried to see what my problem was and how he could help
The instructor did not communicate effectively with students (n=1)
Very often the students were confused, otherwise the instructor was friendly, helpful, and gave timely
feedback on assignments.
Others (n=2)
I have extremely high math anxiety and I even fainted in class once. However, I think I did o.k. because
this class is more of a logic class to me than a math class.
In summary, the instructor communicated very effectively with the students and helped them meet
their real needs, in the classroom and in the lab, in a timely and friendly manner.
Findings
The data collected through survey questionnaires and interviews were analyzed in terms of the
instructor's six characteristics: 1) ability to communicate statistics at a level that students can understand;
2) desire to provide the students with quality online learning materials; 3) multiple teaching strategies; 4)
interpersonal skills for interaction with students; 5) ability to use technology; and 6) dedication to providing
online and offline feedback to students. Based on the results of the instructor's survey (see Appendix B),
the teacher's beliefs about learning emerged as a very important factor. This faculty member believed that
good learners' study skills and characteristics were: 1) willingness to take risks, 2) a strong drive to
communicate with peers and the teacher, 3) asking for help when it is needed, 4) practicing in order to learn
more about statistics content and its real-world applications, 5) monitoring progress, and 6) making
connections to one's own experiences and interests. These characteristics can strengthen students'
motivation, reduce anxiety in the statistics class, increase students' self-confidence to complete the six
assignments and pass the three exams, and ultimately improve their performance and mastery of the
necessary skills. Based on these beliefs about learning, the online curriculum was designed and quality
learning materials were provided so that students could engage the content at a level they could understand
and could communicate with the teacher through technology and face-to-face interaction. The teacher also
focused on helping students understand rather than memorize formulas, and placed emphasis on content
related to statistical findings and real-world applications in this graduate-level statistics course.
The teacher's interpersonal skills, including warmth, caring, the ability to listen well, and
availability to students through email and online dialogue, made the students feel comfortable
communicating with the teacher and asking for help to clarify what they needed to know for classroom
tasks, and to move beyond those tasks by applying what they learned to real-world situations.
Even though the class was large, with an enrollment of 50 students, students worked in a
collaborative environment with a well-equipped computer lab and a roomy classroom that had 50 laptop
computers available. Easy access to quality online resources and the teacher's immediate feedback allowed
the students to share what they found problematic in completing their assignments and in preparing for
exams. All but two students demonstrated strong academic success in this graduate-level statistics course.
Finally, these six factors resulted in empowering students to learn and reducing their statistics
anxiety while preparing for three exams and six assignments with the instructor’s guidance.
Implications
Findings in this study were interpreted to mean that using an online syllabus, online instructional
technologies, and the instructor's six characteristics can maximize students' motivation and reduce anxiety
in a graduate-level statistics class. Further research needs to be conducted on using technologies as a means
of providing students with an online instructional curriculum at levels that they understand, one that allows
learners, synchronously and asynchronously, to access quality online learning materials that take multiple
intelligences into account (Gardner, 1983) and that empowers them to "know about" (Barab & Duffy, 2000,
p. 28). Also, technology, as a means of mediating the learners' inner thought into activity, should be used in
practical collaborative environments and under conditions that take into account the levels at which the
teacher and the learners understand one another. As Robbins (2003) wrote, "learning results through
meaningful activity" (p. 76), "the basic components of Russian activity theory are activity=act=operation"
(p. 76), and "the corresponding conditions are need=motive=goal" (p. 76); the practical implication the
author insists upon is to empower individual learners toward "self-regulation" (Robbins, 2003, p. 67) and
"self-actualization" (Maslow, 1987, pp. 158-167). These six factors empower students to learn and reduce
their statistics anxiety while preparing for three exams and six assignments with the instructor's guidance,
guidance that stresses the expressive quality of self-actualizing creativeness rather than its problem-solving
or product-making quality, and this helps learners grow. Finally, as this research demonstrates, a good
teacher communicates in ways that allow the students to see, hear, touch, and conceive "the world [that] is a
symbolic world in the sense that it consists of conceptually organized, rule-bound belief systems about
what exists [different from statistically significant findings], about how to attain goals, about what is to be
valued" (Bruner, as cited in Leontiev, 2003, p. 20). Learners should know what they need to know and what
they do not yet know, in socially and culturally constructed learning environments and through meaningful
activities. Again, technology is a tool; as Gunter (2001) stated, to close the teaching and learning
technology gap between where we are and where we need to be in the 21st century, instructional design
and curriculum should focus on preparing students to participate in using technologies to learn. In this
sense, the study the authors conducted is a case that demonstrates how technology use and teacher attitudes
reduced statistics anxiety and impacted student motivation, addressing a literature review indicating that
"only a few researchers have investigated ways to reducing statistics anxiety" (Onwuegbuzie & Wilson,
2003, p. 202).
References
Banks, J. A., & Banks. C.A.M. (Eds.). (2004). Multicultural education: Issues and perspectives (5th ed.).
Hoboken, NJ: John Wiley & Sons.
Barab, S.A., & Duffy, T. (2000). From practice fields to communities of practice. In D. Jonassen & S.M.
Land (Eds.), Theoretical foundations of learning environments (pp. 25- 56). Mahwah, NJ:
Lawrence Erlbaum Associate, Inc.
Barab, S. A., & Kirshner, D. (2001). Guest editors’ introduction: Rethinking methodology in the learning
sciences. The Journal of the Learning Sciences, 10 (1&2), 5-15.
Beins, B. (1985). Teaching the relevance of statistics through consumer-oriented research. Teaching of
Psychology, 12, 168-9.
Bereiter, C. (2002). Education and mind in the knowledge age. Mahwah, NJ: Lawrence Erlbaum Associates.
Berger, P.L., & Luckmann, T. (1967). The social construction of reality. Garden City, NY: Anchor Books,
A Division of Random House, Inc.
Bruning, R., Schraw, G., & Ronning, R. (1999). Cognitive psychology and instruction. Upper Saddle River,
NJ: Merrill Prentice Hall.
Dewey, J. (1997). Experience and Education. In S. M. Cahn, Classic and contemporary readings in the
philosophy of education (pp. 325-363). The City University of New York, NY: McGraw-Hill
Companies, Inc.
Eisenhardt, K. M. (2002). Building theories from case study research. In A. Michael Huberman & Matthew
B. Miles, The qualitative researcher’s companion, pp.5-36. Thousand Oaks, CA: Sage.
Engestrom, Y. (2003). Activity theory and individual and social transformation. In Y. Engestrom, R.
Miettinen, & R. L. Punamaki (Eds.), Perspectives on activity theory. Cambridge, United Kingdom:
Cambridge University Press.
Gardner, H. (1983). Frames of mind: The theory of multiple intelligences. New York: Basic Books.
Gay, G. (2000). Culturally responsive teaching: Theory, research, and practice. New York: Teachers
College press.
Gunter, G. A., (2001). Making a difference: Using emerging technologies and teaching strategies to
restructure an undergraduate technology course for pre-service teachers. Educational Media
International, 38 (1), 13-20.
Leontiev, A. A. (2003). Vygotsky's theory: Yesterday, today, and tomorrow. In D. Robbins, Vygotsky's and
A. A. Leontiev's semiotics and psycholinguistics: Applications for education, second language
acquisition, and theories of language (pp. 15-21). Westport, CT: Praeger.
Leontiev, A.N. (1974-75). The problem of activity in psychology. Soviet Psychology, 12(2), 4-33.
Lightbown, P. M., & Spada, N. (2000). How languages are learned. Oxford: Oxford University Press.
Lutsky, N. (1986). Undergraduate research experience through the analysis of data sets in psychology
courses. Teaching of Psychology, 13, 119-22.
Maslow, A. H. (1987). Motivation and personality. New York: Harper & Row.
Maxwell, J. A., (1996). Qualitative research design: An interactive approach. Thousand Oaks, London:
Sage Publications.
Maxwell, J. A. (2002). Understanding and validity in qualitative research. In A. M. Huberman & M. B.
Miles, The qualitative researcher's companion (pp. 37-64). Thousand Oaks, CA: Sage.
Merriam, S. B. (1998). Qualitative research and case study applications in education: Revised and
expanded from case study research in education. San Francisco, CA: Jossey-Bass.
Miles, M. B., & Huberman, A. M., (1994). Qualitative data analysis. Thousand Oaks, London: Sage
Publications.
Miltiadou, M., & Savenye, W. C., (2003). Applying social cognitive constructs of motivation to enhance
student success in online distance education. AACE Journal, formerly Educational Technology
Review, 11(1), 78-95, retrieved Nov. 17, 2004, from
http://www.aace.org/pubs/etr/issue4/miltiadou2.pdf
Onwuegbuzie, A. J. (1998). Statistics anxiety: A function of learning style? Research in the Schools, 5,
43-52.
Onwuegbuzie, A. J., & Wilson, V. A. (2003). Statistics anxiety: Nature, etiology, antecedents, effects, and
treatments--a comprehensive review of the literature. Teaching in Higher Education, 8(2), 195-209.
Patton, M. Q., (2002). Qualitative research and evaluation methods. Thousand Oaks, London: Sage
Publications.
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.
Stipek, D. (2002). Motivation to learn: Integrating theory and practice. Boston: Allyn and Bacon.
Robbins, D. (2003). Vygotsky's and A. A. Leontiev's semiotics and psycholinguistics: Applications for
education, second language acquisition, and theories of language. Westport, CT: Praeger.
Vygotsky, L.S. (1978). Mind in society: The development of higher psychological processes. M. Cole, V.
John-Steiner, S. Scribner, and E. Souberman (Eds.). Cambridge, MA: Harvard University Press.
Vygotsky, L.S. (1981). The genesis of higher mental functions. In J. V. Wertsch (Ed.), The concept of
activity in Soviet psychology (pp. 144-188). Armonk, NY: M. E. Sharpe.
Vygotsky, L.S. (1987). In R. Rieber & J. Wollock (Eds.), The collected works of L.S. Vygotsky. Vol. 3:
Problems of the theory and history of psychology. New York: Plenum.
Wertsch, J. V. (1981). The concept of activity in Soviet psychology: An introduction. In James V. Wertsch
(Ed.), The development of higher forms of attention in childhood (pp. 189-240). Armonk, New
York: M.E. Sharpe, Inc.
Widmer, C. C., & Chavez, A., (1986). Helping students overcome statistics anxiety. Educational Horizons,
64(2), 69-72.
Yin, R. K. (2003). Case study research: Design and methods (Applied Social Research Methods Series,
Vol. 5). Thousand Oaks, London: Sage Publications.
Configuring graphic organizers to support higher-order thinking skills
Cameron Spears
Gregory Motes
William A. Kealy
University of South Florida
Abstract
Graphic organizers (GO) spatially arrange conceptually related words as a text alternative that
aids comparative and inferential judgments. Past GO research has only studied displays consisting of
words. This experiment, by contrast, studied GOs that employ “retinal variables,” such as size and color,
to nonverbally depict concepts. This strategy, we argue, reduces the cognitive load imposed by a GO by
reducing its number of elements and “offloading” a portion of verbal encoding to visual processing.
Table 1. Mean percentage correct scores and response latencies on comparative and inferential judgments
during two trials following study of a graphic organizer (GO) differing in methods for depicting
information.
Theoretical Premise
An important consideration in the design of multimedia, and a major concern for the research
reported herein, is how to reduce the “cognitive load” of information displays. Sweller (1988) defines
cognitive load as the demand on mental resources imposed by both the number of elements and the
interrelatedness of these elements for a given task. Since the hallmark of GOs is their ability to
simultaneously present many interrelated concepts, it is conceivable that their cognitive load could be
potentially high. In these instances, one might be able to perform a task by consulting a GO yet learn
nothing from the display (i.e., fail to later recall relationships depicted on the GO) because its high
cognitive load had exhausted available mental resources, leaving none for learning to occur.
The current study explored the possibility of reducing the cognitive load of a GO by substituting
some verbal descriptions with their non-verbal equivalent. Typically, GOs employ only the two “planar”
variables (i.e., the x and y axes) for signaling conceptual relationships through the spatial relationship
among verbal labels. By contrast, our study incorporated “retinal variables” (Bertin, 1983) whereby
variations in the color or size of an icon, for example, replaced the names of colors or numerical values,
respectively. Doing so, we speculated, would diminish cognitive load in two ways. First, this would reduce
the number of elements in the array since one icon could represent color and size while the equivalent
depiction in words would require twice the number of objects. Second, this strategy “offloads” some
linguistic components for imaginal processing (Mayer & Moreno, 2003) resulting in more even distribution
of the mental task between the two components of working memory, the “phonological loop” and the
“visuospatial sketchpad” (Baddeley & Hitch, 1974) that respectively handle verbal and visual encoding.
Method
Design and Participants
The study was a 3 Display (Labels vs. Size vs. Size + Color) x 2 Trials x 2 Mental Task
(Comparisons vs. Inferences) factorial design with Display varied between subjects and both Trials and
Mental Task acting as repeated measures. Forty-eight undergraduates enrolled in a sophomore-level
educational technology class volunteered to participate in the study. Those participants who completed the
study received extra credit points toward their course grade. As participants arrived at the study classroom
they were randomly assigned to computer programs that presented one of three versions of GO: a) verbal
labels only (Labels), b) labels plus circles depicting fish size (Size), and c) labels plus colored circles
indicating fish size and coloration (Color).
Materials
Text. Central to this study is a 204-word prose passage that describes several fictitious species of
fish along with several attributes of these fish. This passage, or its precursor, has been used in previous
research on graphic organizers (Kealy, 2000; Kiewra, Kauffman, Robinson, Dubois, & Staley, 1999;
Robinson & Schraw, 1994; Robinson & Skinner, 1996). Fish characteristics related by the text include each
species’ social grouping, size, preferred depth, coloration, and diet. Readability metrics for the passage
include a Flesch Reading Ease of 83.7 and a Flesch-Kincaid Grade level of 4.3.
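Readability indices like these are straightforward to verify. The brief sketch below, which is illustrative
only and was not part of the original study, recomputes both metrics with the Python textstat package for a
passage assumed to be stored in a local file named passage.txt.

    # Illustrative sketch: recompute readability metrics for a study passage
    # (the file name is hypothetical; any plain-text passage will work)
    import textstat

    with open("passage.txt") as f:
        passage = f.read()

    print("Flesch Reading Ease:", textstat.flesch_reading_ease(passage))
    print("Flesch-Kincaid Grade:", textstat.flesch_kincaid_grade(passage))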
Displays. The display identified as “Labels” (i.e., the display showing only textual cues) was
derived from one used in a previous study (Kealy, Tada, & von Eberstein, 2000). The Labels display served
as a baseline for the remaining two displays used in the study. The display identified as “Size” built upon
the Labels display (see Figure 2) by using circles to represent the relative size of fish—larger species of
fish were depicted using circles with greater diameter when compared to smaller species of fish. The
display identified as “Color” built upon both the Labels and Size displays, with the added cue of color: the
word representations of color in the Labels and Size displays were replaced with actual colors in the Color
display, whereby the color of a particular circle represented the color of the corresponding fish species.
The study took place in a technology-equipped classroom at a major southeastern university. The
room was outfitted with twenty-five computer workstations, each running the Microsoft Windows XP
Professional operating system. Each workstation was equipped with a 15” (diagonal) flat panel liquid
crystal display (LCD) configured at a video resolution of 1024 x 768 pixels. The three treatments (Labels,
Color, and Size) were installed in equal numbers on twelve of the workstations, with sufficient space
between them to reduce the chance of participants inadvertently viewing an alternative experimental
condition. As an additional preventative measure, the simpler GOs (i.e., the Labels treatments) were on the
front-most workstations, with the more complex GOs (i.e., the Color treatments) on the rear-most
workstations. The GOs with intermediate complexity (Size) were installed on workstations roughly in the
middle of the room. Hence, with this arrangement we precluded participants from seeing, either
inadvertently or intentionally, a more “interesting” GO on a neighbor’s computer display.
Procedures
The prose passage and experimental graphic organizers were integrated into three separate
computer-based treatments by means of the Macromedia Authorware CAI development program. Each of
the three programs provided instructions about GOs and how they are used. Participants then practiced
making two types of judgments by recalling the display information: comparisons (e.g., “which fish is
larger?” [salmon or cod]) and inferences (e.g., “Fish living at greater depths tend to be____” [larger or
smaller]).
Upon being seated at a computer (according to random assignment), participants were presented
with a single blue field display, the Authorware "ready" screen. With the aid of a script,
the study proctor gave the participants an overview of what they would be doing. They were then asked to
press the TAB key to start. A brief paragraph of text appeared, giving an overview of the session while also
reminding participants of the voluntary nature of their participation in the study. Each participant was then
prompted by the program to enter his or her gender and major.
Participants were then presented with the first of five instructional screens explaining that they
would have five minutes to study a 200-word passage about different types of fish. The participants were
also told that a small clock icon would be displayed in the upper-left corner of their screens to aid them in
managing their time as they read the passage. Participants then traversed the remaining instructional
screens at a speed convenient for them. The fifth screen informed the participants that they would be
answering comparison and inference questions about the fish passage that they had read. Participants were
presented with examples of comparison and inference questions based on a text about buffalo (see Figure
1) adapted from a study by Robinson, Robinson, and Katayama (1999). Upon completion of this step,
participants studied the experimental fish passage for five minutes. While studying the experimental fish
passage, participants had the option of viewing (by means of a button labeled “View GO”) their particular
graphic organizer as often, and as long, as they chose (subject to the five-minute study period, however).
Immediately after the five-minute study period, three two-column simple addition problems were
presented on the display. The experimental program prompted the participants to confirm the accuracy of
each sum presented by pressing "Y" if correct or "N" if incorrect. This brief interpolated task was intended
to prevent participants’ rehearsal of target information in working memory before the experimental
program administered the criterion measure.
Participants then viewed an exemplar of a comparison question based on a fish characteristic other
than the ones contained within the previously studied text passage. During this portion of the participants’
preparation stage, the lower half of the display contained two side-by-side shaded equal-sized rectangles,
each with a single-word answer in the center. Participants were told to respond to the question shown
("Which fish typically weighs more?") by clicking on the box that represented the correct answer ("Cod" or
"Dolphin" in this example). Once clicked, the video attribute of both the shaded rectangle and enclosed
word was briefly inverted to signal that the experimental program had recorded the response. Next,
participants were presented with an exemplar of an inference-type question (i.e., "Fish that weigh less tend
to have a lifespan") with two response choices, each contained in a separate rectangle ("longer" and
"shorter" in this example).
The program then presented information indicating that an asterisk symbol would appear in the
center of the display for a two-second period immediately before each comparison and inference question
was presented to the participants. Participants were encouraged to make their responses as quickly as
possible, while still striving for accuracy. They were then directed to press the TAB key to begin a brief
practice session in which they could become acquainted with some sample comparison and inference
questions. Once the practice items session was complete, participants were presented with information
indicating that the sample questions were similar to the questions that the participants were about to view.
The instructions encouraged participants to work quickly and to do their best while performing the task.
Participants were then presented with the 30 criterion questions, one at a time. Half of the questions were
comparison-type questions; the other half comprised inference-type questions. The Authorware program
was designed such that the 30 questions were presented in random sequence.
Upon completing the above steps, the participants immediately began a second trial, beginning
with the fictitious passage/GO-viewing segment of the study.
After completing the two trials, participants were asked to rate the helpfulness of several attributes
of the experimental program materials. As each helpfulness rating question was presented, participants
were presented with a slider bar which was labeled "0 Unhelpful" at its left end and "5 Helpful" at its right
end. The helpfulness-related questions presented to the participants were: "How helpful was the text for
recalling how fish compared to one another in their characteristics?"; "How helpful was the GO for
recalling how fish compared to one another in their characteristics?"; "How helpful was the GO for
recalling inferences about the way fish characteristics were interrelated?"; "How helpful was the text for
making comparisons between fish in their various characteristics?"; "How helpful was the text for recalling
inferences about the way fish characteristics were interrelated?", "How helpful was the text for forming
inferences about the relationship between fish variables?"; "How helpful was the GO for making
comparisons between fish in their various characteristics?"; and, "How helpful was the GO for forming
inferences about the relationship between fish variables?"
Participants were then presented with a free-form text entry field, at which time participants were
directed to "Please briefly describe any mental tricks or strategies used" by entering their responses in the
text box, then pressing the Enter key.
Upon completion of the steps above, participants were presented with a statement of debriefing
that provided experimenter contact information and also thanked the participants for their participation.
Participants then left their computer workstations, were given their extra course credit vouchers, and left
the study session.
Judgment Performance
Initially, we calculated means and standard deviations for participants’ performance on
comparative and inferential judgments. Table 1 shows the results of these calculations. Mean performance
on questions dealing with inferences was, much to our surprise, generally higher than for questions
involving comparative judgment. With the exception of Trial 1 performance by those viewing the Color
GO, this was true across all treatment groups during both experimental trials. Also evident from the table
was the global improvement, of roughly ten percentage points, in performance on both criterion measures
from Trial 1 to Trial 2.
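For readers who want to reproduce descriptive summaries of this kind from their own data, the minimal
sketch below uses the Python pandas library; the file name and column names are hypothetical stand-ins,
not the authors' actual data file.

    # Minimal sketch: cell means and standard deviations, Table 1 style
    # (assumes a long-format file with one row per participant x trial x task)
    import pandas as pd

    df = pd.read_csv("judgments.csv")  # hypothetical columns: display, trial, task, accuracy
    summary = (df.groupby(["display", "trial", "task"])["accuracy"]
                 .agg(["mean", "std", "count"])
                 .round(2))
    print(summary)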
We examined the relative differences in judgment accuracy across the treatment groups through a
3 GO Display (Color vs. Labels vs. Size) x 2 Trials x 2 Mental Task (Comparisons vs. Inferences) x 5
Question Type (Color vs. Depth vs. Feeding vs. Grouping vs. Size) repeated measures ANOVA. The
analysis revealed significant main effects for Trials, F(1,45)=38.27, p<.01, d=1.0, Mental Task,
F(1,45)=11.72, p<.01, d=.92, and Question Type, F(4,180)=13.78, p<.01, d=1.0. Additionally, the analysis
reported a significant, F(1,45)=6.13, p=.02, d=.68, Trials x Mental Task interaction, where Trial 1
performance on comparisons (X=.73, SD=.17) and inferences (X=.76, SD=.21) was statistically equivalent
whereas during Trial 2 comparative judgment (X=.79, SD=.14) was significantly lower than inferential
judgment (X=.89, SD=.15).
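A four-way mixed ANOVA such as this is normally run in a dedicated statistics package. As a rough and
deliberately simplified illustration, the sketch below fits only the Display (between-subjects) by Trials
(within-subjects) portion of the design with the Python pingouin library, again using hypothetical file and
column names.

    # Simplified sketch: mixed ANOVA with one between- and one within-subjects factor
    # (does not reproduce the full Display x Trials x Mental Task x Question Type model)
    import pandas as pd
    import pingouin as pg

    df = pd.read_csv("judgments.csv")  # hypothetical long-format data
    aov = pg.mixed_anova(data=df, dv="accuracy", within="trial",
                         subject="participant", between="display")
    print(aov.round(3))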
Our analysis also revealed a Trials x Question Type interaction that was significant,
F(4,180)=2.58, p=.04, d=.72, as well as a significant, Display x Mental Task x Question Type interaction,
F(8,180)=2.10, p=.04, d=.83. The latter, depicted in Figure 3, was particularly interesting to us since it
suggested a complex interplay between GO design and the cognitive task for which it is applied. For
example, performance on comparisons dealing with fish color was better when the Size GO was used,
exceeding performance on inference questions. Comparative judgment was also superior to inferential
judgment on questions related to the size of fish. In this instance, however, participants using the Size GO
obtained the highest performance for both types of questions. This display also yielded the highest
performance on questions pertaining to inferences about the socialization of fish while those using the
Color GO answered comparisons about this fish characteristic better.
Participants' self-reported strategies were classified using the coding scheme from prior research:
AC acronym formation
CA categorical assignment
CL counting of letters on the display
CO colors used – observing those
LE letters of alphabet appearing on the display
ME memorized the information provided
RE repetition of the information provided
RL relationships – noting those evident
VC visualizing the chart
x no meaningful response
Certain participant responses in the present study seemed to be novel or qualitatively different
from responses collected in prior research. To accommodate these cases, several new codes were defined
for the present study:
GA game related
KW key words
PA patterns
RS rhyme or song
SA sound-alike words
By visual inspection, it was clear that the most popular strategies overall were “letters of the
alphabet” (LE) and “memorized the information provided” (ME); each of these strategies had fourteen
reported occurrences. The next most frequently occurring strategy was coded as “relationships – noting
those evident” (RL) with twelve reported instances. The number of reported metacognitive strategies
falling into these three categories comprised just over 57% of all reported strategies.
When considering the strategies with respect to treatments, we noted an interesting trend: as the
representative complexity of icons in a treatment increased, so did the number of reported metacognitive
strategies. That is, as icon complexity increased from Labels, to Size, to Color the total number of reported
strategies also increased, with 20, 22, and 28 strategies reported respectively.
As noted above, participants frequently reported using letters of the alphabet (strategy LE) as a
metacognitive strategy. Unlike the visualization and memorization strategies, the letters of the alphabet
strategy was distributed fairly evenly across treatments (4, 4, and 6 for Labels, Size, and Color
respectively). Representative participant comments included, “I matched the names of the fish together
using the first letters of each name…” or “I would use the first letter of each word of the different fish to
remember whether or not they were solitary, lived in small groups, or lived in a school of fish.”
The most striking outcome of the study was the generally better performance by participants in
answering questions involving inference making versus questions that entailed comparative judgments.
This contradicts the findings of Robinson and Schraw (1994) in which participants performed better using a
GO than just a text, but without the differences between comparative and inferential judgment expected by
the researchers. However, the better performance on inferences over comparisons that was observed in the
current study was not evident for all question types. As previously stated, participants were more capable in
making comparisons versus inferences when the question dealt with the size of fish. The fact that this was
especially true among those using the Size GO is intriguing. Conceivably, the capacity of a GO for making
a visual argument is enhanced when it employs retinal variables, such as size and value, that Bertin (1983)
claimed were more effective than others (e.g., color, shape) for representing information.
There are several areas of future research with graphic organizers that have emerged from the
current study. In this study we used a very short text and a corresponding GO that were fictitious; further
research needs to examine the effectiveness of GOs that are adjuncts for longer, more authentic discourse.
Another research issue pertaining to GOs is the timing of their use with respect to an accompanying text. In
most studies on GOs, participants view the display and text on separate occasions. We believe our present
study, in which participants were able to view the display at any time during the text reading, is more
ecologically valid, but does this practice lead to better performance? A similar approach using pop-up
computer graphics (Bétrancourt & Bisseret, 1998) has been shown to be superior to text with integrated
graphics; possibly a pop-up “GO on demand” would further improve the benefits gained from this class of
graphic displays. Undoubtedly, graphic organizers will continue to be a subject of study for years to come.
References
Baddeley, A. D., & Hitch, G. J. (1974). Working memory. In G. Bower (Ed.), Recent advances in learning
and motivation (Vol. 8, pp. 47–90). New York: Academic Press.
Bétrancourt, M., & Bisseret, A. (1998). Integrating textual and pictorial information via pop-up windows:
An experimental study. Behaviour & Information Technology, 17(5), 263-273.
Bertin, J. (1983). Semiology of graphics. Madison, WI: The University of Wisconsin Press.
Kealy, W. A. (2000). The role of semantic congruency in the design of graphic organizers. Quarterly
Review of Distance Education, 1(3), 205-214.
Kealy, W. A., Tada, N., & von Eberstein, A. (2000, November). Design and use of graphic organizers.
Paper presented at the National Reading Conference, Scottsdale, AZ.
Kulhavy, R. W., Lee, J. B., & Caterino, L. C. (1985). Conjoint retention of maps and related discourse.
Contemporary Educational Psychology, 10, 28-37.
Larkin, J. H., & Simon, H. A. (1987). Why a diagram is (sometimes) worth ten thousand words. Cognitive
Science, 11, 65-99.
Mayer, R. E. (2001). Multimedia learning. New York: Cambridge University Press.
Mayer, R. E., & Moreno, R. (2003). Nine Ways to Reduce Cognitive Load in Multimedia Learning.
Educational Psychologist, 38(1), 43–52.
Moore, D. W., & Readence, J. E. (1984). A quantitative and qualitative review of graphic organizer
research. Journal of Educational Research, 78, 11–17.
Paivio, A. (1986). Mental representations: A dual coding approach. New York: Oxford University Press.
Rieber, L.P. (1994). Computers, graphics, and learning. Madison, WI: Brown & Benchmark.
Robinson, D. H., & Schraw, G. (1994). Computational efficiency through visual argument: do graphic
organizers communicate relations in text too effectively? Contemporary Educational Psychology,
19(4), 399-415.
Robinson, D. H., Corliss, S. B., Bush, A. M., Bera, S. J., & Tomberlin, T. (2003). Optimal Presentation of
Graphic Organizers and Text: A Case for Large Bites? Educational Technology Research &
Development, 51(4), 5-41.
Robinson, D. H., Katayama, A. D., & Fan, A. C. (1996). Evidence for Conjoint Retention of Information
Encoded from Spatial Adjunct Displays. Contemporary Educational Psychology, 21, 221–239.
Robinson, D. H., Robinson, S. L., & Katayama, A. D. (1999). When words are represented in memory like
pictures: Evidence for spatial encoding of study materials. Contemporary Educational
Psychology, 24, 38-54.
Shah, P., & Hoeffner, J. (2002). Review of graph comprehension research: Implications for instruction.
Educational Psychology Review, 14(1), 47-69.
Snowman, J. (1986). Learning tactics and strategies. In G.D. Phye & T. Andre (Eds.), Cognitive classroom
learning: Understanding, thinking, and problem solving. Orlando, FL: Academic Press.
Spears, C., & Kealy, W. A. (2005, March). Do retinal variables enhance graphic organizers? Paper
presented at the meeting of the Southeastern Conference in Instructional Design and Technology,
Mobile, AL.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12,
257-285.
Wainer, H. (1992). Understanding graphs and tables. Educational Researcher, 21(1), 14-23.
Investigating Hispanic Pre-Service Teachers’ Self-Efficacy for
Technology Integration on the Subscale Level
Michael Sullivan
Cheng-Chang Pan
Juan Hinojosa
The University of Texas at Brownsville/Texas Southmost College
Background & Introduction
Course Design
The course was created three years ago in order to prepare pre-service teachers to be able to use a
variety of technology tools in both their undergraduate courses and in their student teaching field
experience. The course is not intended as a “one shot” injection of technology skills for future teachers.
The students are exposed to a variety of technology uses in subsequent coursework in the School of
Education at the university, and in their coursework in other campus schools and colleges.
The course objectives closely align with the Texas State Board of Educator Certification (SBEC)
Standards for what future educators need to know in order to be certified and to be effective classroom
teachers. SBEC has delineated 13 competencies for future educators that are labeled “Pedagogy and
Professional Responsibilities.” Number nine of those competencies addresses technology use in the
classroom and reads,
The teacher incorporates the effective use of technology to plan, organize, deliver, and
evaluate instruction for all students.
This statement (competency nine of the SBEC competencies for future teachers) is the terminal objective
for students in the course in which the following study took place.
Research Questions
1. How much does the interaction between Time and Semester affect Self-Efficacy for Technology
Integration on the subscale level?
2. To what extent do four personality traits affect Self-Efficacy for Technology Integration on the
subscale level?
3. What demographic variables affect Self-Efficacy for Technology Integration on the subscale
level?
Methodology
This quasi-experimental study explores changes in Hispanic pre-service teachers' self-efficacy
for classroom use of technology, at the subscale level, between the beginning and the end of each
semester during 2004 and 2005.
A total of 172 pre-service teachers participated in this study, yielding response rates of 90%
in Fall 2004, 100% in Spring 2005, and 100% in Summer I 2005. An in-class questionnaire adapted
from the literature was administered to measure Self-Efficacy for Technology Integration (Wang,
Ertmer, & Newby, 2004), personality traits from the Long and Dziuban Learning Style Inventory (as
cited in Bayston, 2002; Ouellette, 2000), and demographics (Pan, 2003), once during the week after
the add/drop deadline and again two weeks before the end of each semester. A sample item on the
Self-Efficacy for Technology Integration scale is "I feel confident about grading technology-based
projects." A sample personality trait on the Long and Dziuban Learning Style Inventory is "Phobic,"
which is described using the following pointers:
• Thinks of all possibilities and contingencies before venturing into activities,
• Is a "What if…" person,
• May see the negative side of things, and
• Is unwilling to take risks.
Using Principal Component Analysis and Varimax with Kaiser Normalization as the extraction method and
the rotation method in SPSS v.13, data reduction results showed that the Self-Efficacy for Technology
Integration scale consisted of three subscales: Self-Efficacy for Clinical Teaching (SECT), Self-Efficacy
for General Use (SEGU), and Self-Efficacy for Responsiveness (SER). Of all the 20 items, 12 were
clustered on SECT, 4 were on SEGU, and another 4 on SER. SECT deals with specific classroom settings
where technology is incorporated, like “I feel confident about selecting appropriate technology for
instruction based on curriculum standards.” SEGU is concerned with general technology integration
situations, like “I feel confident that I can maximize computer capabilities in my classroom.” SER
addresses items pertaining to learning contexts for technology responsiveness purposes, such as “I feel
confident I can be responsive to students' needs during computer use." These three subscales
accounted for 32%, 18%, and 17% of the total variance, respectively. For internal consistency,
Cronbach's alpha was calculated for each of the six datasets (two administrations by three
semesters), with values ranging from .76 to .94, suggesting that each latent factor was well
manifested through its contributing items. Independent variables entailed time, personality trait, and
demographics variables (categorical data); dependent variables included SECT, SEGU, and SER (interval
data).
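For readers who wish to approximate this kind of data reduction outside SPSS, the following sketch shows, under stated assumptions, how a principal-component extraction with varimax rotation and a Cronbach's alpha check might be carried out in Python with NumPy alone. The item matrix, the random placeholder responses, and the three-factor solution are illustrative assumptions, not the study's actual data or analysis code.

import numpy as np

def cronbach_alpha(x):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    x = np.asarray(x, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_var / total_var)

def varimax(loadings, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a (p items x k factors) loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    criterion = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Kaiser's varimax criterion, maximized via singular value decomposition
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3 - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p)
        )
        rotation = u @ vt
        new_criterion = s.sum()
        if criterion != 0 and new_criterion / criterion < 1 + tol:
            break
        criterion = new_criterion
    return loadings @ rotation

# items: hypothetical 20-item, 5-point responses, one row per pre-service teacher
rng = np.random.default_rng(0)
items = rng.integers(1, 6, size=(172, 20)).astype(float)

# Principal components of the item correlation matrix
corr = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = 3                                   # SECT, SEGU, SER in the study
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
rotated_loadings = varimax(loadings)

print("proportion of variance explained:", eigvals[:n_factors] / eigvals.sum())
print("alpha over all items:", round(cronbach_alpha(items), 2))

In practice the placeholder matrix would be replaced with the 172-by-20 matrix of questionnaire responses, and alpha would be computed separately for the items loading on each subscale.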
Results
Question 1 How much does the interaction between Time and Semester affect Self-Efficacy for
Technology Integration on the subscale level?
Thirty cases from each of the six datasets were randomly selected and merged into a separate
dataset, named D1, with a total of 180 cases.
Using SECT as the dependent variable, a two way ANOVA with Time and Semester as the two
levels was performed. No statistically significant interaction effect between Time and Semester was found,
F(2, 174) = .74, p = .49. No statistically significant difference among the group means for Semester was
found, F(2, 174) = .47, p = .63. However, a statistically significant difference between the group means for
Time One and Time Two was found suggesting that our data are unlikely, assuming that the null hypothesis
is true, F(1, 174) = 45.30, p < .001. We therefore reject the null hypothesis in favor of the alternative which
states that a difference exists between the Time means in the population (R² = .20).
Using SEGU as the dependent variable, a two way ANOVA with Time and Semester as the two
levels was performed. No statistically significant interaction effect between Time and Semester was found,
F(2, 174) = 1.81, p = .17. No statistically significant difference among the group means for Semester was
found, F(2, 174) = .46, p = .63. However, a statistically significant difference between the group means for
Time One and Time Two was found suggesting that our data are unlikely, assuming that the null hypothesis
is true, F(1, 174) = 37.96, p < .001. We therefore reject the null hypothesis in favor of the alternative which
states that a difference exists between the Time means in the population (R² = .18).
Using SER as the dependent variable, a two way ANOVA with Time and Semester as the two
levels was performed. No statistically significant interaction effect between Time and Semester was found,
F(2, 174) = .59, p = .56. No statistically significant difference among the group means for Semester was
found, F(2, 174) = .34, p = .71. However, a statistically significant difference between the group means for
Time One and Time Two was found suggesting that our data are unlikely, assuming that the null hypothesis
is true, F(1, 174) = 25.82, p < .001. We therefore reject the null hypothesis in favor of the alternative which
states that a difference exists between the Time means in the population (R² = .13).
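A minimal sketch of how these two-way ANOVAs could be reproduced in Python with statsmodels is given below; the simulated data frame and the column names (SECT, Time, Semester) are assumptions for illustration rather than the authors' actual analysis, which was run in SPSS.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical stand-in for the merged 180-case dataset (D1) described above
rng = np.random.default_rng(1)
d1 = pd.DataFrame({
    "Time": np.repeat(["Time1", "Time2"], 90),
    "Semester": np.tile(np.repeat(["Fall04", "Spring05", "Summer05"], 30), 2),
    "SECT": rng.normal(loc=3.5, scale=0.6, size=180),
})

# Two-way ANOVA: Time (2 levels) x Semester (3 levels) on the SECT subscale
model = ols("SECT ~ C(Time) * C(Semester)", data=d1).fit()
print(sm.stats.anova_lm(model, typ=2))   # F and p values for Time, Semester, and the interaction
print("R squared:", round(model.rsquared, 2))

The same formula would be refit with SEGU and SER as the outcome to reproduce the remaining two analyses.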
Question 2 To what extent do four personality traits affect Self-Efficacy for Technology Integration
on the subscale level?
Four personality traits from the Long and Dziuban Learning Style Inventory (Phobic, Obsessive,
Impulsive, and Hysteric) were treated as dichotomous independent variables. Dependent variables
entailed the SECT, SEGU, and SER scores.
A t-test for independent samples was conducted (see Table 1).
Table 1 T-Test for Self-Efficacy for Technology Integration on the Subscale Level by Personality Trait
Phobic Obsessive Impulsive Hysteric
Self-Efficacy df t df t df t df t
Time 1
SECT 169 -2.22* 167 1.90 167 -1.34 166 -1.62
SEGU 169 -2.46* 167 1.75 167 -1.23 166 -2.10*
SER 169 -2.34* 167 1.54 167 -1.82 166 -1.27
Time 2
SECT 162 -1.60 160 1.11 160 -.137 160 -1.37
SEGU 162 -.10 160 .77 160 -.70 160 -.87
SER 162 -.11 160 1.20 160 -1.69 160 -1.42
Note. *p < .05.
Question 3 What demographic variables affect Self-Efficacy for Technology Integration on the
subscale level?
Four dichotomous variables were treated as independent variables: Sex (male vs. female), Work
(no fewer than 20 hours vs. no more than 20 hours), PC Use (more than six years experience vs. no more
than six years experience), and Internet Access (yes vs. no). Dependent variables entailed SECT, SEGU,
and SER.
A t-test for independent samples was conducted (see Table 2).
Table 2 T-Test for Self-Efficacy for Technology Integration on the Subscale Level by Demographics
Sex Work PC Use Internet Access
Self-Efficacy df t df t df t df t
Time 1
SECT 167 .99 170 -.08 167 -3.27** 26.82 1.67ª
SEGU 167 .48 170 .37 167 -3.13** 26.66 1.34 ª
SER 167 .43 170 .18 140.1 -2.63ª** 167 1.69
Time 2
SECT 161 -.21 161 -1.45 161 -4.06*** 161 1.57
SEGU 161 -.03 161 -1.3 161 -4.78*** 161 2.35*
SER 161 -1.07 161 -2.12* 161 -4.45*** 161 1.77
Note. ªThe assumption of equal variances was not met.
*p < .05. **p < .01. ***p < .001.
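The independent-samples t-tests summarized in Tables 1 and 2 could be approximated with SciPy roughly as follows; the simulated data frame, the grouping column, and the group labels are hypothetical, and Welch's correction is applied when Levene's test rejects equal variances, mirroring the note to Table 2.

import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical Time 1 responses; group labels and scores are invented for illustration
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "pc_use": np.repeat(["over_6_years", "6_years_or_less"], 85),
    "SECT": rng.normal(3.6, 0.5, 170),
    "SEGU": rng.normal(3.4, 0.5, 170),
    "SER": rng.normal(3.5, 0.5, 170),
})

for subscale in ["SECT", "SEGU", "SER"]:
    group_a = df.loc[df["pc_use"] == "over_6_years", subscale]
    group_b = df.loc[df["pc_use"] == "6_years_or_less", subscale]

    # Check the equal-variance assumption, then choose the pooled or Welch's t-test
    equal_var = stats.levene(group_a, group_b).pvalue >= .05
    t, p = stats.ttest_ind(group_a, group_b, equal_var=equal_var)
    print(f"{subscale}: t = {t:.2f}, p = {p:.3f}, equal variances assumed: {equal_var}")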
Conclusions
In attempt to investigate the interaction effect between Time One and Time Two across the three
semesters and its impact on the students’ Self-Efficacy for Clinical Teaching, we were not able to find any
significant interaction between the two time occasions and the three semesters. However, the results
showed that Self-Efficacy for Clinical Teaching on Time One is different from that on Time Two,
suggesting that these student teachers’ confidence in applying technology into specific classroom settings
increased upon completion of the mandated computer literacy course. A similar result was also found for
both Self-Efficacy for General Use and Self-Efficacy for Responsiveness. However, we did not find
differences among the three semesters in students' confidence in using technology in clinical teaching
(SECT), in general settings (SEGU), or in responding to student needs (SER). Together these results
suggest that (a) students enrolled in Fall 2004, Spring 2005, and Summer 2005 did not differ from one
another in the three areas of confidence, and (b) students completing the course in the three semesters did
not differ from one another in their confidence in the three areas. The former may be because education
majors tend to be homogeneous in terms of their self-confidence in instructional use of technology. The
latter may be because the design and implementation of the course have been carried out consistently. As a matter
of fact, these cross-semester datasets were collected in nine course sections instructed by one experienced
field-based instructor. Our interview with the instructor indicated that the instructor managed to encourage
a risk-free learning environment. Students in his class received both individual attention from the instructor
and personal assistance from their peers.
Using a t-test for independent samples, we were able to examine the mean differences in SECT,
SEGU, and SER between student groups for each personality trait at each time point and in each semester.
Results showed that (a) at the beginning of the class, students with a Phobic type of personality tended to be
less confident than their counterparts in all three areas, (b) at the beginning of the course, students with a
Hysteric type of personality tended to be less confident than their counterparts in general use of technology in
the classroom setting (SEGU), and (c) at the end of the class, neither of these differences was found.
Students who were not as adventurous as others appeared to have less confidence when it comes to
integrating technology in their future classrooms. After the treatment or intervention, i.e., the computer
literacy course, students’ overall confidence level seemed to increase to an extent where no significant
difference between the two groups of student teachers was detected. A similar development may have
occurred to students with a Hysteric type of personality. Upon completion of the mandatory technology
literacy course, Hysteric type students (who tend to lose control of their emotional boundaries more easily
or are more likely to overreact in general than others) may have found a way to control their emotion and
mood and to reduce the chances of allowing their personality to affect their learning. The instructor
commented, “I first calmed them [being Hysteric type of personality] down and told them this is not the
end of the world. I also said the same thing via the email..." As for why this personality type
(Hysteric) did not affect the other two confidence areas, i.e., SECT and SER, in the first place, it may
be that the strengths of a hysteric learner, e.g., thinking creatively and artistically, were reinforced
during the course, which, in turn, increased their confidence in general use of technology close to the end of
the course. The technology course provided opportunities for students to demonstrate their creativity and
artistic talent in course projects and assignments, e.g., newsletters and flyers.
Question Three deals with four demographic variables that may conceal moderating
information about these pre-service teachers' confidence in using computer technologies in specific and
general classroom settings for teaching and motivation purposes. The four dichotomous variables were Sex,
Work, PC Use, and Internet Access. Using a t-test for independent samples, we found that pre-service
teachers who worked no fewer than 20 hours a week did not express as much confidence as their
counterparts in the use of technologies to respond to their future students' needs and to motivate them. This
lack of confidence may have been due in part to the fact that these future teachers and at least part-time
(often full time) employees may have perceived that their efforts were simply spread “too thin” for them to
feel that they were delivering to the best of their ability. Thus, they might not feel confident enough to
ascertain that they were able to spend enough effort or time motivating their future students with the
technologies in addition to using those technologies to teach (a minimum requirement). Furthermore,
students with over six years of computer experience seemed to report a higher confidence level in the three
self-efficacy areas when compared to students with no more than six years of experience using a PC.
This difference may gradually dissolve as computers become more widespread in the Lower Rio Grande
Valley and PC use receives more attention. A continuing study of computer use and self-efficacy for
technology integration is recommended. Another significant finding was that pre-service teachers with
Internet access where they studied tended to have a higher confidence level in using technologies in the
classroom setting in general. This may be attributable to the course design, which required a large amount
of Internet use in the course projects. In fact, the course syllabus revealed that course projects and assignments involved
Internet search for information like acceptable technology use, examples of technology’s impact on society
as a whole, and assistive technology for students with special needs. Most importantly, Blackboard course
management system was adopted in these nine sections of the course. Lastly, Sex was not found to be a
moderating factor in any of the three self-efficacy areas at either Time One or Time Two, suggesting that the
confidence levels of males and females did not differ to a significant degree. Aside from the longitudinal
study previously suggested, a cross-tabulation of demographics and personality traits may help explain
some blind spots in this study; to name one, student participants possess more than one personality trait.
The six datasets were gathered in one university setting, so generalizability of the results is limited to
similar settings.
References
Bayston, T. E., Jr. (2002). A study of reactive behavior patterns and online technological self-efficacy.
Unpublished doctoral dissertation, University of Central Florida, Orlando.
Ouellette, R. (2000). Learning styles in adult education. Retrieved October 1, 2005, from
http://polaris.umuc.edu/~rouellet/learnstyle/learnstyle.htm
Pan, C. (2003). System use of WebCT in the light of the technology acceptance model: A student
perspective. Unpublished doctoral dissertation, University of Central Florida, Orlando.
Wang, L., Ertmer, P. A., & Newby, T. J. (2004). Increasing preservice teachers’ self-efficacy beliefs for
technology integration. Journal of Research on Technology in Education, 36(3), 231-250.
A Preliminary Study of the Uses and Effects of Mobile Computing
Devices in K-8 Classrooms
Karen Swan
Mark van ‘t Hooft
Annette Kratcoski
Kent State University
Darlene Unger
Virginia Commonwealth University
Abstract
This preliminary study employed mixed methodologies to explore students’ use of mobile
computing devices and its effects on their conceptual understanding, motivation to learn, and
engagement in learning activities. Data were collected from students in four elementary and two 7th
grade science classes in Northeast Ohio. Data collected included usage logs, student work samples,
student and teacher interviews, and classroom observations of selected special needs students.
Findings highlight the personalization of learning afforded by such devices both in terms of individuals
and individual classroom cultures, as well as their usefulness in extending learning beyond the
classroom. They also suggest that the use of mobile computing devices may help lessen the gap in
academic achievement between special needs and regular students.
Objectives
The purpose of this study was to explore the uses elementary and middle school students make of
mobile computing devices both within and outside of the classroom, and the effects such usage has on their
achievement and motivation to learn. A particular focus of the study was the use of such devices by special
needs students and the effects that might have on their achievement relative to other class members and on
their motivation to learn and engagement in academic tasks.
stigmatizing to individuals with disabilities (Furniss, 2001). Furthermore, the use of mobile computing
devices by students with disabilities has the potential to increase students’ independence and serve as a
motivational tool for completing academic work they view as cumbersome, challenging, or not engaging.
Academic research on the impact of handheld computers on teaching and learning is still relatively
scarce; research on mobile computing devices like DanasTM is virtually nonexistent. This preliminary
research study was designed to begin to explore such use and its effects by asking the following questions:
• How do students (especially those with special needs) use mobile devices to support learning?
• How does the use of mobile devices affect their motivation to learn and engagement in learning?
• How does student use of mobile devices support their learning, especially their conceptual
understanding?
Methodology
Data were collected from two sites and two sets of subjects. The first site was a technology-rich
laboratory classroom at a state university where local teachers bring their classes every day for six weeks
to complete particular units of study. Classes and subjects involved in this study included one sixth grade
class (28 students, 6 special needs), two fourth grade classes (41 students, 6 special needs), and one third
grade class (16 students, 5 special needs). All students in these classes were given mobile devices which
they were allowed to use anywhere and anytime for a six-week period. The second site was a suburban
middle school where students in two (out of five) seventh grade science classes were given mobile
devices to use in science and to take with them for approximately half the year. The classes given mobile
devices were chosen for their high concentration of special needs students. Of the 50 students in these
classes, 17 were identified as having special needs (34%).
Data collected from all classes included lesson plans, device usage data (collected using
RubberneckTM from GoKnow®), work samples (collected using PAAMTM from GoKnow®), and teacher
and student interviews. Usage data were collected from all students for whom they were available (equipment
failures made some data inaccessible and some students did not use the devices). These data were
converted to usage in minutes per week and frequencies were compared across applications and classes.
Student work samples were obtained from four students in each class identified by their teachers as
special needs or as high, medium, and low achieving. The work was analyzed for conceptual
understanding based on a framework developed by Newmann (Newmann & Wehlage, 1995; Newmann,
Bryk & Nagaoka, 2001), which focuses on students’ use of analysis skills, their depth of understanding,
and their ability to communicate their understanding of material learned. Work samples were also
compared within classes. Teachers and students in all classes were interviewed with regards to students’
use of the mobile tools and their effects on learning and motivation. Data were analyzed qualitatively
using a constant comparison approach to detect emergent themes (Glaser, 1978). In addition, eight
special needs students in the middle school science classes were selected for behavioral observations
using the Behavioral Observation of Students in Schools (BOSS) protocols (Shapiro, 2004). Five
observations were completed before the mobile computers were introduced into the classes and four
were completed during classes when the mobile devices were used, allowing for comparisons based on
technology use.
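As a rough illustration of the usage-log processing described above, the pandas sketch below totals logged minutes per student, application, and week and then averages them by class; the column names and the invented rows are assumptions about how the exported usage data might be structured, not the actual Rubberneck schema.

import pandas as pd

# Invented usage-log rows standing in for an export of the logged device sessions
logs = pd.DataFrame({
    "class_id":    ["6th", "6th", "4thA", "4thA", "4thB", "7th"],
    "student_id":  ["s01", "s01", "s14", "s15", "s22", "s40"],
    "application": ["word processor", "drawing", "word processor",
                    "organizer", "word processor", "note taking"],
    "timestamp":   pd.to_datetime(["2003-11-03", "2003-11-05", "2003-11-03",
                                   "2003-11-04", "2003-11-06", "2003-11-07"]),
    "minutes":     [25, 10, 40, 15, 30, 20],
})

# Total logged minutes per student, application, and week
logs["week"] = logs["timestamp"].dt.to_period("W")
per_week = (logs.groupby(["class_id", "student_id", "application", "week"])["minutes"]
                .sum()
                .reset_index())

# Average minutes per week per application, compared across classes
summary = per_week.groupby(["class_id", "application"])["minutes"].mean().unstack()
print(summary.round(1))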
Results
Mobile Device Use
All teachers in the study introduced their students to the use of the mobile devices, required their
use for specific assignments (Table 1), and encouraged students to use them as needed both in class
and outside of it. However, between October 2003 and March 2004 a total of 27 mobile devices (26.5% of
the total number in the study) were returned for exchange or repair and two were repaired on-site. The
teacher at the middle school experienced this as a major problem. Consequently, in February she made use
of the devices optional rather than required for her students. Despite the technical issues, most students
continued to use the mobile devices, but such usage was not incorporated into regular class lessons. While
equipment failure was less of a problem in the laboratory classroom because of on-site technical support,
teachers there did not make as much specific use of the mobile devices because of the ready availability of
desktop computers and other technologies (1:1).
Table 1 Required Uses of Mobile Devices by Grade Level
Grade   Required uses
6       note-taking, journaling, first draft of autobiography
4       note-taking
4/3*    note-taking, journaling, worksheet
*classes worked on same unit
The most striking characteristic of the usage data is its variability, both between individuals and
between classes, which highlights the ways in which mobile devices were appropriated by individual
students and student cultures to personalize learning (Table 2). Notice for example the significant
differences between the two fourth grades and between fourth and third grade students given the exact
same assignments. This finding is supported by the student interviews. Over three-quarters of the students
interviewed reported using their mobile devices outside of the classrooms in which they were explicitly
assigned -- in other classes, at home, on the bus, and in after-school programs. Students in the middle
school classes mostly reported using their mobile devices for note-taking, while elementary students said
they used them for a variety of writing activities, noting that they preferred using the mobile devices to
writing by hand. Many students also reported that they found the mobile computers to be most useful for
various types of organizational activities (scheduling, creating “to-do” lists, outlining ideas). Students also
reported enjoying the use of drawing programs and games.
Table 3 compares usage by gender and by whether or not students were identified as special needs
across classes. Again, these data indicate the variability between classroom cultures but do show a tendency
for girls to use the mobile devices more than boys, although these collapsed data do not show the enormous
variability between individuals and across applications. The data also reveal a disappointing, slightly lower
use of the mobile devices among special needs students in all but the seventh grade. Interestingly, the
collapsed data here conceals greater usage among special needs students of drawing and applications
programs, a finding that perhaps deserves further investigation.
In a nutshell, our preliminary findings indicate that the use of mobile computing devices extends
learning outside the confines of the classroom, and that individuals adapt the use of mobile computing
devices to their own needs. Special needs students made greater use of drawing and organizational
applications on their devices than did regular students. The data further suggest unique cultures of use
evolved within classes and groups within classes, indicating higher levels of personal appropriation.
Findings also indicate that mobile computing devices were used most often for writing activities and that in
most cases students favored such use over pencil and paper. The data also show that girls tended to use
mobile computing more than boys, and that regular students used them more than special needs students,
although these findings seem related to cultures of use that evolved among particular students and
subgroups.
Motivation
Teachers also stated that they believed that students’ motivation was improved by the use of
mobile devices. Most teachers also commented on increased student productivity as a result of this
technology use. For example, the sixth grade teacher commented that taking the mobile devices home
resulted in everyone’s homework always being done. Behavioral observations of selected special needs
students support these findings and suggest that increased conceptual understanding and productivity may
be related to increased engagement resulting from the use of the mobile devices. Comparing data across
the individual students observed, the percentage of off-task behaviors was markedly reduced when the
mobile devices were used. Not only were the behaviors reduced, they were nearly eliminated, with
off-task behaviors accounting for less than 5% of the total
behaviors observed for each student. Interestingly, the findings indicate that students displayed more
actively engaged behaviors during the classes in which they were instructed to use their mobile devices,
regardless of whether students were actually using the tool during the entire class period. In other words, if
the mobile computers were on the students’ desks and they were using them intermittently during class the
students were more actively engaged with class lecture and activities.
It should be noted that for many of the special needs students observed, wide variations in the
percentages of actively engaged behaviors were recorded across the total time period of observations
during which mobile computing devices were used. For instance, four students with special needs
demonstrated high percentages of engaged behaviors during four of the five observation periods but during
one class period the percentage of engaged behaviors dropped noticeably. For three of the four students,
however, this phenomenon occurred during the same class period, suggesting that this was probably a
localized event. Nonetheless, future investigations should look more closely at classroom activity structures
to explore events which both trigger disengagement and sustain engagement among special needs
populations.
Learning Support
Most of the teachers interviewed discussed improvements in student work as a result of mobile
computing use, focusing on increased productivity, improvements in writing (such as spelling and
mechanics), and enhanced student approaches to the writing process. For example, one teacher remarked,
“The [mobile computing devices] shortened the time frame for getting work done. Having the
[mobile computing devices] also improved the writing of all students.”
Another teacher commented that the use of the devices resulted in noticeable improvements in both writing
and peer editing. She stated,
“The biggest change has been in their weekly journals. We have been journaling all year and
they have always written them but in using the [mobile computing devices] , peer editing
takes on so much more meaning when they can beam to someone rather than trading papers.
With the [mobile computing devices] they are editing their own writing more and it keeps
getting better.”
As previously noted, students in the laboratory classes reported that they preferred using mobile
computers over writing things by hand and that using mobile computing for writing assignments made the
work “easier” and “more fun”. The majority of students in these classes also stated that they thought their
written work in particular improved as a result of their use of the devices. For example, one student stated,
“My writing is poor and the [mobile computing devices] makes it easier to read my writing.”
Many of the teachers interviewed also commented on ways in which the use of the mobile
computers seemed to lessen the gap in academic achievement between regular and special needs students.
For example, one teacher stated,
“Having the [mobile computing devices] improved the writing of all students but special
education students in particular;”
while another noted that,
These observations are supported by work samples obtained from 3-4 targeted students per class,
representing high, average and lower ability levels, and when available, special needs students, and
analyzed for evidence of conceptual understanding (on a scale of 3-12, with 12 being the highest). Across
elementary classes, the artifacts averaged a score of 10.0 for higher ability students, 9.4 for students of
average ability, 8.5 for lower ability students, and 9.3 for students with special needs. Across seventh grade
science classes given mobile devices, student work samples averaged 7.2 for high ability students, 5.5 for
average ability students, 4.0 for lower ability students, and 4.7 for special needs students.
Interviews with students also corroborate findings that the use of mobile computing devices may
enhance student learning. Fifteen of the eighteen middle school science students interviewed stated that
they believed their use of mobile computing tools helped them in their school work. These students
particularly noted their use for taking notes, test review, calculations, and the ways in which keeping their
work on the mobile computing devices helped them stay organized. Indeed, all students interviewed
seemed to view mobile computing devices as a tool that could help them with their school work. This
aspect of the use of such devices surely deserves further investigation.
In summary, findings from this preliminary study provide some indication that the use of mobile
computing devices can improve student learning. In particular, they provide evidence of higher levels of
conceptual understanding among students using mobile computing when assignments elicit it. Perhaps
more importantly, the results suggest a lessening of the gap in conceptual understanding between regular
and special needs students using mobile computing devices. Interviews with teachers suggest that the use of
mobile computing resulted in greater productivity and improved writing skills among their students.
Teacher interviews also suggest that mobile computing devices may provide increased support for the
writing process. Interviews with students suggest that students likewise view mobile computing devices as
a tool which can help them with their schoolwork. These findings surely deserve further investigation and
moreover should inform future research.
Educational Significance
In conclusion, this preliminary investigation on the use of mobile computing devices shows that
elementary and middle school students use them in a variety of ways, principal among these writing, both
in and outside of class. The findings also suggest both the personalization of learning supported by such
devices and their usefulness in extending learning beyond the classroom. They also hint at the influence of
classroom cultures on such use. In addition, the results indicate that use of mobile computing devices may
increase students’ motivation to learn and their engagement in learning activities, especially among special
needs students. Indeed, they suggest that the use of mobile computing devices may help lessen the gap in
academic achievement between special needs and regular students. This is an important finding which
clearly deserves further investigation.
References
Bigge, J.L., Best, S.J., & Heller, K.W. (2001). Teaching Individuals with Physical, Health, or Multiple
Disabilities (4th Ed.). Upper Saddle River, NJ: Prentice-Hall.
Davis, D. K., Stock, S. E. & Wehmeyer, M. (2002). Enhancing independent task performance for
individuals with mental retardation through the use of a handheld self-directed visual and audio
prompting system. Education and Training in Mental Retardation and Developmental
Disabilities, 37(2), 209-218.
Elias, M.J., & Friedlander, B.S. (2001). Engaging the Resistant Child through Computers: A Manual to
Facilitate Social Emotional Learning. NY: National Professional Resources.
Furniss, F. (2001). VICAID: Development and evaluation of a palmtop-based job aid for workers with
severe developmental disabilities. British Journal of Educational Technology, 32(3), 277-288.
Glaser, B. (1978). Theoretical sensitivity: Advances in the methodology of grounded theory. Mill Valley, CA:
Sociology Press.
Newmann, F. M., Bryk, A. S. & Nagaoka, J. K. (2001). Authentic intellectual work and standardized tests:
Conflict or coexistence? Chicago, IL: Consortium on Chicago School Research.
Newmann, F.M., & Wehlage, G.G. (1995). Successful school restructuring: A report to the public and
educators. Madison, WI: Document Service, Wisconsin Center for Educational Research.
Roschelle, J. & Pea, R. (2002) A walk on the WILD side: how wireless handhelds may change computer-
supported collaborative learning. International Journal of Cognition and Technology, 1 (1), 145-
272.
Shapiro, E. S. (2004). Direct observation: Manual for the Behavioral Observation of Students in Schools
(BOSS). In E. S. Shapiro (Ed.), Academic skills problems workbook (pp. 31-46). New York, NY:
Guilford Press.
Smith, L., Beard, L., & Ezell, D. (2003). AlphaSmart: Useful technology for Students with Disabilities.
Retrieved August 27, 2003, from http://www.uncw.edu/cte/et/Resnotes/Smith/.
Soloway, E., Norris, C., Blumenfeld, P., Fishman, B., Krajcik, J., & Marx, R. (2001). Devices are ready-at-hand.
Communications of the ACM, 44(6), 15-20.
Tinker, R., & Krajcik, J. (Eds.). (2001). Portable Technologies: Science Learning in Context. New York:
Kluwer Academic/Plenum Publishers.
Vahey, P., & Crawford, V. (2002). Palm Education Pioneers Program: Final Evaluation Report. Menlo
Park, CA: SRI International.
van ‘t Hooft, M., Diaz, S., & Swan, K. (2004). Examining the potential of handheld computers: findings
from the Ohio PEP project. Journal of Educational Computing Research, 30(4), 295-311.
Zhang, Y. (2000). Technology and the writing skills of students with learning disabilities. Journal of
Research on Computing in Education, 32(4), 467-479.
The Changing Nature of Learning in a Ubiquitous Computing
Classroom
Karen Swan
Annette Kratcoski
Yi Mei Lin
Jason Schenker
Mark van ‘t Hooft
Kent State University
Abstract
This paper reports on preliminary findings from an ongoing study of teaching and learning in a
ubiquitous computing classroom. This study employs multiple measures and mixed methods to document
changes in teaching and learning that result when teachers and students have access to a variety of digital
devices wherever and whenever they need them. It identifies ways in which ubiquitous computing
environments can support both individual (conceptualizations) and social (uses) construction of
knowledge. In particular, it explores the role that the unique representations of knowledge afforded
through the use of a variety of ready-at-hand digital devices can play in supporting and bridging private
and public knowledge construction.
Background
The term “ubiquitous computing” was introduced by Mark Weiser (1991) who wrote, “The most
profound technologies are those that disappear. They weave themselves into the fabric of everyday life
until they are indistinguishable from it” (p. 94). He envisioned computers embedded in the environments
we inhabit – in walls, chairs, clothing, light switches, appliances, everything -- and connected to each other
and the world through wireless communication. Alan Kay (in Johnstone, 2003), for example, argued as
early as the late 1970s that computing would only make a difference in people’s lives if it were to become
universally available, which he equated with affordable and portable. He thus envisioned a handheld,
notebook-sized computer for kids. Seymour Papert (1980) similarly predicted “a massive penetration of
powerful computers into people’s lives” (p. viii), and with it a paradigm change in teaching and learning.
Papert’s vision focused on 1:1 computing and learner-centered environments in which children
programmed computers rather than being programmed by them.
It is important to note that while these three early visions of ubiquitous computing all view it as
having the potential to induce paradigm change in education on the scale of that resulting from the
introduction of printing, the three visions are quite different in focus. Weiser saw ubiquitous computing as
involving many computers serving each individual and embedded in inhabited environments, whereas Kay
envisioned mobile computers that could be carried into environments, and Papert saw one to one
computing as the key element regardless of devices. It is also important to note that all three visionaries
were writing before the World Wide Web was introduced, radically changing, according to Chris Dede
(2000), the way that teachers and students think about learning with technology, and the possibilities
inherent in ubiquitous computing (McClintock, 1999).
In our work, we view ubiquitous computing as encompassing all three notions of ubiquitous
computing as well as the importance of connectivity via the WWW. We view ubiquitous computing
environments as learning environments in which all students have access to a variety of digital devices,
including computers connected to the Internet and mobile computing devices, whenever and wherever they
need them. Our notion of ubiquitous computing, then, is more focused on many to many than one to one,
and so includes the idea of technology which is always available but not itself the focus of learning.
Although ubiquitous computing research has involved differing technological implementations –
1:1 computing (Apple Computer, 1995; Honey & Henriquez, 2000); laptop computers (Stevenson, 1998;
Ricci, 1999; Bartels & Bartels, 2002; Hill, Reeves, Grant, Want & Han, 2002; Rockman, 2003; Silvernail
& Lane, 2004; Zucker & McGhee, 2005), handheld computing (Robertson, Calder, Fung, Jones, O’Shea &
Lambrechts, 1996; Inkpen, 2001; Sharples, 2002; Vahey & Crawford, 2002; Roschelle, 2003; Norris &
Soloway, 2004; Roschelle, Penuel & Abrahamson, 2004) – and so perhaps differing visions, the changing
nature of teaching and learning in ubiquitous computing environments that is therein documented appears
relatively consistent across implementations.
Across implementations, for example, researchers have found much greater use of Internet
resources (Honey & Henriquez, 2000; Hill, et al., 2002; Zucker & McGhee, 2005) and significantly more
presentations communicating findings (Honey & Henriquez, 2000; Hill, et al., 2002). They have found a
much greater variety of representations being used to explore, create and communicate knowledge (Apple
Computer, 1995; Honey & Henriquez, 2000; Bartels & Bartels, 2002; Danesh, Inkpen, Lau, Shu & Booth,
2001; Hill, et al., 2002; Roschelle, et al., 2004), including the use of a much wider variety of visual
representations, spreadsheets and databases, simulations, and exploratory environments.
Perhaps as a result, researchers are also documenting changes in interactions among students and
between students and teachers (Apple Computer, 1995). They find that learning is becoming more efficient
(Apple Computer, 1995; Hill, et al., 2002) and that students are becoming “experts” on particular topics
(Apple, 1995; Norris & Soloway, 2004). In addition, researchers note significant increases in
collaboration, among students and between students and teachers, in ubiquitous computing classes (Apple
Computer, 1995; Robertson, Calder, Fung, Jones, O'Shea, & Lambrechts, 1996; Hennessey, 2000;
Sharples, 2000; Roschelle & Pea, 2002; Vahey & Crawford, 2002; Norris & Soloway, 2004).
Happily, researchers are also documenting positive effects of ubiquitous computing on students.
They are finding improved motivation (Apple Computer, 1995; Ricci, 1999; Vahey & Crawford, 2002;
Zucker & McGhee, 2005); engagement (Silvernail & Lane, 2004; Zucker & McGhee, 2005); behavior
(Apple Computer, 1995), and school attendance (Apple Computer, 1995; Stevenson, 1998) among students
involved in ubiquitous computing initiatives. In addition, research shows such students are better
organized (Ricci, 1999; Zucker & McGhee, 2005) and more independent learners (Apple Computer, 1995;
Zucker & McGhee, 2005). Perhaps more importantly, researchers have documented increased media
literacy (Hill, et al., 2002; Rockman, 2003), improved writing (Apple Computer, 1995; Ricci, 1999; Vahey
& Crawford, 2002; Rockman, 2003), and, in some cases, increased scores on standardized tests (Stevenson,
1998; Honey & Henriquez, 2000). In addition, researchers are finding that ubiquitous computing “levels
the playing field” for special needs and lower ability students (Stevenson, 1998; Honey & Henriquez, 2000;
Hill, et al., 2002).
researchers have developed a model, grounded in their SBCAC experiences, that locates such effects in
three broad areas: in the ready availability in ubiquitous computing environments of a wide variety of
external, material representations of knowledge; in the particular supports ubiquitous computing provides
for individual students’ internal conceptualization and construction of knowledge; and in the unique social
interactions and shared uses of knowledge ubiquitous computing enables, through and around which
knowledge is constructed (Swan, Kratcoski, Diaz, van ‘t Hooft & Juliana, 2004). We use the terms
“representations,” “conceptualizations,” and “uses” respectively to distinguish these domains, and view
them as interacting and interdependent in their effects. Distinguishing them also allows us to refine our
investigation into the effects of ubiquitous computing to exploring its effects in these three domains.
Accordingly, the study reported in this paper was thus designed to explore the following research
questions:
What kinds of external representations of knowledge do teachers and students employ to support
learning when they have ubiquitous access to a variety of digital computing devices?
(How) does such ubiquitous access affect students’ conceptualizations of knowledge?
(How) does ubiquitous access affect the ways students use knowledge and the social interactions around
which knowledge is constructed?
Methodology
Subjects and Setting
The Research Center for Educational Technology (RCET) is located at Kent State University in
northeast Ohio. Each year RCET brings eight local teachers and their classes to spend half their day every
day for six weeks in Kent’s SBC Ameritech Classroom (SBCAC). Participating teachers are chosen
through a selection process which begins with nomination by their principals. Teachers are chosen by
RCET staff based on the quality of their teaching and their fit with RCET research interests. They are
not chosen for technology integration experience. The study reported in this paper is based on classes
visiting the SBCAC in the 2003/04 and 2004/05 school years. Table 1 below shows the classes that came
to the SBCAC in the 2003/04 and 2004/05 cohorts. Participating teachers included three men and nine
women ranging in age from their late twenties to late forties. Participating classes explored regular
curricular subjects in integrated units that focused on topics in the areas of English language arts, science,
and mathematics. Classes ranged from 14 to 27 in number of students, with approximately equal numbers
of boys and girls. All classes except the seventh graders (who were selected to fit with scheduling needs)
were regular, intact classes. Students in all classes ranged in ability levels and all but two included special
needs students. Many classes included minority students.
Table 1 Classes Visiting the SBCAC in the 2003/04 and 2004/05 Cohorts
Grade   Unit Topic   Number of Students
6       Energy       17
Results
The study reported in this paper was designed to explore the changing nature of teaching and
learning in ubiquitous computing environments in terms of changing representations, conceptualizations,
and uses of knowledge. In the sections which follow, research findings are organized around these research
questions. The results suggest meaningful changes in the nature of teaching and learning in each of these
areas.
Representations
“Representations,” as it is used here, broadly refers to the myriad of ways human beings externally
represent what they know. As McClintock (1999) notes, digital technologies provide easy and flexible
access to multiple ways of representing knowledge and expressing ideas, giving rise to new possibilities for
teaching and learning. By examining what kinds of representations teachers and students in a ubiquitous
computing classroom employ in their normal course of study, we can begin to explore how they make use
of that potential.
Indeed, teachers and students in 2003-2004 SBCAC cohort employed a remarkable variety of
representations to support learning. For example, kindergarten students used digital photography,
tessellation software, a music composition program, and the Logo robotic turtle to explore patterns. Sixth
graders used audio recorders and handheld computers to collect family stories and recipes. They used
Inspiration to create family trees and family crests as well as for brainstorming ideas. One fourth grade
class used time-lapse photography to document carnations’ absorption of water and the BugScope
Electron Microscope at the University of Illinois to view plant samples in an experiment on water quality.
These students created videos, webpages, and PowerPoint presentations to share their findings. Fifth graders
participated in stream quality research, using science probes to collect water temperature and pH values,
handheld computers to record their findings, and videoconferencing to communicate them to state officials
and other students across the state of Ohio. And the list goes on.
It is important to note that teachers were not required to use any technologies. They were
introduced to what was available and encouraged to just use those that met their curricular goals. All
teachers incorporated Internet research into their units, and all used Powerpoint presentations to share their
findings. All teachers incorporated word processing and/or desktop publishing into their lessons. While
these might seem mundane uses of technology, participating classes lacked the resources in their regular
classrooms to incorporate these kinds of representations into whole class activities in a meaningful way. It
should also be noted that even by choosing digital representations that in many ways were closest to
traditional representations, teachers and students experienced new possibilities in terms of access to
information, visual representation, and digital tools.
Participating teachers also encouraged their students to use a variety of digital devices to help
students explore their topics both in and outside the classroom. Many teachers helped their students
communicate with others via email, and most used video teleconferencing to connect their students with
experts on the topics they were studying, as well as with students in Mexico studying similar topics. Most
classes used concept-mapping, graphing, and spreadsheets to organize and explore ideas and data. All but
the kindergarten teachers developed extended projects in which students demonstrated their learning
through technology-based presentations. One class created bound books using desk-top publishing software
which included digital photographs and graphics.
Most importantly, all the teachers in the 2003/2004 school year utilized the available technologies
to support their teaching and learning goals. They used differing technologies to meet differing learning
objectives, often in very creative ways. They also developed ways of assessing technological products and
explored new ways to use technology to enhance assessment of student learning, including electronic
portfolios, electronic journaling, and/or observational software on their handhelds to assess student
learning.
Table 2 below compares technology use in regular classrooms with technology use in the SBCAC.
It should be noted that at least two observations were made of each class in each setting, and that the data
are averaged across all classes in the 2003-2004 cohort. Thus the findings are derived from a sample of classes
that may or may not represent “typical” classes. The findings given here are also averaged across classes
and so obscure substantial differences between classes in both settings. Nonetheless, they do reveal a much
greater use of digital technologies in the SBCAC and a much greater use of print technologies in regular
classrooms. While that is, of course, to be expected, the results suggest that the kinds of representations
teachers and students were employing in their teaching and learning were meaningfully different in the
ubiquitous computing classroom. The findings suggest a much greater reliance on written language for
representing knowledge in the traditional setting and a move towards more visual representations of
knowledge in the SBCAC. It is interesting to note that classes in the SBCAC even made more use of
video and art supplies than classes in their regular setting, where these are presumably equally available.
Table 2
Uses of Technology Observed in Structured Classroom Observations in Regular and SBCAC Classrooms*
Technology Use                      Regular Classroom   SBCAC
computers (C)                       30.4                78.6
Internet (I)                        13.6                54.4
handhelds (H)                       0.0                 2.2
video/film (V)                      0.4                 6.5
audio only (AO)                     0.0                 0.0
overhead (OV)                       2.0                 2.6
Elmo (E)                            0.0                 1.5
screen (SC)                         2.3                 4.8
presentation system (PS)            0.3                 12.7
textbooks (T)                       1.6                 0.0
print materials (P)                 34.6                15.3
manipulatives (M)                   0.3                 0.0
realia (R)                          4.3                 4.4
art supplies (A)                    0.0                 3.3
paper & pencil/pen (PA)             52.8                29.3
boards & chalk/markers (B)          31.3                0.9
microscopes/probes/sensors (S)**    0.0                 0.0
other (O)                           10.2                19.2
* as in many cases multiple technologies were simultaneously employed, numbers do not add up to 100
Indeed, in their post SBCAC interviews and reflections, teachers noted the effects of ubiquitous
access to computing on the kinds of representations of knowledge they used in their classes. For example,
one teacher stated, “The children all had electronic portfolios, and our “daily reflections” were done using
the digital camera and my laptop.” Another commented on ways digital representations enhanced students’
learning about the writing process itself, “Students got a better idea of editing and publishing from being
able to share their work publicly. Students also benefited from 1:1 access to computers in honing their
information searching and evaluation skills. They became more reflective and better writers, perhaps
through group revisions, and got good practice typing.” Most teachers also commented on the ways ready
access to digital technologies allowed them to incorporate visual representations more easily and more
frequently into their lessons. For example, one teacher noted,
“Kids today are more visual learners than ever before. Lessons created by teachers need to
visually rival video games, television, and DVDs. During my experience in the classroom, students had an
opportunity to create digital representations of their knowledge via a website, PowerPoint and digital
movies. Students were very engaged in these projects and took great pride in the quality of their work.”
Indeed, interviews with students demonstrated the ways in which they thought seriously about
knowledge representation, perhaps because they were given more choices. Students were able to discuss
decisions they made with regards to technology choices when representing specific concepts or ideas in
their product creation. For example, in discussing the webquest she had created on “The Wild West” a fifth
grade student explained how she selected font and background colors consistent with Southwestern style
art, while another classmate who created a webquest on hurricanes chose grays and blues for fonts and
backgrounds to represent storms and clouds.
Conceptualizations
“Conceptualizations” as used here refers to the unique ways in which knowledge is organized,
processed and manipulated in individuals’ minds. Although it is, of course, impossible to examine the
inner workings of students’ minds, we are exploring ways in which ready access to digital technologies
affects student conceptualizations by examining a variety of evidence including gains on tests of important
concepts, student work samples, quasi-clinical interviews, and student and teacher interviews. The evidence suggests
that students in the SBCAC learned to use a variety of technologies as thinking and learning tools, and that
such usage supported their conceptual learning.
For example, in the quasi-clinical interviews, the majority of students were able to describe in
great detail the project they were working on including key concepts represented in their work. They also
told us that they thought they learned more in the SBCAC and attributed their enhanced learning to the
“fun” they had using digital technologies. One student told us, “I think you learn more if it’s fun because if
it’s fun it helps you concentrate and listen.” Another said, “You want to have fun and learn at the same
time. If you are bored you don’t learn as much because you don’t want to focus in to it.”
Teachers similarly commented on the (sometimes profound) effect ubiquitous computing had on
student engagement and motivation, noting that these are a necessary first step in higher order learning.
Indeed, all the teachers we interviewed mentioned their surprise at how much ubiquitous access to
computers and handheld devices affected their students’ engagement in learning. For example, one teacher
stated, “From my experience in the SBCAC, I realized the excitement of students when they can see the
quality of the work they are creating.” Another teacher noted that when she gave homework assignments to
be completed on mobile computing devices, all her students got them done, something she had never before
experienced. Similarly, another told us, “The one benefit I’ve noticed is that they do write more with the
Danas. And I believe that much as occurs with reading, the more you write, the better a writer you
become.” Still another teacher noted that the engagement generated around the ubiquitous computing
classroom made it possible for her to change her pedagogical approaches, in particular, to individualize her
teaching, “I was able to work one-on-one with a lot of students because the others were so completely
engaged in their own projects.”
One teacher summed up the changes in motivation and engagement as follows, “Learning was more
efficient, students were busier. There was some fooling around at the beginning, but in general students
were more engaged, more motivated, more on task, freer.”
Teachers also believed that ubiquitous access to digital technologies affected the quality of
students’ work, and attributed at least some of that increased quality to the kinds of supports differing
technologies gave to particular kinds of learning. For example, several teachers spoke about using mobile
devices to support peer editing which they thought “seemed to make individual sharing and peer tutoring
work better.” Teachers additionally noted that ubiquitous computing seemed to be particularly supportive
of project-based and inquiry learning. One told us, “With my students, I’ve noticed they are really much
more inquisitive. The higher achieving kids take learning to the next step, and I see the other kids trying to
do the same.”
Such comments provide further evidence of changing pedagogical possibilities in ubiquitous
computing environments. One teacher’s comments epitomize the ways working in such environments can
expand teachers’ notions of what is possible, “The importance of technology for young children was
reaffirmed and I learned about the capabilities of the children when using technology. I learned to think of
technology as a tool that adds another dimension to learning.”
That students did use the technologies available in the SBCAC to learn is demonstrated by the
gains students made on tests of factual and conceptual understanding (Table 3). The average effect size of
gains across classes from pre to post testing was 1.00, or one full standard deviation, for the 2003/04
students, and 2.44 for those in 2004/05 classes. While it may not seem surprising that students learned
what teachers wanted them to, it is important to document that students did learn the intended content and
not just technological skills, and the gains are impressive. Analyses of selected student work samples
provide further evidence of student learning at high levels (Table 4), and comparisons showed that special
needs students’ work was at levels equal to that of average students.
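(The paper does not state the exact formula behind these effect sizes; a common choice for single-group pre/post designs is the standardized mean gain, the mean pre-to-post gain divided by the pretest standard deviation. The short Python sketch below illustrates that computation under this assumption; the score lists are purely hypothetical.)

import statistics

# Hypothetical sketch: standardized mean gain for a single class.
# Assumes the effect size is the mean gain divided by the pretest SD;
# the paper does not specify the denominator it used.
pre = [4, 5, 3, 6, 5, 4, 7, 5]     # hypothetical pretest scores
post = [7, 8, 6, 9, 7, 6, 9, 8]    # hypothetical posttest scores

gains = [b - a for a, b in zip(pre, post)]
effect_size = statistics.mean(gains) / statistics.stdev(pre)
print(round(effect_size, 2))       # 1.0 would mean a gain of one standard deviation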
The high quality of student work, as previously noted, was remarked upon by all participating
teachers. All teachers who had special needs students in their classes particularly commented on the
“leveling of the playing field” for these students. Traditionally the special needs literature describes the use
of assistive technology tools for supporting meaningful mainstreaming of struggling students or the use of
intervention-based software to facilitate learning. In the SBCAC classroom, however, the students with
special needs and lower abilities were achieving using the same technology tools as their peers rather than
assistive technology. This finding has important implications for teachers and administrators with regard
to student integration and accommodation issues. It clearly deserves further investigation.
Table 3 Average Pre/Post Test Gains in Effect Sizes by Grade Level*

2003/04 classes                       2004/05 classes
Grade Level    Effect Size of Gains   Grade Level    Effect Size of Gains
K              0.52                   grade 1        1.59
grade 3        1.41                   grade 2        1.61
grade 4        2.03                   grade 3        2.29
grade 5        1.06                   grade 5        2.75
grade 6        0.21                   grade 5        3.84
grade 7        1.46                   grade 6        1.30

* test scores were only available for one kindergarten and one fourth grade class in each cohort
Table 4 Average Conceptual Understanding Scores for Ability Groups Across Classes*
Average Score Across
Classes
high ability 10.0
medium ability 9.4
low ability 8.5
special needs 9.3
* overall scores out of a possible range of 3 (low) to 12 (high)
Student work samples were further studied to yield descriptive data regarding student
performance. A number of the artifacts studied required students to utilize technology to organize,
synthesize, or interpret information, describe patterns, create models or simulations, etc., suggesting that
teachers were making use of digital technologies as tools for supporting higher order learning. In most of
the artifacts, there was good evidence that students had developed a deep understanding of key concepts
and ideas related to the content area they were studying, in that they were able to elaborate on specific
concepts and make connections between them. In addition, the majority of the work samples
incorporated details and examples in ways that demonstrated students’ ability to communicate their
learning through supporting details, facts, graphics, and symbolic representations.
Uses
In our work on ubiquitous computing, we use the term “uses” to refer to the activities and
interactions through and around which knowledge is negotiated and constructed. As pedagogical
possibilities change in ubiquitous computing environments, new social organizations evolve around new
approaches to teaching and learning. By examining interactions among teachers and students in a
ubiquitous computing classroom, we can begin to explore how classroom cultures are changing in response
to such possibilities.
Indeed, comparisons of teacher and student activities and the organization of interactions among
students and teachers in their regular classrooms with activities and social organization in the SBCAC
revealed meaningful differences between settings. The most noticeable difference involved student
groupings. In the SBCAC, students spent more than half their time working in small groups, while they
spent less than a third of their time similarly engaged in their regular classrooms. In contrast, students
spent almost 50% of regular class time engaged in whole class activities, while in the SBCAC, they spent
less than one third of their time working as a whole class.
Accordingly, teachers spent over two thirds of their time at the front of their classes in their
regular classrooms, whereas in the SBCAC they alternated their time between teaching from the front of
the room, orchestrating presentations from the teacher station and moving among students. These data
together with findings comparing teacher and student activities in the two settings suggest a tendency for
teachers to become more “facilitators of learning” in the ubiquitous computing classroom than “dispensers
of knowledge.” For example, teachers were much more likely to spend time lecturing, and asking and
answering questions in their regular classrooms than in the ubiquitous computing classroom. They also
spent a good deal more time on classroom management. On the other hand, in the SBCAC teachers were
much more likely to spend their time giving directions and demonstrations, supervising activities and
talking with their students than in their regular classroom.
Similarly, whereas students spent much more time in their regular classrooms answering and
asking questions than in the SBCAC, in the SBCAC they spent much more time talking and listening to
each other. In the SBCAC students spent fully four times as much time working on construction projects
as they did in their regular classrooms. In their regular classrooms they spent almost twice as much time
engaged in seat work.
Interviews with teachers support these observations. All teachers, for example, remarked on how
much easier it was to manage their classes in the SBCAC. One teacher who was nervous about bringing
her class to the SBCAC, remarked, “I learned classroom management is easier in technology-based
teaching, not harder.” Indeed, most teachers said they were surprised at the way they could work with
individual students without worrying about what the rest of the class was doing. One teacher, for example,
noted, “It’s much more student-centered there; the technology keeps them engaged so I can go around and
do one-on-one.” Other teachers pointed out that because management issues were reduced, they could give
their students more independence. One told us, “I tried to give the students more choices about projects
because of the different ideas I saw in the classroom.”
Teachers also commented on the way classroom dynamics changed in the ubiquitous computing
classroom. For example, one teacher reported,
“Students interacted more and more freely. Bullying stopped and the class achieved a sense of
itself, sooner she thought, than they would have in their regular classroom. At the beginning of the year, I
gave students cards on which they told who they would like to sit near. I just redid them and found that
they had changed dramatically. The SBCAC experience in some sense forced kids to interact with each
other.”
All teachers particularly noted that students who had been marginalized in their regular classrooms
took on new and more central roles in the SBCAC. In particular, all teachers remarked on the way
ubiquitous computing seemed to “level the playing field” for students of varying abilities. For example,
one teacher stated,
“In particular, the special education students bloomed. They could go at their own pace and
technology seemed to emphasize their strengths as opposed to their weaknesses. It had a leveling effect.
One special education student’s autobiography was one of the best in class, much better than his earlier
work.”
All teachers also found the SBCAC environment to be more supportive of collaborative
learning than their regular classrooms. For example, one teacher noted, “It also seemed to make individual
sharing and peer tutoring work better.” Several teachers used handheld computers to support peer editing
and found that students were much more enthusiastic about the process. One teacher’s comments illustrate
their observations,
“The biggest change has been in their weekly journals. We have been journaling all year and they
have always written them but in using the Danas, peer editing takes on so much more meaning when they
can beam to someone rather than trading papers. With the Danas they are editing their own writing more
and it keeps getting better.”
Another teacher noted that being able to share work on computer screens and over the presentation
systems gave students increased pride in their work, “The SBC experience also taught me the value of
sharing student work. Giving a grade for a project is not enough, students need peer affirmation of
performance.”
In their quasi-clinical interviews, many students mentioned another way ubiquitous computing
was changing the ways they created and used knowledge. They told us that mobile computing was
allowing them to take ubiquitous computing beyond the classroom. Students reported using handheld
computers not only in the SBCAC, but on field trips, back in their regular classroom, at home, after school,
and on the bus. Many of the students reported that they found them to be most useful for organizational
types of school-related activities. Most said that they preferred using mobile computers over writing things
by hand and that using them for writing assignments made the work “easier” and/or “more fun”.
Conclusions
In a recent article, Andy Zucker (2004) argued for the importance of rigorous research on
ubiquitous computing, which he divided into three large categories – research on critical
features of ubiquitous computing initiatives, research on their ultimate outcomes (the purpose behind such
initiatives), and research on intermediate outcomes, that is, on teaching and learning in ubiquitous computing
environments. The research presented here falls into this last category. It documents changes in teaching
and learning in a ubiquitous computing classroom and suggests that these changes are related to the
supports such environments provide for new representations, conceptualizations, and uses of knowledge.
It thus adds to our theoretical understanding of teaching and learning in
ubiquitous computing environments. Specifically, the classes we observed used multiple representations of
similar concepts and visual as well as textual representations to explore topics. Teachers in these classes
allowed students representational choice and encouraged student constructions and sharing in a variety of
representations including presentation to audiences beyond the classroom. We also observed changes in
interactions among students and between teachers and students that seemed to support the social
construction of knowledge. Finally, we found significant gains in conceptual understanding among
students. Of particular interest in this regard were findings which suggest learning in ubiquitous computing
environments may help close gaps in academic gains between special needs and regular students. These
results clearly deserve further investigation.
References
Apple Computer (1995). Changing the conversation about teaching, learning and technology: a report on
10 years of ACOT research. Retrieved April 8, 2005 from
http://images.apple.com/education/k12/leadership/acot/pdf/10yr.pdf
Bartels, F. & Bartels, L. (2002). Reflections on the RCDS laptop program after three years. Retrieved April
8, 2005 from http://www.learningwithlaptops.org/files/3rd%20Year%20Laptop%20Prog.pdf
Danesh, A., Inkpen, K., Lau, F., Shu, K., & Booth, K. (2001). GeneyTM: Designing a collaborative activity
for the PalmTM handheld computer. Proceedings of CHI, Conference on Human Factors in
Computing Systems. Seattle: WA.
Dede, C. (2000). The Role of Emerging Technologies for Knowledge Mobilization, Dissemination and Use
in Education. Fairfax, VA: Office of Educational Research and Improvement, US Department of
Education. Retrieved April 8, 2005 from http://www.virtual.gmu.edu/EDIT895/knowlmob.html
Hennessy, S. (2000). Graphing investigations using portable (palmtop) technology. Journal of Computer
Assisted Learning 16, 243-258.
Hill, J. R., Reeves, T. C. & Heidemeier, H. (2000). Ubiquitous computing for teaching, learning and
communicating: trends, issues and recommendations. Retrieved April 8, 2005 from
http://lpsl.coe.uga.edu/Projects/AAlaptop/pdf/UbiquitousComputing.pdf
Hill, J. R., Reeves, T. C., Grant, M. M., Wang, S-K. & Han, S. (2002). The impact of portable technologies
on teaching and learning: year three report. Retrieved April 8, 2005 from
http://lpsl.coe.uga.edu/Projects/aalaptop/pdf/aa3rd/Year3ReportFinalVersion.pdf
Honey, M. & Henriquez, A. (2000). More things that do make a difference for youth. Union City School
District, NJ. Retrieved April 8, 2005 from http://www.aypf.org/compendium/C2s18.pdf
Inkpen, K. (2001). Designing handheld technologies for kids. Personal Technologies Journal 3, 81-89.
Janesick, V. J. (1994). The dance of qualitative research design: Metaphor, methodology, and Meaning.
In N. K. Denzin, & Y.S. Lincoln (Eds.), Handbook of qualitative research. pp.209-219.
Thousand Oaks, CA: Sage.
Johnstone, B. (2003). Never Mind the Laptops. New York: iUniverse, Inc.
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.
Mandryk, R. L., Inkpen, K. M., Bilezkjian, M., Klemmer, S. R., & Landay, J. A. (2001). Supporting
children’s collaboration across handheld computers. Proceedings of CHI, Conference on Human
Factors in Computing Systems. Seattle: WA.
McClintock, R. (1999). The Educator’s Manifesto: Renewing the Progressive Bond with Posterity through
the Social Construction of Digital Learning Communities. New York: Institute For Learning
Technologies, Teachers College, Columbia University. Retrieved March 21, 2005, from
http://www.ilt.columbia.edu/publications/manifesto/contents.html
Newmann, F.M., & Wehlage, G.G. (1995). Successful School Restructuring: A Report to the Public and
Educators. Madison, WI: Document Service, Wisconsin Center for Educational Research.
Norris, C. & Soloway, E. (2004). Envisioning the handheld centric classroom. Journal of Educational
Computing Research, 30 (4), 281-294.
Papert, S. (1980). Mindstorms: Children, Computers and Powerful Ideas. New York: Basic Books.
Ricci, C. M. (1999). Program evaluation: New York City Board of Education Community School District
Six laptop project. Montreal: Paper presented at the Annual Meeting of the American Educational
Research Association.
Robertson, S. I., Calder, J., Fung, P., Jones, A., O'Shea, T., & Lambrechts, G. (1996). Pupils, teachers, and
palmtop computers. Journal of Computer Assisted Learning 12, 194-204.
Rockman, S. (2003). Learning from Laptops. Threshold, 1(1), 24-28.
Roschelle, J. (2003). Unlocking the value of wireless mobile devices. Journal of Computer Assisted
Learning 19, 260-272.
Roschelle, J. & Pea, R. (2002) A walk on the WILD side: how wireless handhelds may change computer-
supported collaborative learning. International Journal of Cognition and Technology, 1 (1), 145-
272.
Roschelle, J., Penuel, W. R., & Abrahamson, L. (2004). The networked classroom. Educational
Leadership, 61 (5), 50-53.
Roth, J. (2002). Patterns of mobile interaction. Personal and Ubiquitous Computing 6, 282-289.
Russell, M., Bebell, D., & Higgins, J. (2004). Laptop learning: a comparison of teaching and learning in
upper elementary classrooms equipped with shared carts of laptops and permanent 1:1 laptops.
Journal of Educational Computing Research, 30 (4), 313-330.
Sharples, M. (2000). The design of personal mobile technologies for lifelong learning. Computers and
Education 34, 177-193.
Siegle, D. & Foster, T. (2000). Effects of laptop computers with multimedia and presentation software on
student achievement. New Orleans, LA: Paper presented at the Annual Meeting of the American
Educational Research Association.
Silvernail, D. L., & Lane, D. M. M. (2004). The Impact of Maine's One-to-One Laptop Program on Middle
School Teachers and Students (Report #1). Gorham, ME: Maine Education Policy Research
Institute, University of Southern Maine Office.
Stevenson, K. R. (1998). Evaluation report: year 2: Schoolbook Laptop Project. Retrieved April 8, 2005
from http://www.beaufort.k12.sc.us/district/ltopeval.html
Swan, K., Kratcoski, A., Diaz, S., van ‘t Hooft, M., & Juliana, M. (2004, April). “Exploring a theoretical
model of student learning in technology rich classrooms.” San Diego, CA: Paper presented at the
Annual Meeting of the American Educational Research Association.
Tatar, D., Roschelle, J., Vahey, P., & Penuel, W. R. (2003). Handhelds go to school: lessons learned. IEEE
Computer, 36(9), 30-37.
Vahey, P. & Crawford, V. (2002). Palm Education Pioneers Program: final evaluation report. Menlo Park,
CA: SRI International. Retrieved April 8, 2005 from
http://www.palmgrants.sri.com/PEP_Final_Report.pdf
van ‘t Hooft, M., Díaz, S. & Swan, K. (2004). Examining the potential of the handheld computers: findings
from the Ohio PEP project. Journal of Educational Computing Research, 30 (4), 295-311.
Weiser, M. (1991). The computer for the 21st century. Scientific American, 265 (3), 94-95, 98-102.
Weiser, M. & Brown, J. S. (1996). The coming age of calm technology. Retrieved March 8, 2004 from
http://www.ubiq.com/hpertext/weiser/acmfurture2endnote.htm.
Zucker, A. (2004). Developing a research agenda for ubiquitous computing in schools. Journal of
Educational Computing Research, 30 (4), 371-386.
Zucker, A. A. & McGhee, R. (2005). A study of one-to-one computer use in mathematics and science
instruction at the secondary level in Henrico County Public Schools. Washington, DC: SRI
International.
Information Management Programs at Anadolu University, Turkey
Deniz Tasci
Cengiz Hakan Aydin
Anadolu University, Turkey
Abstract
This presentation reports the results of a study that examined facilitators’ attitudes toward the
effectiveness of various media used in the Information Management Associate Degree Program of Anadolu
University, Turkey. The study showed that although facilitators indicated that textbooks should still be
used in online courses, they found textbooks to be less efficient than multimedia programs and web
environments. The participant facilitators also found multimedia programs distributed on CDs more
efficient than the web environment.
worked on the preparation of teacher/learner guides developed for the courseware under the Ministry’s
project, and she had also taken part in teacher training activities as an instructor.
The following year, after a new government took over, the Ministry of Education ceased its
activities with the department. As a result, the number of people working in the department decreased; in
practice this took the form of staff leaving their jobs. Some left after seeing that there was no academic
future in the department’s work, and some left because there was no job security. This had actually been the
case in previous years as well, but because staff turnover was high it had not been felt as much. After 1992
no new staff were hired in the department for a long time, so its numbers remained very limited.
In 1993 and 1994, a new project was developed and launched in order to diversify the Open
Education Faculty’s services. As a result, computer labs, each holding 20-30 computers, were established in
various parts of the country. In these labs, distance learners began studying the course materials that the
CBI department developed. A large number of these labs, called Computer Assisted Academic Advisory
Centers, are still operating today.
At the same time, the department continued its other services. Among these were teacher training,
computer-assisted instruction courses for face-to-face learners at other schools, production of various
informational and presentation software, two instructional software products for IBM Turkey, and, last but
not least, efforts to support computer use by disabled learners.
Until 1997, the main functions of the department were developing projects to establish labs in
different cities, carrying out these establishments and providing support activities, and producing advisory
courseware for schools offering distance learning courses. After a bright two-year period, 1992-1997 can be
described as a “non-productive” period for the department.
During this period, the department’s usual services and activities continued. Mehmet Emin Mutlu,
who was older and more experienced than the others, was assigned to lead the department. But since the
department had no formal status on campus, it had neither formal management nor, naturally, a formal
director. Although Mr. Mutlu was drawn into this position by the willingness of his colleagues in the
department, he was seen by others outside the center as the department’s representative.
Another characteristic of the department at this time was that it developed almost all of its projects
itself. In other words, all projects were developed by the department without any request from the university
or faculty management, and those that the university management approved were implemented. In addition,
relations with other faculties, especially the education faculty, were strengthened.
Again during the same period, the department made great efforts to remain up-to-date in both
hardware and software. Mr. Mutlu was experienced in this area, so, in keeping with the department’s
sensitivity to new developments, he was assigned to follow developments in technology. These efforts also
strengthened his position in the department. Following the arrival of new university management in 1999,
the department moved into its new quarters in the Open Education Faculty building. Before long, it began
offering tryout tests via the internet, which attracted an unexpected amount of attention. This attention did
not surprise the staff that much, however, because they had closely observed distance learners’ needs while
developing the Computer Assisted Academic Advisory Services.
In the following year, multimedia CDs were produced for various courses and delivered to the
students along with the textbooks. Research has shown that the academic achievement of those who used
the CDs increased significantly (Mutlu, Ozogut-Erorta, & Yılmaz, 2004).
Before long, the courseware presented within the framework of the Computer Assisted Academic
Advisory Services also began to be offered online.
Today, the Computer Assisted Instruction Department has 40 personnel working with the eight-
person core team. The department has greatly diversified the university’s distance learning activities in six
years. During this time, successful students in education faculty courses were identified and given a chance
to work as interns, some of whom were later employed full time.
On the other hand, the department still does not have a clear status in either the university or the
faculty management. Meanwhile, Mr. Mutlu completed his academic studies and was appointed Vice-Dean
of the Open Education Faculty. Thus the department’s place in the university became clearer, but it still does
not have an organizational diagram that defines its tasks and jobs. Technically, there is no difference in
status between Mr. Mutlu and any member of the core team, and the same holds for newcomers compared
to experienced staff. Still, a hierarchy is sensed because of mutual respect and habit.
Information Management Associate Degree Program
Perhaps the biggest project the department developed was the Information Management
Program, the first “associate degree” program offered online in Turkey (Mutlu & Aydın, 2004). The
advantages of the way the department is organized can be seen more clearly by looking at the structure and
development process of this program.
The idea of developing a program conducted completely online goes back to the first
years of the department. At that time, neither the university nor the country in general had the technical
infrastructure required to implement such a program, so the idea was delayed for years but never shelved.
When tryout exams went online, distance learners could be analyzed more closely, and it was recognized
that more learners had internet access than had been assumed. It was also found that many of those students
were reaching the internet from their workplaces rather than from their homes.
Through formal and informal contacts with those students, it was concluded that an
internet-based program was feasible and that any program to be offered should have work-related content in
order to increase this feasibility. This idea gradually matured in the department and, by the end of 1999, was
structured as an internet-based two-year associate degree program. The program was presented to the
university management after receiving approval from the faculty management. After the faculty’s approval,
Microsoft-Turkey was contacted, and the program development process was started in cooperation with that
institution.
Microsoft-Turkey’s support for the project took the form of providing registered students with the
necessary software and software-related booklets or guides at a discount. This support also solved the
problem of writing textbooks.
In the following weeks, the course content of the two-year program was clarified in the department.
In addition to the courses that are compulsory for any higher education student in Turkey, it was decided to
offer courses such as Operating Systems, Desktop Publishing, Spreadsheets, Organizational Communication,
Group Work, and Multimedia Applications. Course instructors and advisors were determined at the
same time.
Over the following months, instructional materials were developed under an intensive workload,
and in the meantime decisions were made about the operation of the program, including the services
to be provided, the time schedule, instructors, and so on. It was planned that each learner would be given five
assignments during the first year, and these assignments were to be related to a virtual company, formed by
the course developers, and its operations.
The virtual company was named Anadolu Publication, and it had the same operations as any
real company would have. For financial, logistics, human resources, and other related operations, the
department sought help from other departments on campus, but after problems related to the workload of
the people asked for help, it preferred to develop the company virtually on its own.
An important point to note here is that the department used its own workforce and initiative to
form such a virtual company so that the learners could work on projects resembling the real world. Of
course, this increased the staff’s workload as a result.
Another point to note is that during the time in which the idea of the Information Management
Program evolved into a well-structured program, all decisions were made collaboratively. Moreover, acting
as a team prevented competitive behavior in the department, and questions like “who made this decision?”
were regarded as strange.
The most distinctive feature of the Information Management Associate Degree Program is that
all the work is done around a virtual company. This approach gives learners a chance to apply their learning
to a simulated real-life example and provides them with lasting learning experiences. It is well
known that such real-life exercises produce richer learning outcomes. It is also well known that developing
such real-life examples and simulations is hard work. This hard work was accomplished in the Computer
Assisted Instruction Department in a very short time, alongside many other projects. The most important
factor that made this development possible was that the effort required for the project was
shared among the department staff.
Although the program activities are based on a virtual company, the program has many other
sound characteristics. In every course, each first-year student is given five assignments, and these assignments
are evaluated rigorously. At the end, each learner is given appropriate, elaborated feedback. Learners are
also free in terms of time and space, able to reach materials and advisors outside working
hours and to reach classmates anytime and anywhere at their own convenience. This helped the program
overcome the routine problems encountered in traditional learning environments.
Starting from the first year, experts in the department had a chance to communicate with the learners
face-to-face during the exam periods (2) of the Open Education Faculty, and thus to get
feedback from learners about the system. In fact, student assignments, advisor observations, and internet-
based communication have provided the program with important feedback so far. But face-to-face
communication with students during the exams provided valuable feedback that could not have been
obtained in any other way. Efforts along these lines are still continuing.
As a result of two years of observation, it was decided that during the second year student
assignments would be team work rather than individual responsibilities. In this way it was planned to fill a
gap that is characteristic of distance learning. During the second year of the program, learners take
courses such as Organizational Information Management, Planning and Control Tools, Internet Information
Systems, Database Management Systems, Advertising and Marketing Tools, Graphics Applications, and
Office Applications Development, in addition to their compulsory higher education courses. For each of these
courses, learners complete five assignments.
In both first- and second-year courses, learners are provided with video CDs and textbooks as well
as internet-based instructional materials. Textbooks are provided by Microsoft Turkey, but all of the other
materials are produced, reproduced, and distributed by the department. The faculty’s logistics are used for
the distribution process.
Upon completion of the first year’s activities, the design of the second year’s materials and activities
also began. Feedback gathered up to that point was used in designing the new materials. For example,
analysis of this feedback led to the decision that the assignments on the virtual company should be related to
one another in a complementary manner.
Evaluation
The Computer Assisted Instruction Department of Anadolu University is one of the rare learning
organizations anywhere in the world. In fact, the department must be very similar to the organizations
that experts such as Argyris and Senge had in mind while developing learning organization theory. Unlike
efforts to overcome problems and reach a better-functioning structure within existing organizations, the
department therefore provides valuable information about the very possibility of learning organizations.
It should also be stressed that, while meeting none of the requirements of a formal organization, the
department at times experienced changes in staff numbers ranging from 8 to 200, obtained help from other
faculties when necessary to direct 500 people toward the same target, and did all of this without an
organizational diagram or job descriptions.
Also, a limited number of people with ordinary educations and little experience in the area were
able to carry out quality work nationwide in a short time, at a time when there was not enough know-how
in the field. To better understand the importance of the department’s work, one should note that similar
projects around the world are carried out by many experienced and qualified personnel over longer periods
of time. Setting aside the possibility that people with such special characteristics came together by accident,
we can infer that this is a result of the department’s style of organization.
The Computer Assisted Instruction Department, moreover, served the field by producing
knowledge and experience that the country did not previously have, and thus set norms at almost
every phase. In other words, in its 15-year history the department has become an organization that feeds its
outside environment rather than one that is fed by outsiders.
For these reasons, if we evaluate the Computer Assisted Instruction Department as a learning
organization, we can readily say that it is more efficient and successful than a traditional organizational
structure. Especially in the changing and turbulent atmosphere of a newly developing field, perhaps the
department’s most important strength is that it could use the energy its environment produced to develop
itself.
We can now consider how well the structural characteristics of the department match the
characteristics expected of learning organizations as identified by experts such as Argyris and Senge. At this
point, it is necessary to review the five factors identified as characteristics of learning organizations (Chawla
& Renesch, 1995; Davenport & Prusak, 1998): individual competence, mental models, shared vision,
team learning, and systems thinking.
With regard to individual competence in the department, this concept needs to be treated somewhat
differently. Hardly any of the personnel graduated from a well-known school in Turkey, nor can it be said
that they graduated from the top programs at the university. Nor were they especially
bright students who graduated from their schools with honors. Frankly, an institution setting up a
distance learning program today would be unlikely to choose the present Computer Assisted Instruction
Department personnel. But the value of the projects they have developed, the quality of the papers and articles
they have presented in scientific venues, and the knowledge they display in conversation show that each of the
personnel is competent in the use of computers as an instructional tool.
Senge indicates a correlation between individual competence and “learning effort” (Senge, 1992).
In this respect, the department personnel are in a continuous learning process. Everyone working at the
department is naturally motivated to follow developments related to the department’s
operations. During the 15 days of in-service training activities every year, department staff help each
other learn. More surprisingly, almost everyone in the department can begin making use of one
another’s learning within a short time. Thus, in spite of rapid technological change, they have been able to
remain up-to-date as a whole.
According to an account by one of the staff, which many of the others also remembered,
during the first year, while they were attending a meeting, one of them disagreed with another, calling his
opinion “nonsense.” One of the founders of the department disapproved of this behavior and said: “It is
free to speak nonsense here, but it is forbidden to say ‘nonsense’ to anyone else.” Even today we can see
that the department personnel, having grown up in such a culture from the beginning, have preserved their
intellectual flexibility. It can thus be said that the rigid mental patterns and assumptions Senge
talks about (Senge, 1992) do not hold sway in the Computer Assisted Instruction Department. Hence, the
discipline of mental models can be said to exist in the department.
During the process in which the Information Management Associate Degree Program was designed and
implemented, a woman from the department’s core team stayed away from work for a while after
having a baby. When she returned to the department later on, she adapted easily to a project that had
just started. The staff share many other values in the department, and the existence of such values helped
her to participate easily in the Information Management Program.
Whenever a new project is under consideration, the staff take the expectations and needs of those involved
into consideration. In the traditional understanding of organizational structure, the work that the
organization is going to do is determined beforehand. Whether it is a service or a product, the production
characteristics are determined by experts. The tasks and steps to be taken in the process and individual
responsibilities are determined explicitly in advance. Thus, in traditional organizations, everyone
performs tasks and responsibilities determined by other people according to a plan, and they take their
positions in the project according to this plan. In the Computer Assisted Instruction
Department, by contrast, everyone takes a position in coherence with the others, with no other influences.
Everyone tries to fill one another’s gaps and to meet the shared standards for the quality of the product. In
other words, all of the department’s processes are “implicit” processes.
From this perspective, it can be said that the department’s shared vision and its team learning have
something in common. The human brain, too, consists of neurons that have no explicit
plans and that each take positions in relation to the others. The accumulation of implicit relations among
neurons over time constitutes the learning of the brain as a whole. In the same way, this learning is the
department’s learning as a whole, rather than people learning individually within the department.
It can be said that the Computer Assisted Instruction Department is not one that evolved into a
“learning organization” over time; rather, in its formation and functions, it was established as a learning
organization from the beginning. The most important structural feature that makes the
department a learning organization is that it was structured to function through “implicit” relations
among staff rather than explicit plans. This helped the department develop a shared vision in a short time,
learn together, produce individual competences from shared learning and other synergistic effects, change its
environment and collect feedback from that environment, and adapt itself to new developments.
When compared to its equivalents around the world, the Computer Assisted Instruction Department can be
seen as a necessity, at least for some sectors, considering that it was able to carry out production,
to help the field with its know-how, and to develop standards with a limited budget and limited personnel.
References
Chawla, S. & Renesch, J. (Eds.) (1995). Learning organizations: Developing cultures for tomorrow’s
workplace. Portland: Productivity Press.
Davenport, T.H. & Prusak, L. (1998). Working knowledge: How organizations manage what they know.
Boston: Harvard Business School Press.
Mutlu, M. E., Ozogut-Erorta, O., & Yılmaz, U. (2004, October). Efficiency of e-learning in open education.
Paper presented at the First International Conference on Innovations in Learning for the Future: e-
Learning. Istanbul, Turkey.
Mutlu, M.E. & Aydın, C.H. (2004). The Information Management Associate Degree Program: The First
Internet-Based Distance Education Experience in Turkey, Proceedings of the 11th Distance
Education Conference, Houston. Retrieved from http://www.cdlr.tamu.edu/dec_2004/
Senge, P. (1992). The Fifth Discipline. Istanbul: Yapı Kredi Publications.
Footnotes
1. This particular endeavor took place in my office, and I witnessed most of the history recounted in this
section first-hand. I also recently interviewed the people involved in order to prepare this paper. As a further
resource on the history summarized in this section, the following address can be visited:
http://www.bdeb.aof.edu.tr/bde/index.html
2. For diplomas to be valid in Turkey, exams must be given face-to-face. For this reason, the Open
Education Faculty holds three exams each academic year throughout Turkey.
Effects of Computer Integration Training and Computer Literacy
Training on Preservice Teachers' Confidence and Knowledge Related to
Technology Use
Jeremy I. Tutty
Boise State University
James D. Klein
Howard Sullivan
Arizona State University
Abstract
The purpose of this study was to examine the relationship between computer technology training
experiences and preservice teachers’ confidence and knowledge. Participants enrolled in either a basic
computer literacy course or a computer integration course completed a survey developed to measure their
confidence and knowledge of computer skills and integration. Findings revealed that the preservice
teachers who completed courses on both computer literacy and computer integration had more confidence
for computer skills and integration than when they completed only one of the two courses. Results also
indicated that participants who completed the computer literacy course by itself or in combination with the
integration course had significantly more knowledge of productivity tools and basic operations than those
who did not complete the literacy course. Implications for training preservice teachers on how to integrate
technology are provided.
Introduction
Background
Preparing preservice teachers to use technology in the classroom is a daunting task facing teacher
training institutions. Preservice teachers should not only learn to operate various technologies, but also
how to use them effectively in the classroom (Brush, 1998; U.S. DOE, 2001). Research on technology
integration training indicates that preservice teachers are not being adequately prepared in educational
technologies. A decade ago, a report by the Office of Technology Assessment (OTA) showed that only
three percent of graduating teachers felt “very well prepared” to use technology in their teaching (U.S.
Congress, 1995). In 2000, the situation was unchanged. The U.S. Department of Education reported that
new teachers were still not being adequately trained to use technology (U.S. DOE, 2000).
Since these reports were released, the International Society for Technology in Education (ISTE)
(2002) developed the National Educational Technology Standards for Teachers (NETS-T). These
standards provide the fundamental concepts, knowledge, skills, and attitudes for applying technology in
educational settings.
In addition to these standards, many colleges and universities have revised how they train
preservice teachers to use technology. A survey by five of the six Regional Technology in Education
Consortia revealed that introductory courses in technology are relatively common, and that student skills
tended to mirror their exposure during training (ITRC, 1998). There has been some debate over whether
requiring an introductory technology course in preservice teacher preparation is the appropriate means to
achieve the desired technology competencies (Fox, Thompson, & Chan, 1996; ISTE, 1999; Leh, 1998;
Wetzel, 1993; Wenglinsky, 1998; Willis & Sujo de Montes, 2002). The opponents of a required course
argue that a single course characterizes computers as a non-integral part of instruction, and that technology
should be integrated across all teacher education courses (Fox, Thompson, & Chan, 1996). A study
conducted by the ISTE questions the effectiveness of these courses: “We assumed formal course work
would lead to the ability to integrate technology into instruction; this is not the case” (ISTE, 1999, p. 20).
Proponents argue that preservice teachers do not learn basic technology literacy skills in their
teacher education programs without a dedicated literacy course (Dugger, 2001; Leh, 1998; Simonson &
Thompson, 1997; Wright & Shade, 1994). Others point to the positive impact of basic computer literacy
on the attitudes and self-efficacy of preservice teachers toward technology (Savenye, 1993). Willis & Sujo
de Montes (2002) suggest that: “One answer may lie in implementing a…skills course in addition to (an
integration course). In this way, the first course would focus on technology skills, while the second course
would focus on technology integration into the curriculum” (p. 80).
The purpose of the current study was to investigate the effects of two such courses (Computer
Literacy and Computers in Education) on preservice teachers’ self-reported computer integration
confidence and knowledge.
Confidence
Confidence is concerned with the judgment of what one can do with whatever skills one
possesses. If teachers are to integrate technology into their teaching, they must feel confident in using it
(Ertmer, 1994; Wetzel, 1993). Positive attitudes toward technology are common among preservice teachers
(Karsten & Roth, 1998; Knezek & Christensen, 1998; Ropp, 1999; Selwyn, 1997; Wenglinsky, 1998), but
few teachers consider themselves ready to teach with technology. Several studies have found positive
effects for instruction on computer confidence (Ropp, 1999; Yildirim, 2000).
Knowledge
In addition to confidence, teachers must possess basic knowledge and skills required to operate
and integrate technology (Brush, 1998; Leh, 1998; U.S. DOE, 2001). Both the National Council for
Accreditation of Teacher Education (NCATE) and ISTE specify that, “Teachers should be able to
demonstrate a sound understanding of technology operations and concepts” (ISTE, 2002 p. 9). According
to Trotter (1999) many classroom teachers still do not have adequate technology skills. The literature is
replete with examinations of attitudes toward computers, computer anxiety, computer self-efficacy,
computer coping strategies, and required competencies (Delcort & Kinzie, 1993; Flowers & Algozzine,
2000; Hudiburg & Necessary, 1996; Karsten & Roth, 1998; Savenye, 1993). However, research that
assesses computer integration knowledge is sparse. Fields, Millard-Mann, & Waryanka (2000) indicate
that the need for quality assessment tools that measure the technology proficiency of teachers is great:
“Preparing technology-savvy, academically sound teachers requires proper assessment to ensure that
technology will be effectively integrated into the schools” (p. 7).
Method
Participants
The participants for this study were 180 preservice teachers enrolled in either a computer
integration or computer literacy course at a large university in the southwestern United States. The
computer integration course was required for students enrolled in one of nine initial teacher certification
programs. The computer literacy course satisfied a general studies requirement and was recommended for
students in the teacher certification program. The participants were predominantly Caucasian female
(85%) preservice teachers from all major content areas. The average reported computer use of the
participants was 7-10 hours per week.
Three groups of preservice teachers were used for comparisons. The groups were determined based upon
the combination of their technology training experience: computer integration course, computer literacy
course, or the computer integration course plus the computer literacy course.
Course Descriptions
Two different courses, Computer Literacy and Computers in Education, were the focus of this
study to examine the relationship of differing computer technology training experience and preservice
teachers’ computer integration confidence and knowledge. Computer Literacy introduces basic technology
skills in word processing, spreadsheets and web development. Assignments are related to the basic
function of each software package, productivity and data analysis. Computers in Education introduces
technology integration. Assignments include evaluation of educational software, lesson plans and the
development of a technology-integrated lesson plan.
Both courses are designed for learner-centered classrooms and are taught in a similar manner.
Instruction in both courses features illustrated lectures, in-class discussions, on-line research and
discussion, demonstrations, hands-on lab activities, and active student participation. Both are offered
through an Educational Technology program housed in the College of Education.
Computer Literacy. Computer Literacy is a general studies course in basic computing skills. In any
given semester, approximately 40%-60% of the course enrollment is preservice teachers. It is
recommended, but not required, that preservice teachers complete Computer Literacy before enrolling in
Computers in Education. Computer Literacy has two areas of concentration: general computing knowledge
and computer productivity applications. Students receive instruction in basic computer operation and in
word processing, spreadsheets and web development.
Class projects and activities are designed to develop skills in real world problem solving and data
analysis. The class also uses BlackBoard course management software to facilitate learner-centered
research groups. Students participate in an on-line research group on one of the following topics: computer
security, copyright for students, adaptive/assistive technology, or emerging technologies.
It is intended that students completing Computer Literacy will: approach computer-based tasks
with greater confidence, demonstrate sound file management, and select and use appropriate software to
find or present solutions.
Figure 1. PTCIS Confidence Items
Item
Skills Topics:
Productivity Tools
• Performing a cut or copy and paste between documents
• Attaching files to email
• Developing a presentation with graphics and sound
• Sorting data in a database
• Using functions in a spreadsheet to perform calculations
Basic Operations
• Saving and retrieving files from a folder
• Accessing information on a CD-ROM, diskette or hard drive
• Accessing user settings: i.e. desktop wallpaper, screensaver, sounds
• Connecting peripheral devices: i.e. printer, pda, portable audio device
• Performing disk maintenance: i.e. disk defragmenter
Integration Topics:
Tool Application
• Communicating with peers via multiple electronic means i.e. email, discussion board/forum…
• Designing technology-enhanced lessons
• Evaluating instructional units that integrate technology
• Aligning objectives to national technology and content standards
• Discussing issues related to equitable access to technology in school
Professional Practice
• Using the Internet for lesson plan ideas
• Delivering a lesson with presentation software: i.e. PowerPoint
• Using a database in a discovery lesson for students
• Creating digital concept maps
• Writing a WebQuest
The knowledge subscale consisted of 20 multiple-choice questions distributed evenly between the two topic
categories of skills and integration. Items from each topic category were distributed randomly on the
survey. The reliability coefficient for the 20 knowledge items was .78.
Content of the knowledge items was aligned to ISTE performance indicators and to factors
identified by the National Survey on Information Technology in Teacher Education (Moursund &
Bielefeldt, 1999). The complete instrument is available on request from the first author.
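(The reliability coefficient of .78 is reported without naming the estimate used. If it is an internal-consistency estimate such as Cronbach’s alpha, which is common for instruments of this kind, it could be computed from the scored item matrix roughly as in the following Python sketch; the function and the tiny data set are ours and purely illustrative.)

import statistics

def cronbach_alpha(item_scores):
    """Internal-consistency estimate; rows are respondents, columns are item scores."""
    k = len(item_scores[0])                          # number of items
    columns = list(zip(*item_scores))                # per-item score columns
    item_var = sum(statistics.variance(c) for c in columns)
    total_var = statistics.variance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical example: five respondents answering four dichotomously scored items.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 1, 0, 0],
]
print(round(cronbach_alpha(responses), 2))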
Procedures
Three subgroups of 60 participants each were selected among preservice teachers enrolled in
Computer Literacy and Computers in Education: computer integration only (n=60); computer literacy only
(n=60); and computer integration plus computer literacy (n=60). The groups were selected to represent the
three different types of technology training experiences of the participants.
Members of the computer integration only group were participants enrolled in the Computers in
Education course who had not previously completed the Computer Literacy course and were not currently
enrolled in it. Thus, the training experience of this group consisted solely of computer integration training.
Members of the computer literacy only group were participants enrolled in the Computer Literacy course
who had not previously completed Computers in Education and were not concurrently enrolled in it. Thus, the
training experience of this group consisted solely of computer skills training. Members of the computer
integration + computer literacy group were participants enrolled in the Computers in Education course who
had previously completed Computer Literacy. Thus, the training experience of this group consisted of both
computer integration and computer skills training.
The researcher contacted each course instructor via email and personally arranged to deliver and
collect the CIBSI from each instructor. Each instructor received a packet containing directions for
administering the instrument and sufficient copies for the instructor’s students. Course instructors
administered the instrument to all students in their classes near the end of the Fall 2004 semester.
Data Analysis
Mean scores were calculated for each topic category for the three respondent groups. Separate
one-way multivariate analyses of variance (MANOVAs) were conducted to analyze the survey scores of the
three groups for significant differences in confidence and in knowledge. Analyses of variance (ANOVAs)
on the two topic categories (skills and integration) were conducted as follow-up tests to each MANOVA.
The univariate ANOVAs were followed by post hoc analyses consisting of Dunnett C pair-wise
comparisons to identify significant differences between pairs of topic category scores. Alpha was set at .05
for all significance tests. A Pearson product-moment correlation coefficient was also computed between
the overall confidence and knowledge scores.
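(As an illustration of this analysis sequence only, and not the authors’ actual code or data, a comparable pipeline could be run in Python with statsmodels and scipy as sketched below. The group labels, score columns, and generated numbers are hypothetical, and the Dunnett C pairwise comparisons reported here would require a separate procedure.)

import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
group = np.repeat(["integration_only", "literacy_only", "both"], 60)

# Placeholder confidence scores drawn around invented group means.
skills = np.concatenate([rng.normal(m, 0.5, 60) for m in (3.9, 4.0, 4.2)])
integration = np.concatenate([rng.normal(m, 0.5, 60) for m in (3.7, 3.4, 3.9)])
df = pd.DataFrame({"group": group, "skills": skills, "integration": integration})

# One-way MANOVA with the two topic-category scores as dependent variables.
print(MANOVA.from_formula("skills + integration ~ group", data=df).mv_test())

# Follow-up univariate ANOVAs, one per topic category.
for dv in ("skills", "integration"):
    by_group = [scores[dv].values for _, scores in df.groupby("group")]
    print(dv, stats.f_oneway(*by_group))

# Pearson correlation between overall confidence and (placeholder) knowledge scores.
overall_confidence = df[["skills", "integration"]].mean(axis=1)
knowledge = rng.normal(14, 2, len(df))
print(stats.pearsonr(overall_confidence, knowledge))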
Results
Confidence
Table 1 shows the mean confidence scores by topic category and respondent group. The overall
mean confidence score for all groups and topic areas was 3.84 (5 = very confident to 1 = not confident at
all). Overall mean confidence scores by respondent group were 4.06 for the integration + literacy group,
3.78 for the integration only group, and 3.68 for the literacy only group. Participants had higher overall
confidence for computer skills (M = 4.02) than for computer integration (M = 3.66).
A one-way multivariate analysis of variance (MANOVA) conducted on the data in Table 1 yielded
a significant main effect for the three technology training groups, Wilks’s Λ = .87, F(4,352) = 6.56, p <.05.
ANOVAs conducted as follow-up tests yielded a significant effect for computer skills, F(2,177) = 3.50, p
<.05, and for computer integration, F(2,177) = 6.69, p <.05.
Post hoc analyses consisting of Dunnett C pair-wise comparisons yielded three significant
differences between groups. For the category of computer skills, the integration + literacy group reported
significantly higher confidence (M = 4.22) than the integration only group (M = 3.87). For the category of
computer integration, the integration + literacy group reported significantly higher confidence (M = 3.91)
than the literacy only group (M = 3.40). Furthermore, the integration only group reported significantly
more confidence (M = 3.69) than the literacy only group (M = 3.40) for items in the computer integration
category.
Knowledge
Table 2 shows the mean knowledge scores by topic category and respondent group. The overall
mean knowledge score for all groups and topic categories was 13.90 (70%) out of 20 items. Mean scores
on the overall test were 14.70 (74%) for the literacy only group, 14.65 (73%) for the integration + literacy
group, and 12.34 (62%) for the integration only group. Participants received higher overall knowledge
scores in the skills topic category (M = 7.94), and lower overall knowledge scores in the integration topic
category (M = 5.96). The integration only group scored lowest in both topic categories.
Table 2. Mean Knowledge Scores by Topic Category and Respondent Group*

Topic Category     Integration Only     Literacy Only     Integration + Literacy     Overall Topic Mean
Discussion
The purpose of this study was to examine the relationship between computer technology training
experiences and preservice teachers’ confidence and knowledge.
Confidence
Findings reveal that when preservice teachers complete courses on both computer literacy and
computer integration, they have more confidence for computer skills and integration than when they
complete only one of the two courses.
The higher confidence of students who took both courses indicates the importance of including the
combination of training experiences in preparing preservice teachers. Students in this study who received
both integration and skills training typically completed the computer literacy course before enrolling in the
integration course. This combination may have led to higher confidence scores because training on how to
integrate technology provided students with additional opportunities to use the computer skills they
acquired in the literacy course in an applied integration context. This opportunity was not available to the
students who received only one of the two training experiences. Several other studies suggest the
importance of providing applied practice in computer integration (Fox, Thompson, & Chan, 1996; Wetzel,
1993; Wenglinsky, 1998).
As expected, the findings of the current study also revealed that students who received only
integration training had significantly more confidence for computer integration than students who received
only literacy training. But contrary to expectations, students who received only literacy training did not
have significantly more confidence for computer skills than those who received only integration training.
This latter finding does not support other research suggesting that when preservice teachers receive
computer literacy training, it leads to greater confidence in regard to skills items (ITRC, 1998; Karsten &
Roth, 1998).
A plausible explanation for the results of the current study may be found by examining the items
used to measure confidence. The confidence items were based on the National Education Technology
Standards for Teachers (NETS-T) developed by the International Society for Technology in Education
(ISTE, 2002). The integration items included questions about designing technology-enhanced lessons and
aligning objectives to national technology standards. The preservice teachers who completed the
integration course were expected to master these skills and were given instruction and assessment related to
them. Instruction on these topics should have led to increased confidence for students who completed the
technology integration course.
However, the items used to measure confidence for skills were more general than those used to
measure confidence for integration. The items for skills confidence included questions about basic
operations and tools such as attaching files to email messages and saving and retrieving files from a folder.
It is possible that a difference was not detected between the integration only and literacy only students in
skills confidence because the items referred to technology commonly used by, or at least familiar to, the
typical college student.
Knowledge
Preservice teachers in the current study who completed the computer literacy course by itself or in
combination with the integration course had significantly more knowledge of productivity tools and basic
operations than those who did not complete the literacy course. This result is not surprising since the
computer literacy course provided instruction and assessment on topics such as file management and
software applications.
In addition, the preservice teachers in the literacy only group had significantly more knowledge
for computer integration topics than those in the integration only group. This is contrary to expectations,
given that the literacy only group was not trained in how to integrate computers and that the integration
only group reported higher confidence for integration tasks than the literacy only group.
It is possible that preservice teachers experiencing only literacy training may lack exposure to the
integration vocabulary because their training focuses on general computing knowledge and computer
productivity applications. This lack of exposure may lower their integration confidence, but their knowledge of
technology tools may still enable them to apply that knowledge proficiently to integration tasks (Dugger,
2001; Leh, 1998; Simonson & Thompson, 1997; Wright & Shade, 1994).
Differences between the participants who enrolled in each class may also have contributed to the
results. At the university where this study was conducted, the computer literacy course is an elective while
the integration course is required. Students taking the computer literacy course may have done so because
of an interest or motivation towards technology that students taking only the required course do not
possess.
Conclusion
If teachers are to integrate technology into their teaching, they must feel confident in using it
(Ertmer, 1994; Wetzel, 1993). In addition to confidence in using technology, teachers must possess the
basic skills required to operate and integrate technology (Brush, 1998; Leh, 1998; U.S. DOE, 2001).
The motivation for conducting the current study was to examine the preparation of preservice
teachers at one university with the expectation that the results would inform educational technology faculty
and administrators of the effectiveness of the current methods being used. The results do suggest certain
approaches that may be useful in directing teacher education programs in such a task.
This study seems to corroborate existing research that a single technology class focusing on either
computer literacy or computer integration may not be sufficient to adequately prepare preservice teachers
to integrate technology (ISTE, 1999; Willis & Montes, 2002; Willis & Tucker, 2001). The results for
knowledge seem to support the inclusion of a computer literacy course in the preparation of preservice
teachers. The general pattern of responses indicates that knowledge of tools is necessary to appropriately
select and apply technology in the classroom. Several states have recognized this need and implemented
technology competency standards and assessments for teachers (NASBE, 2003). The results for
confidence indicate that, in order to graduate teachers who are both confident and able to integrate
technology into their teaching, programs should give preservice teachers the opportunity to practice
technology integration in an applied environment.
One alternative to the format examined in this study might be a combination of the literacy and
integration courses. This arrangement would accommodate the teaching of basic skills in the context of
technology integration projects and assignments. A second alternative is that advocated by ISTE and
others of a totally infused teacher preparation program in which technology is integrated into all existing
courses and assignments (ISTE, 1999). This method is certainly more demanding, principally because it
requires more faculty members to be skillful with technology, and was not examined in this study. The
results of this study indicate that even in such an environment, a dedicated computer literacy course may
still be necessary.
This research is limited to the extent that the impact of the training experiences examined in this
study is not known for teachers in their own classrooms. Further research on technology integration
following these preservice teachers into their first year of teaching is needed to determine whether the
preservice educators who completed the computer literacy or computer integration course are actually using
technology in their classroom.
References
Brush, T. A. (1998). Teaching preservice teachers to use technology in the classroom. Journal of
Technology and Teacher Education, 6(4), 243-258.
Delcourt, M. A., & Kinzie, M. B. (1993). Computer technologies in teacher education: measurement of
attitudes and self-efficacy. Journal of Research and Development in Education, 27(1), 35-41.
Dugger, W. E. (2001). Standard for technological literacy. Phi Delta Kappan, 82(7), 513-517.
Ertmer, P. A. (1994). Enhancing self-efficacy for computer technologies through the use of positive
classroom experiences. Educational Technology Research & Development, 42(3), 45-62.
Fields, M., Millard-Mann, A., & Waryanka, D. (2000). A proven method of assessing technology
integration for teachers and students. Retrieved January 12, 2005, from International Conference
on Learning with Technology Web Site: http://www.lcl.org/iclt/2000/papers/151.pdf
Flowers, C. P., & Algozzine, R. F. (2000). Development and validation of scores on the basic technology
competencies for educators inventory. Educational and Psychological Measurement, 60(3), 411-
418.
Fox, L., Thompson, D., & Chan, C. (1996). Computers and curriculum integration in teacher education.
Action in Teacher Education, 17(4), 64-73.
Hudiburg, R. A., & Necessary, J. R. (1996, April). Coping with computer stress. Paper presented at the
meeting of the SIG/Stress in Education Group of the American Educational Research Association.
New York.
Instructional Technology Resource Center (1998, January). Integration of technology in preservice teacher
education programs: the Southeast and islands regional profile. Orlando, FL: College of
Education, University of Central Florida.
International Society for Technology in Education (2002). National educational technology standards for
teachers preparing teachers to use technology (1st ed.). Eugene, OR: International Society for
Technology in Education.
International Society for Technology in Education, & Milken Family Foundation (1999). Will new teachers
be prepared to teach in a computer age? Santa Monica, CA: Milken Exchange on Education
Technology.
Karsten, R., & Roth, R. M. (1998). The relationship of computer experience and computer self-efficacy to
performance in introductory computer literacy courses. Journal of Research on Computing in
Education, 31(1), 15-24.
Knezek, G., & Christensen, R. (1998). Internal consistency reliability for the teachers' attitudes toward
information technology questionnaire. Presented at the Society of Information Technology &
Teacher Education. Retrieved September 1, 2004, from University of North Texas Web Site:
http://www.tcet.unt.edu/research/survey/tatdesc.htm
Leh, A. (1998). Design of a computer literacy course in teacher education. Technology and Teacher
Education Annual, 220-223. Retrieved November 12, 2004, from ERIC database (ED 421111).
Moursund, D., & Bielefeldt, T. (1999). Will new teachers be able to teach in a computer age? Santa
Monica, CA: Milken Exchange on Education Technology.
National Association of State Boards of Education. (2003). Teachers' use of technology. Retrieved March
27, 2005, from Policy Information Clearinghouse Web Site:
http://www.nasbe.org/Educational_Issues/Policy_Updates/11_3.html
Ropp, M. M. (1999). Exploring individual characteristics associated with learning to use computers in
preservice teacher preparation. Journal of Research on Computing in Education, 31(4), 402-424.
Savenye, W. C. (1993, February). Measuring teacher attitudes toward interactive computer technologies.
Paper presented at the meeting of the Association for Educational Communications and
Technology. New Orleans, LA.
Selwyn, N. (1997). Students' attitudes toward computers: validation of a computer attitude scale for 16-19
education. Computers in Education, 28, 35-41.
Simonson, M. R., & Thompson, A. (Eds.). (1997). Educational computing foundations. Columbus, OH:
Merrill/Prentice Hall.
Trotter, A. (1999). Preparing teachers for the computer age. Education Week, 19(4), 37-43.
U.S. Congress Office of Technology Assessment (1995). Teachers and technology: making the connection.
(OTA-HER-616). Washington, DC: U.S. Government Printing Office.
U.S. Department of Education (2000). Progress report on educational technology: state by state profiles
(3671). Washington, DC: U.S. Government Printing Office.
U.S. Department of Education. (2001). Preparing tomorrow's teachers to use technology. Retrieved
October 16, 2004, from http://www.pt3.org
Wenglinsky, H. (1998). Does it compute? The relationship between educational technology and student
achievement in mathematics. Princeton, NJ: Policy Information Center, Educational Testing
Service.
Wetzel, K. (1993). Teacher educator's use of computers in education. Technology and Teacher Education
Annual, 407-410.
Willis, E. M., & Sujo de Montes, L. (2002). Does requiring a technology course in preservice teacher
education affect student teacher's technology use in the classroom? Journal of Computing in
Teacher Education, 18(3), 76-80.
Willis, E. M., & Tucker, G. R. (2001). Using constructionism to teach constructivism: modeling hands-on
technology integration in a preservice teacher technology course. Journal of Computing in
Teacher Education, 17(2), 4-7.
Wright, J. L., & Shade, D. D. (1994). Young children: active learners in a technological age. Washington,
D.C.: NAEYC.
Yildirim, S. (2000). Effects of an educational computing course on preservice and inservice teachers: a
discussion and analysis of attitudes and use. Journal of Research on Computing in Education,
32(4), 479-495.
Developing and Evaluating an Interactive Multimedia Instructional
Tool: How Is the Learning Quality of Optometry Students Impacted?
Ling Wang
Bai-Chuan Jiang
Nova Southeastern University
Background
As multimedia teaching technologies become more widely advocated and employed in higher
education, researchers strive to understand the influence of such technologies on student learning.
Advances in technology enable pedagogical enhancements that some believe can revolutionize traditional
methods of teaching and learning (Gatlin-Watts, Arn, & Kordsmeier, 1999). Studies of multimedia-based
instruction reported a variety of outcomes (Ehrmann, 1995; Feeg, Bashatah, & Langley, 2005; Frey, 1994;
Homer, Susskind, Alpert, Owusu, Schneider, Rappaport, & Rubin, 2000; Mayer, 1997; McKethan &
Everhart, 2001; Moreno & Valdez, 2005; Neuhoff, 2000; Sekuler, 1996; Smith, 1997; Smith & Woody,
2000; Sneddon, Settle, & Triggs, 2001; Welsh, 1993; Welsh & Null, 1991). When reviewed collectively,
these studies reported that advanced technologies, especially multimedia instruction, which often involve
introducing or enhancing the visual aspects of the presentation of course contents, created an active
learning environment, improved students’ performance, fostered positive attitudes toward learning complex
concepts, increased communication, and could be adapted to all learning styles and levels of instruction
(Harris, 2002). Researchers suggest that, compared with classes with a traditional teacher-leading approach,
those using multimedia are better liked by students and yield slight but statistically significant
improvements in student learning as measured by both student self-report and objective outcome testing
(Feeg, et al., 2005; Frey, 1994; Mayer, 1997; McKethan & Everhart, 2001; McNeil & Nelson, 1991;
Moreno & Valdez, 2005; Petty & Rosen, 1990; Sekuler, 1996; Sneddon, et al., 2001; Welsh & Null, 1991;
Worthington, Welsh, Archer, Mindes, & Forsyth, 1996). Such encouraging findings have precipitated the
adoption of these technologies on a widespread basis. Despite many studies suggesting that multimedia
instruction benefits students, there are also some that found no significant differences between multimedia
classes and traditional classes (Homer, Susskind, Alpert, Owusu, Schneider, Rappaport, & Rubin, 2000;
Lee, Gillan, & Harrison, 1996; Stoloff, 1995). Therefore, there is a need to further educators’ understanding
of the effect of multimedia technologies on students’ learning quality.
The effectiveness of multimedia-based instruction was studied in a variety of disciplines, such as
Biology (Sneddon, et al., 2001), Nursing (Feeg, et al., 2005), Pediatrics (Homer, et al., 2000), Physical
Education (McKethan & Everhart, 2001), Science Education (Moreno & Valdez, 2005), Physics (Zacharia
& Anderson, 2003), Psychology (Lee, et al., 1996; Petty & Rosen, 1990; Stoloff, 1995; Worthington, et al.,
1996), etc. Each discipline has its own characteristics that may influence the evaluation of the effectiveness
of multimedia-based instruction. Therefore, findings on the effectiveness of a particular multimedia-based
instruction cannot simply be carried over to a different discipline. In the discipline of Optics, few studies
specifically address the effectiveness of multimedia-based instruction. The only two studies that have
examined multimedia computer-aided programs in the broader field of Physics are not directly relevant to
evaluating the effectiveness of multimedia-based instruction in Optics. One of
these two studies (Tao, 2004) was conducted to see how the use of collaborative learning mediated by
multimedia computer-assisted learning programs helped students improve understanding of image
formation by lenses. It was in the direct field of Optics, but what it examined was not the effectiveness of
the multimedia programs, but the nature of the collaborative learning carried by such programs. Another
study (Zacharia & Anderson, 2003) examined the effectiveness of multimedia-based instruction, an
interactive computer-based simulation, but it was designed to evaluate the effect of such instruction on
students’ understanding of Physics (e.g., the laws of force and motion). Moreover, a review of the literature
shows that there is so far no multimedia-based instruction program that has been produced for performing
laboratory inquiry-based experiments in Optics. Therefore, it is imperative that such a program be
developed and its effectiveness evaluated.
The combined outcomes of the majority of studies across disciplines indicated that multimedia-
based delivery systems offered ways to optimize the advantages and minimize the disadvantages of
traditional methods of teaching and learning. The same is expected to be true in Optics. Optics labs are designed to
help students understand the basic concepts and their applications by setting up experiments, collecting
data, using data in calculations to identify unknown variables, and writing an adequate lab report. Many
factors, such as the limited time for setting up, the unavailability of materials, and the inflexibility of light
sources, limit the function of the labs in the traditional laboratories. The disadvantages elicited by these
factors can be potentially addressed with the use of multimedia-based delivery systems.
Methodology
Research Design
This research study used an experimental design; specifically, a pretest-posttest
control group experiment. The four sections of participants were divided into two groups. For the first
part of the experiment, one group (two lab sections) was used as the experimental group and was given the
treatment, i.e., using the virtual lab tool, and the other group (the other two lab sections) was used as the
control group and did not receive the treatment, i.e., conducting the lab in the traditional laboratory. For the
second part of the experiment, the two groups switched roles: the group that had received the treatment
conducted its lab in the traditional way, and the group that had not received the treatment used the virtual
lab tool.
Participants
Participants were students (112 for Part I of the study and 103 for Part II) enrolled in the first year
optometry program in 2005 at a private university in the southeastern United States. The students were
assigned to four sections (A, B, C, and D) upon registration for Basic Optics by the university
administration. The assignment was only partly random: all students who enrolled in the five-year program
(instead of the regular four-year program) were deliberately placed in Section D for administrative ease.
Apart from this deliberate placement, the remaining students were randomly assigned to the four sections
using a systematic method based on their last names.
reflected rays, refracted rays, etc., just like what the users can see in a traditional lab. The program also
calculates the angles of all the different rays generated, such as the reflection angle, refraction angle, angle
of deviation, and angle of dispersion. All the calculated results are shown in the “Results Table” in the
program as soon as the user makes one trial of the experiment. The first lab contains 3 experiments and the
second lab contains 2 experiments.
Data Collection
In the Optometry program, Basic Optics is a major basic course. All four sections (A, B, C, and
D) of Basic Optics were taught by the same instructor. For the first part of the study, Section A and
Section D were used as the control group and conducted their lab experiments in the traditional laboratory
and in the traditional way. Section B and Section C were used as the experimental group and conducted their
lab experiments in a computer lab using the Virtual Optics Labs tool developed by this study. For the
second part of the study, Section A and Section D became the experimental group and were given the
treatment, i.e., using the virtual lab tool, and Section B and Section C were the control group and did not
receive the treatment, i.e., conducting the lab in the traditional laboratory.
The study was conducted during the fall semester of 2005. Two weeks after the start of the
semester, students in each lab section were given a diagnostic test (i.e., the pretest) to examine their basic
Optics knowledge level before the study. The test consisted of 12 questions with a total score of 10 points.
During the fifth and the seventh week, the two parts of the study were conducted respectively. The
experimental group conducted their lab experiments using the Virtual Optics Labs tool in a computer lab
and the control group conducted their lab experiments in the traditional Optics laboratory. After the
completion of the lab experiments, both the experimental group and the control group received a quiz (i.e.,
the posttest) examining their learning outcomes of the specific Optics labs. The test of the first part of the
study consisted of 6 questions with a total score of 100 points, and the test of the second part of the study
consisted of 12 questions with a total score of 100 points. Also at the end of the lab experiments, the
experimental group completed an open-ended questionnaire (see Appendix) evaluating their perceptions of
the experiences with the use of the Virtual Optics Labs.
Results
Effects of Virtual Labs
Pretest results. Means and standard deviations for each lab section were first calculated for
pretest measures. Analysis of Variance (ANOVA) was then used to determine between-conditions
differences on pretest measures. The results (in Table 1) showed no significant between-conditions
difference, which means the four sections were not significantly different from each other before the
treatment.
In order to see if the experimental group (Section B and C) and the control group (Section A and
D) were significantly different from each other before the treatment, another ANOVA was run to examine
the between-conditions difference of the two groups. Again, the results (Table 2) showed that the two
groups were not significantly different from each other before the treatment.
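As an illustration only (not the authors' code or data), the between-conditions comparisons described above could be computed with scipy as in the sketch below; the score lists are placeholders standing in for each lab section's pretest scores.

    # Minimal sketch of the pretest comparisons, with placeholder scores (0-10 scale).
    from scipy import stats

    section_a = [7.5, 8.0, 6.5, 7.0]
    section_b = [7.0, 8.5, 6.0, 7.5]
    section_c = [7.5, 7.0, 8.0, 6.5]
    section_d = [6.5, 7.5, 8.0, 7.0]

    # One-way ANOVA across the four lab sections.
    f_sections, p_sections = stats.f_oneway(section_a, section_b, section_c, section_d)

    # One-way ANOVA between the two groups: experimental (B + C) vs. control (A + D).
    f_groups, p_groups = stats.f_oneway(section_b + section_c, section_a + section_d)

    # p-values above .05 would indicate no significant pretest differences, as reported.
    print(p_sections, p_groups)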
Table 2. Pretest Measures of the Two Groups
Part I results. For the first part of the study, 112 of the 116 students in the
program participated. Part I of the study focused on the first lab in the Virtual Optics Labs.
Since the ANOVAs of the pretest measures did not indicate pre-existing differences among the lab
sections or between the experimental and control groups, another ANOVA was run on the posttest data to
determine the differences between the experimental group and the control group in their learning outcomes
due to the treatment, i.e., the use of Virtual Optics Labs. The results of the ANOVA on posttest scores
(Table 3) did not show any significant difference between the experimental group and the control group.
Table 3. Posttest Measures of the Experimental Group and the Control Group (Study Part I)
Part II results. For the second part of the study, 103 of the 116 students in the
program participated. Part II of the study focused on the second lab in the Virtual Optics Labs. The
students who had been in the experimental group during Part I now formed the control group, and the
previous control group became the experimental group.
Another ANOVA was run on the posttest data collected after the completion of the second
lab to determine the differences between the experimental group and the control group in their learning
outcomes due to the treatment, i.e., the use of Virtual Optics Labs. The results of the ANOVA on posttest
scores (Table 4) did not show any significant difference between the experimental group and the control
group.
Table 4. Posttest Measures of the Experimental Group and the Control Group (Study Part II)
Questionnaire results. Responses to the open-ended questionnaire completed by the experimental
group were grouped into three themes.
Theme 1: Perceived advantages and benefits of the Virtual Optics Labs
1. The multimedia program is user-friendly, reliable, easy to manipulate, and provides enjoyable
experiences;
2. The labs are time efficient: the users don’t need to set up the equipment and therefore they can
quickly get into the experiments, and have more time to focus on actually doing the lab
experiments. Also, there is no waste of time on easy calculations;
3. With the time saved from handling lab equipment, the instructor can focus on the questions or
illustration of concepts that will make learning more meaningful;
4. Accuracy and precision: the labs allow for fewer human errors, for example, in calculations, or in
setting up the variations for the experiments;
5. The labs allow for the use of a wider range of data, such as more detailed angle variations, or media
that are not normally available in traditional labs; therefore, the virtual labs can produce results
that are not normally achievable in traditional labs;
6. The lab results are instant: the users can try a ray from almost any angle and get instant
results;
7. Better accessibility: users can choose when to do the lab experiments;
8. Multiple trials of the experiments allow the users to perceive trends in data, and therefore, get the
big picture of the theory in application;
9. Users can proceed with the experiments at their own pace; they can repeat the experiments as
many or as few times as they want;
10. The labs stimulate independent work and independent thinking. Independent work may be more
productive for some learners. Unlike team work, in independent work, individuals have the
opportunity to complete every aspect of the task;
11. The labs are cost-effective and can reach more learners: with these virtual labs, the schools that
can’t afford actual Optics labs can still provide learning to their students through the use of this
program;
12. Visual learners can benefit much from such experiences;
Theme 2: Perceived disadvantages and challenges of the Virtual Optics Labs
1. The labs may be overwhelming for the users who are not computer literate and don’t feel
comfortable navigating through the program;
2. The instructor may have to spend much time helping the students with the computer technical
questions;
3. Real-world visualization of the actual Optics lab may be hard because the virtual labs lack the
tangible aspect of a physical lab. This may affect comprehension of the theory because some
learners learn better by doing things hands-on;
4. Users may miss the opportunity to learn about the technical aspects of setting up the equipment;
5. Users may miss the opportunity to work as teams as they do in the traditional labs;
6. The Virtual Optics Labs have technical errors and computer glitches;
Theme 3: Important things to consider for the adoption of the Virtual Optics Labs
1. Make sure the users are computer literate and feel comfortable with the use of the type of
advanced technology involved in the Virtual Optics Labs;
2. Users should become familiar with the lab purpose, theoretical concepts, and procedures ahead of
time;
3. Make sure the directions in the Virtual Optics Labs are very clear;
4. Users should be directed not to rush through the program;
5. Allow students opportunities to see what incorrect results may occur in traditional labs;
6. Complement the virtual labs with the actual traditional hands-on labs;
7. Have better help/support available;
8. Always keep in mind that computer glitches could be present;
9. Make the virtual labs three dimensional;
10. Improve the program and produce better simulations;
11. Find a way to test the users’ understanding of the concepts being learned because commanding the
computer to do the work and then copying down the results generated by the computer may not
guarantee real understanding;
12. Investigate whether the virtual labs are at least as effective as the traditional labs in learning;
Discussion
The statistical analyses of the posttests of Part I and Part II of the study both showed no significant
differences between the experimental group and the control group. These nonsignificant results suggest
that the multimedia-based instruction for Optics labs is as effective as the traditional labs, and that the
multimedia-based instruction can reasonably be adopted by educational institutions to teach the Optics labs.
The specific benefits identified by the users of the multimedia-based instructional tool are of great
interest to the study. The advantages and benefits can be grouped into three categories: usability
(Theme 1, Items 1 and 6), usefulness (Theme 1, Items 2, 4, 7, 9, and 11), and bringing about better learning
quality (Theme 1, Items 3, 5, 8, 10, and 12). Although most of the users agreed on the perceived
advantages in these three categories, some shared different perceptions over certain
aspects. For example, some think that, with the virtual labs, users can get too tied up manipulating
the functions of the multimedia program and fail to relate the tasks performed on computer to the real
world applications; that is, such users may fail to see the big picture of Optics theory in application.
There are also some specific disadvantages or challenges perceived by the users of the program.
The biggest perceived challenge is for users who may not be comfortable with the advanced technology
involved in the program and who may therefore feel lost while using it. Another important challenge
involves the perceived lack of the tangible experience found in traditional Optics labs, which some users
think may affect their understanding of the lab concepts. However, perceptions of this second challenge
were mixed. Contrary to the
response regarding the lack of the tangible aspect of a traditional lab, some students believe that the hands-
on experiences provided by the Virtual Optics Labs are excellent and the experiences are so hands-on that
they helped them understand the labs very well.
The recommendations made by the users can also be categorized into several aspects: prepare the
users (Theme 3, Items 1, 2, and 8), prepare the program (Theme 3, Items 5, 9 and 10), smooth the use
(Theme 3, Items 3, 4, and 7), and evaluate the use (Theme 3, Items 6, 11, and 12). These recommendations
made by the users, as the first tryout group of this multimedia-based instructional tool, should be carefully
considered when educators or school teachers evaluate the program’s potential for adoption by
their educational institutions.
Conclusions
This study developed an interactive multimedia-based Optics instructional software program. The
program appears to be the first multimedia tool developed for conducting Optics labs. In its current version,
the program consists of only two Optics labs; more labs will be added to the program in the future.
The study evaluated the effectiveness of the program. The results show that the program is as
effective as the traditional Optics labs in terms of students’ learning outcomes. The study also
investigated the users’ perceptions of the experiences of using the program. The advantages and challenges
were both discussed. More importantly, the study provided valuable recommendations for educators to
consider if they see the potential of adopting the program in their instruction.
For future research in this field, more Optics labs will be developed and the effectiveness of the
program will be further evaluated using different participants in order to gain insight into the
potential populations for whom the program may be suitable and useful.
References
Ehrmann, S. C. (1995). Using technology in higher-education. In J. L. Morrison (Ed.), On the horizon: The
environmental scanning newsletter for leaders in education, 3(4), 12-13.
Feeg, V. D., Bashatah, A., & Langley, C. (2005). Development and testing of a CD-ROM based tutorial for
nursing students: Getting ready for HIPAA. Journal of Nursing Education, 44(8), 381-386.
Frey, D. K. (1994). Analysis of students’ perceptual styles and their use of multimedia. Perceptual and
Motor Skills, 79, 643–649.
Gatlin-Watts, R., Arn, J., & Kordsmeier, W. (1999). Multimedia as an instructional tool: Perceptions of
college department chairs. Education, 120(1), 190-196.
Harris, C. M. (2002). Is multimedia-based instruction Hawthorne revisited? Is difference the difference?
Education, 122(4), 839-843.
Homer, C., Susskind, O., Alpert, H. R., Owusu, C., Schneider, L., Rappaport, L. A., & Rubin, D. H. (2000).
An evaluation of an innovative multimedia educational software program for asthma management:
Report of a randomized, controlled trial. Pediatrics, 106(1), 210-215.
Lee, A. Y., Gillan, D. J., & Harrison, C. L. (1996). Assessing the effectiveness of a multimedia-based lab
for upper division psychology students. Behavior Research Methods, Instruments & Computers, 28,
295–299.
Mayer, R. E. (1997). Multimedia learning: Are we asking the right questions? Educational Psychologist,
32, 1–19.
McKethan, R., & Everhart, B. (2001). The effects of multimedia software instruction and lecture-based
instruction on learning and teaching cues of manipulative skills on preservice physical education
teachers. Physical Educators, 58(1), 2-13.
McNeil, B. J., & Nelson, K. R. (1991). Meta-analysis of interactive video instruction: A 10-year review of
achievement effects. Journal of Computer-Based Instruction, 18, 1–6.
Moreno, R., & Valdez, A. (2005). Cognitive load and learning effects of having students organize pictures
and words in multimedia environments: The role of student interactivity and feedback. ETR&D,
53(3), 35-45.
Neuhoff, J. (2000). Classroom demonstrations in perception and cognition using presentation software.
Teaching of Psychology, 27, 142-144.
Petty, L. C., & Rosen, E. F. (1990). Increase in mastery levels using a computer-based tutorial/simulation
in experimental psychology. Behavior Research Methods, Instruments & Computers, 22, 216–218.
Sekuler, R. (1996). Teaching sensory processes with multimedia: One of my teaching assistants is a mouse.
Behavior Research Methods, Instruments & Computers, 28, 282–285.
Smith, P. C. (1997). Psychology in the design of multimedia presentations in the classroom: An interview
with Richard S. Velayo. Teaching of Psychology, 24, 136-138.
Smith, S. M., & Woody, P. C. (2000). Interactive effect of multimedia instruction and learning styles.
Teaching of Psychology, 27(3), 220-223.
Sneddon, J., Settle, C., & Triggs, G. (2001). The effects of multimedia delivery and continual assessment
on student academic performance on a level 1 undergraduate plant science module. Journal of
Biological Education 36(1), 6-10.
Stoloff, M. (1995). Teaching physiological psychology in a multimedia classroom. Teaching Psychology,
22, 138-141.
Tao, P. K. (2004). Developing understanding of image formation by lenses through collaborative learning
mediated by multimedia computer-assisted learning programs. International Journal of Science
Education, 26(10), 1171-1197.
Welsh, J. A. (1993). The effectiveness of computerized instruction at the college level: Five suggestions for
successful implementation. Behavior Research Methods, Instruments, & Computers, 25, 220-222.
Welsh, J. A., & Null, C. H. (1991). The effects of computer-based instruction on college students’
comprehension of classic research. Behavior Research Methods, Instruments & Computers, 23, 301–
305.
Worthington, E. L., Jr., Welsh, J. A., Archer, C. R., Mindes, E. J., & Forsyth, D. R. (1996). Computer-
assisted instruction as a supplement to lectures in an introductory psychology class. Teaching of
Psychology, 23, 175–181.
Zacharia, Z., & Anderson, O. R. (2003). The effects of an interactive computer-based simulation prior to
performing a laboratory inquiry-based experiment on students’ conceptual understanding of
physics. American Journal of Physics, 71(6), 618-629.
Intertwining the Fabrics of Learning Styles, Personality Types, Bloom’s
Taxonomy, and Multiple Intelligences
Rosalie Carter Ward
The University of Southern Mississippi
Silverstein wrote a poem, Invisible Boy. “And here we see the invisible boy In his lovely invisible
house, Feeding a piece of invisible cheese To a little invisible mouse. Oh, what a beautiful picture to see!
Will you draw an invisible picture of me?” (1974, p. 82). Just as you can only imagine what is going on in
the invisible house, we can only imagine what is in the minds of people with whom we work. Drucker
believes knowledge and understanding of behaviors of people and institutions are keys that will ultimately
determine society’s ability to perform tasks and produce results (1999). He also believes people are the
world’s number one resource (1999). If these two statements are true, individuals in design and
development need to understand as much as possible about people.
Since the qualities that distinguish individuals from one another are invisible, yet observable
through different behaviors, it is helpful to examine and combine the concepts of ability and intelligence
with three theories/models designed to create and describe pictures of you and me. Each of the three
assumes the existence of intelligence and generally defines intelligence as the ability or abilities allowing
individuals, in varying degrees of strength, skill, and limitation, to solve problems or create products that
are valued by a society or culture and to adapt to changing environments. Intelligence is also the idea of
having the ability to think which, as a mental activity, involves understanding, manipulation, and
communication. Individually, these theories/models have had profound effects on the development of
instruction. Each of them focuses on the process of learning and how “people absorb information, think
about information, and evaluate the results” (Silver, Strong, & Perini, 1997, p. 22). While some individuals
might say these theories/models demonstrate differences, I believe combining them and recognizing their
similarities creates a powerful tool to be used by the instructional designer.
Instructional designers create instructional materials and programs designed to meet the needs of
individuals in different professions. This design process is a rational, logical, and sequential process, which
deals with human experience, skills, and knowledge. It is intended to solve problems, initiate change, or
enhance what is occurring. For instructional designers, design is a discipline concerned with instructional
strategies; developing and implementing strategies; identifying problems and finding solutions; creating
detailed and specific criteria to assess learning; coordinating resources and procedures to facilitate training
and learning experiences; improving performance; and applying strategies and techniques derived from
behavioral, cognitive, and constructivist theories. When combined, Bloom’s taxonomy, Jung’s personality
types, and Gardner’s multiple intelligences, appropriately used, may greatly improve and enhance the
effectiveness and applicability of instruction and training programs. The purpose of this paper is to give a
brief overview of the theories/models, and how they can assist the instructional designer to be more client-
oriented and to generate greater interest in learning about personality types, multiple intelligences, learning
styles, and their usefulness and application to instructional design.
Jung’s “pioneering work on the nature and structure of the human psyche presented the concept of
healthy personalities having four developed functions: two perceptions, sensing and [intuition], and two
judgment functions, thinking and feeling” (Shields, 1993, p. 2).
Figure 1. Description of the four functions. (Silver Strong & Thoughtful Education Press, 2004, p. 1)
The extent to which these four functions are integrated by the individual enables someone to
identify his or her primary functions, a process “known as individuation” (Ormrod, 1999, p. 330), which
determines how a person views and reacts to various situations and circumstances. Jung also identified two
attitudes toward life called introversion and extraversion that also influence personality traits and
behaviors. His theories “emerged from his observations of how people collected information, and how they
made judgments about that same information in terms of personal significance. A central theme in Jung’s
theory is that much apparently random variety in human behavior is due to the preferences for certain
functions over their opposites” (Hanson & Silver, 1996, p. 12). The combination of the functions results in
what we call learning styles.
Figure 2. Combined functions resulting in learning styles. (Silver Strong & Thoughtful Education Press,
2004, p. 7).
A visual summary of the four learning styles, sensing-thinking (mastery), sensing-feeling (interpersonal),
intuitive-thinking (understanding), and intuitive-feeling (self-expressive) is illustrated in Figures 2 and 3.
Figure 3. Description of Combined Functions to Create Learning Styles (Silver Strong & Thoughtful
Education Press, 2004, p. 10).
Bloom’s taxonomy, primarily created for academic education, is relevant to all types of learning
(Chapman, 2005). The taxonomy “is in three parts or overlapping domains: cognitive domain (intellectual
capability); affective domain (feelings, emotions, attitude); psychomotor domain (manual and physical
skills)” (p. 4). Each of the domains is structured and sequenced creating hierarchies of development. The
cognitive domain hierarchy consists of knowledge, comprehension, application, analysis, synthesis, and
evaluation (Krathwohl, Bloom, & Masia, 1964). The affective domain consists of receive, respond, value,
organize or conceptualize values, and internalize or characterize values (1964). The third domain is
psychomotor. Its levels include perception, set, guided response, mechanism, complex overt response,
adaptation, and origination (1964).
Comprehension (cognitive: understands the meaning, translation, and interpretation of instructions and problems): explains, predicts, rewrites, interprets, estimates
Responding (affective: active participation by learners; outcomes may emphasize compliance and willingness to respond): greets, helps, labels, performs, practices, tells, presents, reports
Set (psychomotor: mental, physical, and emotional sets): shows, volunteers
Application (cognitive: applies what is learned in the classroom): applies, changes, constructs, relates, shows, uses, operates
Valuing (affective: value or worth attached to something; internalization of specified values): completes, demonstrates, differentiates, follows, forms, initiates, joins, invites, justifies, selects, chooses, shares, works
Guided Response (psychomotor: early stages in learning a complex skill; includes trial and error and practice): copies, traces, follows, reacts, reproduces
Analysis (cognitive: separates material or concepts into parts; distinguishes between fact and inferences): analyzes, compares, diagrams, illustrates, infers, outlines, separates
Organization (affective: organizes values into priorities by contrasting, resolving conflict, and creating unique value systems): adheres, alters, arranges, defends, orders, formulates, integrates, modifies, prepares, synthesizes, combines, generalizes
Mechanism (psychomotor: intermediate stage in learning a complex skill; learned responses become habitual): assembles, calibrates, fastens, fixes, grinds, heats, manipulates, mixes, mends, sketches
Synthesis (cognitive: builds a structure of diverse parts, creates a new meaning or structure): categorizes, combines, devises, designs, explains, generates, relates, revises, reconstructs
Internalizing Values (affective: value system controls behavior; is pervasive, consistent, predictable, and characteristic of the learner): acts, qualifies, discriminates, displays, solves, influences, modifies, verifies, performs, practices, proposes, revises, serves
Complex Overt Response (psychomotor: skillful performance of motor acts that involve complex movement patterns; verbs are like mechanism but the performance is better): assembles, builds, mixes, mends, plays, fastens, grinds, heats, manipulates
Evaluation (cognitive: makes judgments about the value of ideas or materials): appraises, compares, concludes, contrasts, defends, describes, discriminates, interprets, justifies, summarizes
Adaptation (psychomotor: skills well developed; can modify movement patterns to fit special requirements): adapts, alters, changes, rearranges, revises, varies
Origination (psychomotor: creating new movement patterns to fit the situation; outcomes emphasize creativity and highly developed skills): arranges, builds, combines, composes, creates, designs, initiates, makes, originates
Figure 4. Bloom’s Taxonomy and Its Three Parts: Cognitive, Affective, and Psychomotor. (Clark, 1999, p.
2-6)
Using these different categories, educators and designers can develop clear and specific learning
outcomes resulting in effective and comprehensive curriculums and assessment instruments for individuals
at various levels of development and skills. The descriptor words associated with each of the domains serve
as guides for the instructional designer creating instruction with certain expectations for completion and
performance.
Gardner’s Theory of Multiple Intelligences provides several potential pathways to learning.
Gardner identifies the following pathways as bodily-kinesthetic (physical experiences or body smart),
interpersonal (social experiences or people smart), intrapersonal (self-reflection or self smart), logical-
mathematical (numbers and logic or number reasoning smart), musical (music smart), naturalistic
(experiences in the natural world or nature smart), verbal-linguistic (word smart), and visual-spatial (picture
smart) as well as existentialist (one’s relationship with God depending on one’s philosophy), moral (one’s
relationship with other living things and their well-being; ethics, humanity, value of life); and intelligence
type (capability and perception) (Armstrong, 2000; Chapman, 2005). Gardner indicates the intelligences are
independent because “they develop at different times and to different degrees in different individuals”
(Dickinson, 1998, p. 1). It is important to point out all intelligences are present in all people. Just as
individuals possess varying degrees of Jung’s four basic functions, they also possess varying degrees of the
intelligences. Chapman makes the point “well-balanced organizations and teams are necessarily comprised
of people who possess different mixtures of intelligences. This gives the group a fuller collective capability
than a group of identically able specialists” (p. 9). The Multiple Intelligences Theory is an effort to examine
and understand how cultures and disciplines shape human potential and mold their environment.
Figure 5 provides an overview of Gardner’s original multiple intelligences. Figure 5 indicates
where the strength of each intelligence lies, and how an individual with each preference learns best.
The question for us now is, “How are these theories/models useful to instructional designers?”
Silver and Hanson (1996) developed The Thoughtful Education Model, which combines Jung’s personality
types with teaching and learning styles. It outlines specific teaching, learning, instructional, and assessment
strategies. By using the information provided in Figure 6, you can determine which instructional
strategies and environments should be developed, using a variety of teaching and learning styles, to
accommodate the diversity of end users.
Sensing-Feeling
Teachers may be characterized as: nurturers, supporters, empathizers
Learners may be characterized as: sympathetic, friendly, interpersonally oriented
Curriculum objectives emphasize: positive self-concept, socialization
Learning environments emphasize: personal warmth, interaction and collaboration
Instructional strategies emphasize: personal and social awareness, group projects, personal sharing, oral reports, communications
Teaching strategies include: group investigations, pair-share, classroom meetings, reciprocal learning, peer tutoring, sequencing faces, lab training, Semrad’s Steps, pre-modeling, team games, tournaments
Assessment procedures include: personal journals, sociograms, oral reports, ranking procedures, trained observations, collection of unobtrusive data, self-reporting

Sensing-Thinking
Teachers may be characterized as: trainers, information givers, instructional managers
Learners may be characterized as: realistic, practical, matter of fact
Curriculum objectives emphasize: basic skills, acquisition of content
Learning environments emphasize: purposeful work, organization and competition
Instructional strategies emphasize: behavior modification, practice and drill, convergent thinking tasks, demonstrations
Teaching strategies include: programmed instruction, command style teaching, mastery learning, team games and tournaments, drill and repetition, graduated difficulty, memorization
Assessment procedures include: objective tests, checklists, behavioral objectives, use of mechanical devices, demonstrations of specific skills, criterion-referenced tests

Intuitive-Thinking
Teachers may be characterized as: intellectual challengers, inquirers, theoreticians
Learners may be characterized as: logical, intellectual, independent
Curriculum objectives emphasize: critical thinking, concept development
Learning environments emphasize: discovery, inquiry and independence
Instructional strategies emphasize: information processing, research, inductive reasoning, written reports, problem-solving
Teaching strategies include: inquiry training, concept attainment, concept formation, reading for meaning, use of Socratic methods of questioning, problem-solving, main idea, tangrams
Assessment procedures include: open-ended questions, essays, demonstration of abilities to apply, synthesize, interpret, integrate, analyze, and evaluate, thinking divergently

Intuitive-Feeling
Teachers may be characterized as: facilitators, stimulators, creators/originators
Learners may be characterized as: curious, insightful, imaginative
Curriculum objectives emphasize: creative thinking, moral development
Learning environments emphasize: originality, flexibility and imagination
Instructional strategies emphasize: self-expression, imagination, divergent thinking, creative-artistic expression, values clarification, producing products
Teaching strategies include: inductive learning, synectics, information search, boundary-breaking (breaking mind sets), analyzing and working with moral dilemmas, creative problem solving, comprehensive planning
Assessment procedures include: fluency of expression, flexibility of response, originality of response, elaboration of detail, development of aesthetic criteria, producing creative products, observations of value systems in action, unobtrusive data collection
Figure 6. Teaching, Learning, Instructional, and Assessment Strategies (Hanson & Silver, 1996, p. 15)
When combining these strategies and assessment procedures with multiple intelligences, their
unification serves as a compass for observing and directing strategies aimed toward the design and
development of instruction and training. The integration minimizes any of the individual models’
limitations and enhances their strengths. Bloom’s taxonomy serves as the structure and quality control
mechanism for directing and determining the depth of the acquisition of content and skills via different
levels of an individual’s cognitive, affective, and psychomotor domains. When we focus on process, the
multiple intelligences theory adds to the mix a focus on the context of learning and its relation to the
individual. The learning styles model’s emphasis on the individual learning process and Gardner’s content-oriented
model are complementary. Without multiple intelligences, learning style is rather abstract. Without the
learning styles, multiple intelligences theory proves unable to describe the different processes of thought
and attitude. By adding Bloom’s taxonomy, we are able to observe the levels of thought, direct
instruction toward the development of higher order thinking, attitudes, and performance, and design
activities to match preferences and intelligences. The outcomes are greater participation and better learning and training
results.
How do we put the models together and utilize them as one framework?
Figure 7. Verbal-Linguistic integrated assessment menu. (Silver, Strong, & Perini, 2000)
Figure 7 indicates four personality types based on a combination of Jung’s four functions: mastery
or sensing-thinking, interpersonal or sensing-feeling, understanding or intuitive-thinking, and self-
expressive or intuitive-feeling (see Figures 2 and 3). For example, using the verbal-linguistic intelligence
menu, each personality type is matched to the kind of career an individual might choose
based on the personality and the intelligence type. Beside each personality type square in Figure 7, a circle
is drawn which contains an example of behavior or product that might be exhibited. Notice the verb
beginning each observed behavior comes from the verbs suggested by Bloom’s taxonomy (Figure 4) for
describing desired behaviors and expectations. In addition, by looking at Figure 6, we can determine the
characteristics of learners, teachers, curriculum objectives, emphasis of learning environments, suggested
instructional, teaching, and assessment strategies and procedures best suited to a personality type, and
desired outcomes. Figure 5 also provides a visual representation and description of eight of the multiple
intelligences indicating strengths, likes, best way to learn, and examples of famous individuals having a
particular type of intelligence. Using the information provided, we can also match the type of intelligence
to a possible personality type by seeing the types of products and activities that are preferred and produced.
Figure 8 offers another example of the type of activities preferred by different intelligences which are
similar to the activities found in Figure 6.
Figure 8 and Figure 9 provide excellent guidance for making decisions concerning the types of
expectations of products and performance desired from instructional and training activities. The designer
can also see where the various preferences overlap in regard to preferred activities and products and choose
the ones that will most meaningfully serve the diversity of individuals being served. This knowledge is
extremely valuable and eliminates much of the guesswork that often accompanies the design and development of
instruction, materials, and activities.
Design models are tools of organization and structure illustrating how the designer goes about his
or her plan for successful training of individuals or creation of products. Instructional designers need to
concentrate on the factors that affect individuals; reflect the way our clients learn; utilize and appreciate
diversity; and combine individual strengths to bring about a stronger, more functional, and dynamic
organization. By integrating these major theories, we can utilize Jung’s personality types, identify learning
behaviors by style, examine curriculum and learning environments, and learn words and products that
indicate the type and level of cognition, attitudes, and the required physical skills.
The implications for the instructional designer are that the instructional design used should
function as a process of thinking that systematically, broadly, and reflectively centers on the end users and
the diversity of their needs. The effectiveness of instructional design corresponds to whether or not the
designer meets and accomplishes the goals of the stakeholders and end users. It seems to be common sense
to validate the individual approaches to learning by being familiar with and knowledgeable of learning
styles, models of intelligence, and personality types.
Equally important for the designer is the need to examine his or her own preferences. Often times,
we may unconsciously develop training and products based on our own preferences. Examining ourselves
to determine our own styles and preferences helps us to realize that we must be more accommodating to
clients by developing diversified training, materials, and products to reach the greatest number of users
possible. We will be more effective, efficient, and successful in reaching the goals and expectations
established. Also by being aware of preferences, through our training and development of materials and
products, we will strengthen and cultivate the development of skills and acquisition of knowledge for our
clients and for ourselves by providing well-designed instructional experiences.
References
Armstrong, T. (2000). Multiple intelligences. Retrieved September 20, 2000, from
http://www.thomasarmstrong.com/multiple_intelligences.htm
Chapman, A. (2005). Bloom’s taxonomy – learning domains. Retrieved July 6, 2005, from
http://www.businessballs.com/bloomstaxonomyoflearningdomains.htm
Clark, D. (2001). Bloom’s taxonomy. Retrieved October 3, 2005, from
http://www.nwlink.com/~donclark/hrd/bloom.html
Dickinson, D. (2000). Learning through many kinds of intelligence. Retrieved October 2, 2000, from
http://www.newhorizons.org
Drucker, P. (1999). Management challenges for the 21st century. New York: HarperBusiness.
Hanson, J. R., & Silver, H.F. (1996). Learning styles and strategies. (3rd ed.). Trenton, NJ: The Thoughtful
Education Press.
Krathwohl, D. R., Bloom, B. S., & Masia, B. B. (1964). Taxonomy of educational objectives: The
classification of educational goals. New York: David McKay Co., Inc.
Patterson, C. (2002, July). Understanding the multiple approach to learning. In Linking Research to
Educational Practice II Symposium. Retrieved October 1, 2005, from
http://www.ucalgary.ca/~distance/cll_institute/papers.html
Shields, C. J. (1993). Learning styles: Where Jung, the Beatles, and schools intersect. Curriculum Review, 33(2), 9-13.
Silver, H., Strong, R., & Perini, J. (1997). Integrating learning styles and multiple intelligences.
Educational Leadership, 55(1), 22-27.
Silver, H., Strong, R., & Perini, J., (2000). So each may learn: Integrating learning styles and multiple
intelligences. Alexandria, Virginia: Association for Supervision and Curriculum Development.
Silverstein, S. (1974). Where the sidewalk ends. New York: HarperCollins Publisher.
Silver Strong & Thoughtful Education Press. (2004). What are the four learning styles? Retrieved October
11, 2005, from http://www.thoughfuled.com/lsis/intro.php
Tablet PC Initiative: Impact on Students and Learning
Douglas C. Williams
Denise Benton
University of Louisiana at Lafayette
Susan Pedersen
Texas A&M University
Introduction
Reformers have urged educators to make schooling relevant to students through emphasis on
learning through experience and making connections to the world outside of the school walls (Dewey,
1938). Popularization of current learning theories, such as constructivism (Duffy & Jonassen, 1992) and
situated cognition (Brown, Collins, & Duguid, 1989), further emphasize the value of providing
opportunities for students to engage in authentic activities (i.e. critical thinking, reflection, problem
solving) in authentic contexts (i.e., real-world situations) using the same types of tools (e.g., computers) that
experts use.
Technology literacy skills have changed over the years. More jobs than ever require the use of
information technology to engage in problem solving and critical thinking. During the 1960s, skills
emphasized grammar, typing, accounting, and shorthand. In the 1980s, computer literacy shifted to
competencies in word processing, databases, and spreadsheets. Today, the workplace requires skills in
online communication/collaboration, digital media creation (i.e. animations, graphics, video, audio), and
simulation tools for the purpose of information analysis, problem solving, and critical thinking
(International Society for Technology in Education [ISTE], 2002).
Emerging technologies, such as Tablet PC computers, provide unique capabilities for enhancing
learning in the secondary classroom. Though little research exists on using Tablet PCs to enhance learning,
many studies on using laptops in K-12 and higher education settings highlight potential benefits.
Lowther, Ross, and Morrison (2003) found that providing fifth, sixth, and seventh grade students 24-hour
access to laptop computers resulted in (a) more frequent and independent use of the computer, (b)
significantly higher scores on writing assessments, and (c) significantly higher scores on a problem-solving
task. An evaluation of the Maine Learning Technology Initiative, which provided 7th and 8th grade
students a laptop, showed that student engagement and attendance have increased and classroom interactions
have become less didactic and more constructivist in nature (Silvernail & Harris, 2003). Muir, Knezek, and
Christensen (2004) found that 8th grade students performed significantly higher in science, math, and
visual/performing arts.
These studies indicate that, when implemented appropriately, portable computing devices have the
potential to enhance learning when used in conjunction with changes in teaching practices. It is clear from
research and best practices that availability of technology is not sufficient to enhance learning. Changes in
the way teachers teach and students learn are essential for improving learning outcomes.
Design of Study
Project Context
The project was carried out in a rural high school with an enrollment of 2,196. Approximately
70% of the students were white, 23% African American, and 4% Latino. All teachers were certified in
their respective teaching areas. The school had won numerous awards for its innovative approaches to
teaching and technology integration. For example, in 2004, the school was selected as a Twenty-First
Century School of Distinction in the “Best of the Best” category. All classrooms are technology rich, with
interactive boards, projectors, big screen monitors, and educational software. In addition to traditional
computer labs, the school has 12 mobile wireless labs.
Sample Selection
A list was generated of 9th grade students who scored in the 76th national percentile on the Criterion
Referenced Competency Test (CRCT). Since the resulting sample was insufficient in size, the selection
window was broadened to include students who scored between 66th and 86th percentiles. Next, the
following students were excluded:
• Those who were being served in the gifted program. (Their schedules would preclude them from
meeting scheduling needs.)
• Those who were taking geometry as their math course
• Those who were scheduled to take band 1st through 4th period. (These are audition only classes
and are only offered once per day. Students would have to drop band to participate.)
The result was a list of 70 students. Next, every 5th student was classified as an alternate resulting in 56
participants and 14 alternates. Students on the participant list were identified as belonging to Group A or
Group B (e.g. The first student was assigned to Group A and the second to Group B and the third to Group
A). Finally, the treatment group was randomly selected by the flip of a coin resulting in Group A as the
experimental group and Group B the control group. If parents did not agree to their child's participation, an
alternate was chosen, in sequential order, from the alternate list.
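A minimal sketch of this selection procedure in Python, assuming a hypothetical list of already-screened student identifiers (the function name, variable names, and optional random seed are illustrative, not part of the study's actual materials), might look as follows:

import random

def assign_groups(eligible_students, seed=None):
    # eligible_students: identifiers already screened by the CRCT percentile
    # window and the exclusion rules listed above.
    rng = random.Random(seed)
    # Every 5th student on the list becomes an alternate; the rest participate.
    alternates = [s for i, s in enumerate(eligible_students, start=1) if i % 5 == 0]
    participants = [s for i, s in enumerate(eligible_students, start=1) if i % 5 != 0]
    # Alternate assignment to Group A and Group B (1st -> A, 2nd -> B, 3rd -> A, ...).
    group_a = participants[0::2]
    group_b = participants[1::2]
    # A coin flip decides which group receives the Tablet PCs.
    treatment = "Group A" if rng.random() < 0.5 else "Group B"
    return group_a, group_b, alternates, treatment

With a list of 70 eligible students, this yields 14 alternates and 56 participants, matching the counts reported above.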
Intervention
In order to explore the impact of Tablet PC technology on learning, students were selected to
participate in the research project and randomly assigned to either the treatment group (students given a Tablet PC) or the control group. As an incentive to participate, a microprocessor vendor agreed to give
the top performing student in the treatment group and the control group a new laptop at the end of the
project. Teachers and students were provided training on the technical operation of the Tablet PCs.
Minimal pedagogical or technology integration training was provided.
Research Questions
1) Do students provided with Tablet PCs demonstrate higher levels of achievement?
2) Do students provided with a Tablet PC have significantly better computer skills, more often use the computer as a tool, and have better attitudes toward computers in education?
3) Does increased access to technology result in higher self-efficacy?
Each student completed the ACT ASSET before the project started and at the conclusion of the
project. Each testing session resulted in three scores (i.e. writing, numerical, reading). For each score (i.e.
writing score, numerical score, reading score) a simple analysis of covariance (ANCOVA) was calculated
with the post ASSET score as the dependent variable, group (Tablet PC or control) as the between-subjects
independent variable, and the pre-ASSET score as the covariate. Independent two sample t-tests were
calculated on final grades in the subjects of algebra, biology, civics, and English.
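A sketch of this analysis in Python, assuming hypothetical file and column names (asset_writing.csv with group, pre_score, and post_score; final_grades.csv with group and algebra), could use statsmodels for the ANCOVA and scipy for the t-tests; the file layout is an assumption, not the study's actual data format:

import pandas as pd
from scipy import stats
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One ASSET subscore (e.g., writing): post score as dependent variable,
# group as the between-subjects factor, and pre score as the covariate.
asset = pd.read_csv("asset_writing.csv")  # columns: group, pre_score, post_score
model = smf.ols("post_score ~ C(group) + pre_score", data=asset).fit()
# Type II sums of squares are equivalent to Type III here, since the model
# has no interaction term.
print(sm.stats.anova_lm(model, typ=2))

# Independent two-sample t-test on final grades in one subject (e.g., algebra).
grades = pd.read_csv("final_grades.csv")  # columns: group, algebra
tablet = grades.loc[grades["group"] == "tablet", "algebra"]
control = grades.loc[grades["group"] == "control", "algebra"]
print(stats.ttest_ind(tablet, control))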
In order to answer evaluation question 2, “Do students provided with a Tablet PC have
significantly better computer skills, more often use the computer as a tool, and have better attitudes toward
computers in education?”, two data sources were used: Student Survey and focus group interviews.
The Student Survey is a self-report instrument based on one utilized in the Maine Learning
Technology Initiative (Muir, Knezek, and Christensen, 2004). For the sake of clarity, two versions of the
instrument were created. The surveys differ in language (i.e. “computer” was used in the control group
version and “Tablet PC” for the treatment condition version) and items that were not relevant to students in
the control group (e.g., Do you bring your Tablet PC home?) were eliminated from the control group's
survey. The survey consists of the following sections:
• Computer Skills: Rate skill level (i.e. never used, beginner, intermediate, advanced, don’t know)
in using computer software (e.g. word processing, email, spreadsheet, simulation software).
• Computer as a Tool: Rate frequency (i.e. never used, less than monthly, one or more times per
month, one or more times per week, every day or almost every day) on computer related tasks
(e.g. finding information, organizing information, taking notes).
• Attitude Toward Computers in Education: Gauge students’ beliefs about the use of computers for
teaching and learning (e.g. Computers make schoolwork more fun/interesting, I believe that the
more often teachers use computers to teach, the more I will enjoy school).
In order to analyze the results of the Student Survey, average scores for each item were calculated
for each group. Next, nonparametric Mann-Whitney U tests were performed to identify significant
differences in scores between the Tablet PC group and the control group.
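The per-item comparison can be illustrated with a short sketch; the response arrays below are hypothetical placeholders standing in for the actual ordinal survey data:

from scipy.stats import mannwhitneyu

# Hypothetical ordinal responses for one survey item, one list per group.
tablet_responses = [2, 1, 3, 2, 2, 2, 3, 1]
control_responses = [5, 4, 5, 5, 4, 5, 5, 4]

u_stat, p_value = mannwhitneyu(tablet_responses, control_responses,
                               alternative="two-sided")
tablet_avg = sum(tablet_responses) / len(tablet_responses)
control_avg = sum(control_responses) / len(control_responses)
print(f"Tablet PC average = {tablet_avg:.2f}, control average = {control_avg:.2f}, "
      f"Mann-Whitney p = {p_value:.3f}")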
Focus group interviews were conducted with the teachers and students involved in the project.
Interviews were transcribed and then coded independently by two researchers. A starting list of themes
was generated from the research questions. Statements were then coded and nested under an existing
theme or a new theme was created when no existing themes were appropriate. Once all statements were
grouped in this manner, statements within each theme were grouped into subcategories. Patterns and
relationships between themes and subcategories throughout the data were examined in order to note
interesting findings. Disagreements between the researchers were discussed until full agreement was
reached.
In order to answer evaluation question 3, “Does increased access to technology result in higher
self-efficacy?”, average scores for each item on the Measure of Academic Self-Efficacy were calculated for
each group (i.e. Tablet PC and control). Next, nonparametric Mann-Whitney U tests were performed to
identify significant differences in scores between the two groups of students. Note that the nonparametric
Mann-Whitney U test is appropriate for ordinal data.
The Measure of Academic Self-Efficacy assesses students’ academic self-efficacy and their
attitudes toward learning in three subject areas: science, mathematics, and English. Self-efficacy refers to
an individual’s beliefs about his/her ability to learn or perform tasks (Bandura, 1986). Previous research
suggests that self-efficacy influences achievement (Schunk, 1995).
Findings
Achievement: ACT ASSET
Analysis of covariance (ANCOVA) results indicate that there was a grouping effect with regards
to the writing skills test of the ASSET (p = 0.010), with the control group having a significantly higher
mean post-writing score. That is, the control group performed significantly better than the Tablet PC group
on the ASSET measure of writing skill. ANCOVA results revealed that there was no group effect on
posttest reading skills (p = 0.641) and posttest numerical skills (p=0.146) scores. Table 1 provides a
summary of the analysis for writing skills.
Random assignment of participants, in particular with large sample size, should result in
equivalent groups. In this case, the control group had slightly higher scores on ASSET prior to the start of
the study, though the difference was not significant. On the chance that the groups may have been
significantly different prior to the start of the study, we chose to utilize an ANCOVA since it corrects for
pre-study differences between the groups, should they exist. In other words, the ANCOVA procedure
eliminates the effect of differences between the treatment and control groups prior to the start of the study.
Table 1: Analysis of Covariance Summary for ASSET Writing Skills

Source             Type III SS   df   Mean Square   F       Sig.   Noncent.    Observed
                                                                    Parameter   Power
GROUP              71.267        1    71.267        7.214   .010   7.214       .748
Error              434.690       44   9.879
Total              78476.000     47
Corrected Total    1019.319      46
Table 5: Comparison of Tablet PC Group and Control Group on Attitude Toward Computers in Education (Section IV of Student Survey)

Statement                                                          Tablet PC   Control   Mann-Whitney
                                                                   Average     Average   p-value
I prefer to use a computer to do my schoolwork.                    2.00        4.89      .001
Computers make schoolwork more fun/interesting.                    2.00        4.89      .007
I believe that the more often teachers use computers to
  teach, the more I will enjoy school.                             2.13        4.78      .001
I believe that it is very important for me to learn how to
  use a computer.                                                  2.13        5.56      <.001
Computers make schoolwork easier to do.                            2.00        5.00      .001
Computers help me improve the quality of my schoolwork.            1.93        4.89      .001
Interview Data
Coding of interview data resulted in four major themes: need for better training, technical issues,
teaching with Tablet PCs, and impact on personal organization.
Many statements by teachers, technical support staff, and students indicate that the lack of
sufficient training was a major hindrance to the project. Students and teachers were provided minimal
training on the use of the Tablet PC. Teachers indicated they were told three days prior to the start of the school year that they would participate, giving them insufficient time to prepare lessons to leverage the
technology. Little training was provided on how the equipment may be used to enhance teaching and
learning. Some support was provided in monthly meetings with the Tablet PC teachers. During these
meetings, instructional resources were provided (e.g., web site resources, web quests) and teachers had an opportunity to share what they were doing with the Tablet PCs in their classrooms. As the project progressed, these meetings increasingly became a time for teachers to vent frustrations. Teachers felt they had
to spend valuable instructional time teaching children software applications such as Microsoft PowerPoint.
All students, teachers, and technical support staff found the equipment reliability to be a major
problem in the study. By the fourth month of the project, all were experiencing significant levels of frustration. It was common to have 20% of a class without a working Tablet PC. Technical support staff reported that 1-2 computers per week were sent for repair.
Teachers noted that the addition of the Tablets created a “classroom management nightmare”. The
poor battery life required charging during the school day. Students required time to take out the Tablets,
find a power outlet, start the computer, and shutdown the computers at the end of the period. These tasks
often took up 7-10 minutes of instructional time each period. Additional time was lost dealing with
technical problems. All of the teachers noted that Instant Messenger, games, and surfing the Internet
distracted students. One student stated, “I think the control group’s grades are so much better because they
do not have so many distractions as we did during the class.” Teachers noted an increase in plagiarized
assignments since students were creating work on the Tablets and sharing with each other.
Teachers and students noted that problems arose since no clear set of ground rules for appropriate
use were articulated prior to the start of the study. Teachers felt that every day "some new issue would surface," making it difficult to provide fair and consistent enforcement of appropriate use policies.
Tablet students stated “…they did not tell us the policies beforehand so some students got into trouble
[unfairly]”.
On a positive note, all teachers interviewed believed that, if the technical problems were resolved
and the project was well planned, Tablet PCs could have a positive impact on teaching. One teacher noted
the following, “The potential is amazing. For their record keeping, Outlook has a daily planner. The
calendar can be synchronized with the teachers to make sure deadlines are clear. Buddy lists can be used
so students know who to contact about homework. And providing a PowerPoint for kids to use when
taking notes…”.
All students felt that, if working properly, the Tablet PCs would be beneficial in helping them stay
organized. With the many different assignments, notes, and papers from each class, students felt the
storage capacity for documents, note-taking tools, email, and calendar features would help them become
better students.
addressed through use of Tablet PCs. Adopting a model, such as the iNtegrating Technology for inQuiry
(NTeQ) model (Morrison & Lowther, 2002), would provide a framework for teachers as they develop student-centered learning activities.
References
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs,
NJ: Prentice Hall.
Barrows, H. S., & Tamblyn, R. M. (1980). Problem-based learning: An approach to medical education.
New York: Springer Publishing Company, Inc.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational
Researcher, 18(1), 32-42.
Dewey, J. (1938). Experience & education. New York: Macmillan Publishing Company.
Duffy, T. M., & Jonassen, D. H. (1992). Constructivism and the technology of instruction. Hillsdale, New
Jersey: Lawrence Erlbaum Associates.
Hannafin, M., Land, S. M., & Oliver, K. (1999). Open learning environments: Foundations, methods, and
models. In C. M. Reigeluth (Ed.), Instructional design theories and models: Vol. 2. (pp. 115-140).
Mahwah, NJ: Lawrence Erlbaum Associates.
International Society for Technology in Education. (2002). National educational technology
standards. Available at http://cnets.iste.org/
Lowther, D. L., Ross, S. M., & Morrison, G. R. (2003). When each one has one: The influences on teaching strategies and student achievement of using laptops in the classroom. Educational Technology Research and Development, 51(3), 23-44.
Morrison, G. R., & Lowther, D. L. (2002). Integrating computer technology into the classroom (2nd ed.). Upper Saddle River, NJ: Merrill Prentice Hall.
Muir, M., Knezek, G., & Christensen, R. (2004). The Maine Learning Technology Initiative: An exploratory study of the impact of ubiquitous technology on student achievement. Available at www.mcmel.org/MLLS.
Posavac, E. J., & Carey, R. G. (2003). Program evaluation: Methods and case studies (6th ed.). Upper Saddle River, NJ: Prentice Hall.
Schunk, D. H. (1995). Self-efficacy and education and instruction. In J. E. Maddux (Ed.), Self-efficacy,
adaptation, and adjustment: Theory, research, and application (pp. 281-303). New York: Plenum
Press.
Silvernail, D. L., & Harris, W. J. (2003). The Maine Learning Technology Initiative: Teacher, student, and school perspectives mid-year evaluation report. Maine Education Policy Research Institute. Available at www.usm.maine.edu/cepare/
Understanding Computer Anxiety and Computer Related Experience:
The Model and Practices
Computer Anxiety
Most researchers define computer anxiety as an emotional response of apprehension or fear of
computer technology "accompanied by feeling of nervousness, intimidation, and hostility. Negative
cognition and attitudes towards computers may also accompany such feeling of anxiety and include worries
about embarrassment, looking foolish, or even damaging computer equipment" (McInerney, McInerney, &
Sinclair, 1994, p.28). Gardner, Discenza and Dukes (1993) provided a similar definition, and empirically
compared four scales of computer attitudes, concluding that any of the four measures provide reliable,
reasonably valid information.
An array of theoretical stances towards computer anxiety appears in the literature, some of which
are rooted in a social learning model, and others which are rooted in more clinical models (McInerney,
McInerney, & Sinclair, 1994). The way one views computer anxiety shapes the way one seeks to address it.
For example, if computer anxiety is seen as a social learning phenomenon, one might seek to address it by
"enhancing self efficacy through skill building and success experiences" (McInerney, McInerney, &
Sinclair, 1994, p.29), whereas if it is conceived as an intra-individual construct, clinical approaches
including desensitization, relaxation, or counseling might be used. Leso and Peck (1992) differentiated
between trait (stable, individualistic) anxiety and state (changeable, situational) anxiety, noting that
computer anxiety has been typically regarded as state anxiety. Researchers have long been concerned with
measuring the levels of computer anxiety among teachers and students toward better technology
integration. Self-report Likert-type instruments such as the Computer Anxiety Scale (CAS) have been
continually developed and examined (George, Stocker, & Marcoulides, 2004).
Weil, Rosen and Sears (1987) argued "during repeated exposure to the computer, the computer-phobic is
being reconditioned at increased levels of anxiety which, in turn, increases discomfort and anxiety" (p.180).
Exploring this complex relationship has led researchers to examine quality of the experiences with
computers, suggesting that certain types of computer experiences (e.g. applications vs. programming),
environments (e.g. lab vs. lecture setting), and teaching strategies (e.g. responsive, hands-on, relevant) are
related to changes in computer anxiety over time (Gardner, Discenza, & Dukes, 1993; Leso and Peck,
1992). Negative experiences with computers are likely to increase anxiety, whereas positive experiences are likely to reduce it.
Leso and Peck (1992) compared students in introductory programming and applications courses
and found that though their anxiety levels were similar when they began, reduction of student anxiety was
greater in the applications course than in the programming course. It may be that by presenting relevant applications that are relatively easy to use, perceived self-competence (a negative correlate of computer anxiety) will increase and anxiety will decrease. Beginning with word processing, graphics, spreadsheet, database, and Internet explorations prior to introducing programming may decrease computer anxiety. They also found, however, that large numbers of subjects in both courses experienced no reduction
of computer anxiety as a result of their experience.
In summary, computer anxiety is related to experience, but not in a simple way. It is not merely a
function of time, nor of particular types of experiences. There are intra-individual variations in instructional
contexts and responses to these contexts that may be a function of sociocultural variables and prior
experiences. Given this, we must consider a more complex approach for reducing computer anxiety.
Figure 1. Ideal function of computer knowledge level, L(t), plotted against time t (levels l1-l5; times t1-t4).
Process Two: Time, Computer Anxiety, and Computer Knowledge
The level of computer anxiety fluctuates over time depending on the content (computer
knowledge). Computer anxiety (CA) as a function of time (t) is defined as:
CA(t) = Ki (ti - t) / (ti - ti-1) + Mi, for t in [ti-1, ti)
Here Ki and Mi are the computer anxiety scale factor and minimum computer anxiety level in interval [ti-1,
ti), respectively. Therefore, the graph of computer anxiety as a function of time can be represented as in Figure 2.
Figure 2. Computer anxiety, CA(t), as a function of time t (minimum levels m1-m5; times t1-t4).
In Figure 2, it is assumed that the computer anxiety scale factor (Ki) is constant in each interval except the last interval [t4, ∞), where computer anxiety (CA(t)) approaches the minimum computer anxiety (M5) asymptotically. It is also assumed that the minimum computer anxiety (Mi) differs in every interval with no particular pattern. However, the actual scale factor (Ki) is very likely to be non-constant within the intervals, and the minimum level of computer anxiety (Mi) may vary with certain patterns rather than as presented in the graph above. In reality, computer anxiety over time (CA(t)) will most likely be continuous, since the level of computer knowledge (L(t)) is probably continuous, and CA(t) should decrease differently in each interval.
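A small numerical sketch of this piecewise definition is given below. The interval boundaries ti, scale factors Ki, and minimum levels Mi are purely illustrative (the model prescribes no specific numbers), and the decay form used after t4 is an assumption introduced only for the sketch:

import math

boundaries = [0.0, 1.0, 2.0, 3.0, 4.0]   # t0 ... t4 (illustrative)
K = [3.0, 2.5, 2.0, 1.5]                 # scale factors K1 ... K4 (illustrative)
M = [1.0, 1.5, 1.2, 1.8]                 # minimum anxiety M1 ... M4 (illustrative)
M5 = 0.8                                 # minimum anxiety approached after t4

def computer_anxiety(t):
    # CA(t) = Ki (ti - t) / (ti - ti-1) + Mi for t in [ti-1, ti)
    for i in range(1, len(boundaries)):
        t_prev, t_i = boundaries[i - 1], boundaries[i]
        if t_prev <= t < t_i:
            return K[i - 1] * (t_i - t) / (t_i - t_prev) + M[i - 1]
    # In the final interval [t4, infinity), CA(t) decays toward M5;
    # the exponential form here is one possible illustration.
    return M5 + (M[-1] - M5) * math.exp(-(t - boundaries[-1]))

for t in (0.5, 1.5, 3.9, 8.0):
    print(t, round(computer_anxiety(t), 3))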
Proposed Model
Computer anxiety is a function of time and computer knowledge, but it fluctuates over time and
knowledge level. Figure 3 depicts the connection between Process One (the relationship between time and
computer knowledge) and Process Two (the fluctuation of computer anxiety across levels of computer
knowledge).
Figure 3. The proposed model: computer anxiety, CA(t), over time t.
Implications
Our model illuminates several implications in education.
1. There are levels of computer knowledge (application, advanced software, programming, etc.) and
anxiety is likely to increase each time learners move to a new level. This pattern can also be seen
within the same level or interval of computer knowledge/skills (for example, at the application
stage, computer anxiety levels may fluctuate from word processing to spreadsheet, or from
desktop publishing to database).
2. Effective strategies are likely to minimize computer anxiety, regardless of background, at each
level of computer knowledge. These include hands-on learning, relevance to learners' interest,
opportunities for feedback, supportive and caring instruction, and active learning where students
work on their own projects and see the application to their area of study. The transition across
different levels of computer knowledge should be as gradual and painless as possible.
3. For most pre-service and in-service teachers, programming may not be a major concern. Their
computer experience may not move past the application stage. There is nothing wrong with that,
since the application level is where most computer use or integration occurs in teaching and
learning. The National Educational Technology Standards for Teachers (NETS.T) prepared by the
International Society for Technology in Education (ISTE) attest to this aspect. These
standards emphasize technology integration, highlighting personal and professional use of productivity tools for communicating, collaborating, conducting research, and solving problems (ISTE, 2000).
4. Sociocultural factors must be considered in planning instruction. Because of early lack of
opportunity and sociocultural expectations, some learners will need more one-on-one support,
more time, and different experiences tailored to their background. For example, research on girls
indicates that they favor a collaborative (neither competitive nor individualistic) learning environment,
and are more engaged in using computers as tools with an eye toward community service (rather
than as toys for engaging in competitive games).
5. A rich conception of "computer anxiety", which locates the "problem" within the context, rather
than within the learner’s demographic variables, is likely to serve all learners better, ensuring
increased competence and decreased computer anxiety. How we define computer anxiety has
implications on how we teach and on who learns and how learners learn.
Practices
Following our computer anxiety model, four related studies have been conducted at State
University of New York at Oswego. The conceptualization and implications of the proposed computer anxiety model have held steady across these studies.
applied technology training workshop. Results showed that there was no statistically significant difference
related to computer attitudes between workshop and control groups. However, a marginal difference (p <
.074) related to computer anxiety between workshop group and control group was evident.
The findings of this study confirmed that computer anxiety might be more directly related to what kind of computer knowledge learners are learning than to how much time learners spend.
Study Two: Minds on, Hands on – The Linear-Nonlinear Approach to a Multimedia and Internet
Course
This study (Yang, Shindler, & Keen, 2000) examined the adaptation of a combination of both
linear and nonlinear strategies implemented in an applied technology course in summer 1999. A linear
approach was characterized by a direct, sequential and outcome-driven strategy. A nonlinear approach is
characterized by an indirect, random and process-driven strategy (Forcier, 1999). The class was limited in
size (16 students, 4 males and 12 females) and met for 3 hours twice a week for 6 weeks. The course
covered three major topics:
1. Use of the major Internet tools for K-12 teaching and research
2. Design and development of multimedia projects
3. Design and development of basic educational web projects
To initiate each of the three topics, a problem was introduced to the students in the form of a case
study. Both instructor and students collaborated to analyze the problem, seek the solutions, apply related
computer technologies, evaluate the final product, and discuss possibilities for integrating new technologies
into real-world problems/projects.
With each topic, after an outcome-driven grounding using a direct linear problem-solving approach, students transitioned to a nonlinear collaborative-inquiry approach. The nonlinear
approach allowed students the room to determine their own path to goal attainment without having a
hierarchical structure or predetermined outcome imposed on them. This approach was to let students
operate in a flexible environment that would be more comfortable for more random thinkers and
challenging and exciting to the more concrete and sequential thinkers. In addition, this approach had the
added factor of being motivational, given that students selected their own direction and projects.
The results of this study suggest that using a purposeful combination of both linear and nonlinear
strategies within a problem-based approach provides students with dimensions of learning that neither one
alone can achieve. To reduce students’ computer anxiety and enhance technology integration, computer-
based courses/programs should be relevant to students’ interests and learning styles, and incorporate an
instructional model that employs a cognitive developmental framework most suited to the needs of the
learners.
Study Three: Mission Possible: Project-Based Learning Preparing Graduate Students for
Technology
This study (Yang, 2001) examined the adaptation of project-based learning principles into an applied technology course in summer 2000. The class was restricted to graduate students and was limited in size (17 students, 9 males and 8 females). Students and instructor met 3 hours twice a week for 6 weeks in a computer-enhanced classroom.
Previous research has shown that project-based learning can capture the complexities of real-life situations. Not only does it provide an effective way for students to understand the connection of knowledge to the context of its application, but it also provides students with opportunities for self-
reflection and a sense of agency. Barron and the Cognition and Technology Group at Vanderbilt (1998)
have identified four major design principles that appeared to be very important for project-based learning:
(a) defining learning-appropriate goals that lead to deep understanding; (b) providing scaffolds such as beginning with problem-based learning activities before completing projects; (c) including multiple opportunities for formative self-assessment; and (d) developing social structures that promote participation and a sense of agency (p. 306). Following these four principles, the course was structured around three projects:
1. Webliography - to understand how to search, evaluate, and organize educational Internet resources
2. WebQuest - to use educational Internet resources to create inquiry-based learning activities
3. Electronic Course Portfolio – to foster self-assessment, reflection, and analysis of their learning on
the course
The findings of this study indicate three major positive effects: (1) the usefulness of extended learning. Students reported that interrelated learning-appropriate goals, authentic projects, and an interactive learning atmosphere made them emerge as active, engaged learners; (2) the effectiveness of production. Students reported that working on their own real and related projects made their understanding deeper than simply "doing" without "understanding"; (3) the proficiency of technology integration. Students reported that by experiencing project-based learning, they had a better idea of how to locate, evaluate, and use information and technology in their classrooms.
Study Four: STEP on Developing Active Learning Community for an Online Course
While asynchronous distance learning programs are expanding and their enrollments are mounting, the question of how best to foster community between learners and between learners and instructors who are separated in place and time has been raised (Rovai, 2002; Palloff & Pratt, 1999). Such separation may increase social insecurities, anxieties about communication and computer-related technology, and feelings of disconnectedness (Jonassen, 2000; Kerka, 1996); as a result, "the student becomes autonomous and isolated, procrastinates, and eventually drops out" (Sherry, 1996).
In order to meet this challenge, this study (Yang & Maina, 2004) examined how a sound practical approach was designed and then implemented in one online course, including scaffolds before initiating the class and starting new learning topics; transitions during the learning process to compensate for the lack of personal touches and non-verbal cues; evaluations during and after each learning topic; and presentations of outcomes through the website (STEP). Since spring 2001, this course has been part of the State University of New York Learning Network (SLN) as one of its asynchronous learning network (ALN) courses offered to both on- and off-campus students. The class has been restricted to graduate students and limited in size (n = 20) for each section.
The results of this study indicate that a systematic approach employing a variety of strategies, such as STEP, is effective in establishing an active learning community for ALN courses. The study confirmed that an active learning community, which relates to interactivity, sense of well-being, quality of the learning experience, and effective learning, is essential for successful online courses (Rovai, 2002; Rourke, Anderson, Garrison, & Archer, 2001).
References
Barron, B. J S., Schwartz, D. L., Vye, N. J., Moore, A., Petrosino, A., Zech, L., Bransford, J. D., and The
Cognition and Technology Group at Vanderbilt. (1998). Doing with understanding: Lessons from
research on problem- and project-based learning. The Journal of the Learning Sciences, 7(3&4),
271-311.
Bohlin, R. M., & Hunt, N. P. (1995). Course structure effects on students' computer anxiety, confidence, and
attitudes. Journal of Educational Computing Research, 13(3), 263-270.
Chen, M. (1986). Gender and computers: The beneficial effects of experience and attitudes. Journal of
Educational Computing Research, 2(3), 265-282.
Forcier, R. C. (1999). The computer as an educational tool: Productivity and problem solving (2nd ed.).
Upper Saddle River, NJ: Prentice-Hall, Inc.
Gardner, D.G., Discenza, R., & Dukes, R.L. (1993). The measurement of computer attitudes: An empirical
comparison of available scales. Journal of Educational Computer Research, 9(4), 487-507.
George, A., Stocker, Y. O., & Marcoulides, L. D. (2004). Examining the psychological impact of computer technology: An updated cross-cultural study. Educational and Psychological Measurement, 64(2),
311-318.
Gos, M. W. (1996). Computer anxiety and computer experience: A new look at an old relationship.
The Clearing House, 69(5), 271-276.
Hadfield, O.D., Maddux, C.D., & Love, G.D. (1997). Critical thinking ability and prior experience as
predictors of reduced computer aversion. Computers in the Schools, 13(3-4), 13-29.
Heinssen, R.K., Glass, C.R., & Knight, L.A. (1987). Assessing computer anxiety: Development and
validation of the computer anxiety scale. Computers in Human Behavior, 3(1), 49-59.
Howard, G.S. & Smith, R. (1986). Computer anxiety in management: Myth or reality? Communications of
the ACM, 29, 611-615.
International Society for Technology in Education. (2000, June). National educational technology
standards for teachers, Eugene, OR: ISTE.
Jonassen, D. H. (2000). Computers as mind tools for schools: Engaging critical thinking (2nd ed.). Upper
Saddle River, New Jersey: Merrill.
Kerka, S. (1996). Distance learning, the Internet, and the World Wide Web. (ERIC Document Reproduction Service No. ED 395 214).
Leso, T. & Peck, K.L. (1992). Computer anxiety and different types of computer courses. Journal of
Educational Computer Research, 8(4), 469-478.
Loyd, B. H. & Gressard, C. (1984). The effects of sex, age, and computer experience on computer attitude.
AEDS Journal, 18(4), 67-76.
Maurer, M.M. (1994). Computer anxiety correlates and what they tell us: A literature review. Computer in
Human Behavior, 10(3), 369-376.
Mahmood, M.A., & Medewitz, J.N. (1989). Assessing the effect of computer literacy on subject’s attitudes,
values, and opinions toward information technology: An exploratory longitudinal investigation
using the linear structural relations (LISREL) model. Journal of Computer Based Instruction,
16(10), 20-28.
McInerney, V., McInerney, D. M., & Sinclair, K. E. (1994). Student teachers, computer anxiety and computer
experience. Journal of Educational Computing Research, 7(1), 27-50.
Palloff, R. M. and Pratt, K. (1999). Building learning communities in cyberspace. San Francisco: Jossey-
Bass Publishers.
Rovai, A. P. (2002). A preliminary look at the structural differences of higher education classroom communities in traditional and ALN courses. Journal of Asynchronous Learning Networks, 6(1),
41-56.
Sherry, L. (1996). Issues in distance learning. International Journal of Educational Telecommunications,
1(4), 337-365.
Reed, W.M., Ervin, J.R., & Oughton, J.M. (1995). Computers and elementary education students: A ten-
year analysis. Journal of Computing in Childhood Education, 6(1), 5-24.
Rosen, L., Sears, D., & Weil, M. (1987). Computerphobia. Behavior Research Methods, Instruments and
Computers, 19(2), 167-179.
Weil, M., Rosen, L., & Sears, D. (1987). The computerphobia reduction program: Year 1 program development and preliminary results. Behavior Research Methods, Instruments, and Computers, 19(2), 180-184.
Yang, H., Mohamed, D., & Beyerbach, B. (1999). An investigation of the computer anxiety among vocational-technical teachers. Journal of Industrial Teacher Education, 37(1), 64-82.
Yang, H., & Shindler, J. (2000). Applied instructional technology for student teachers. In Annals of the Association for the Advancement of Educational Research and National Academy of Educational Research 1999 (pp. 169-173). Lanham, MD: University Press of America.
Yang, H., Shindler, J., & Keen, A. (2000). Minds On, Hands On: The Linear-Nonlinear Problem-Solving
Approach to a Multimedia and Internet Course. In Proceedings of Society for Information
Technology and Teacher Education International Conference 2000 (pp. 738-743). Norfolk, VA:
AACE.
Yang, H. (2001). Mission Possible: Project-Based Learning Preparing Graduate Students for Technology.
In Proceedings of Society for Information Technology and Teacher Education International
Conference 2001 (pp. 2855-2857). Norfolk, VA: AACE.
Yang, H., & Maina, F. (2004). STEP on Developing Active Learning Community for an Online Course. In
Proceedings of Society for Information Technology and Teacher Education International
Conference 2004 (pp. 751-760). Norfolk, VA: AACE.
Evaluation of Literacy Log and Discussion Board Postings in Online
Learning
Yuanming Yao
Yedong Tao
Vassiliki Zygouris-Coe
Donna Baumbach
University of Central Florida
The Florida Online Reading Professional Development (FOR-PD) program is funded by the
Florida Department of Education (DOE) and housed at the University of Central Florida (UCF). This staff
development project functions as a primary statewide delivery mechanism for improving teaching methods
in reading instruction in preK-12.
instructional design for distance education based on three types of interaction: (a) learner–content
interaction; (b) learner–instructor interaction; and (c) learner–learner interaction.
This qualitative evaluation, therefore, focused on the questions drawn from the RFP by the FL DOE and the FOR-PD content experts in relation to participants' learning of content, based on Bloom's learning outcome categories (Bloom & Krathwohl, 1956), including what participants learned from FOR-
PD lessons and what they thought of the content. Another focus of the analysis of the log and the postings
was to evaluate whether and how FOR-PD participants planned to implement or were implementing what
they learned from FOR-PD lessons.
Methods
Data Source
As of April 2004, FOR-PD had enrolled a total of 5,728 participants statewide from 64 school districts and 5 participating universities. Altogether, 289 sections had been conducted, including school district sections and university 'for-credit' sections. Participating district sections varied in their specific decisions about implementing the literacy log as an assignment. Some
districts required participants to complete literacy log assignments and some did not. UCF 004 and 005, as
university graduate or ‘for-credit’ sections, used the log as an ongoing assignment and collected it at the
end of the course.
The major data for this qualitative evaluation of phase two was drawn from the UCF FOR-PD
course sections in fall 2003, as recommended by the FOR-PD content experts, who designed the content,
developed the literacy log strategies and created the participants’ posting assignments. More than a half
number of FOR-PD lessons (8 out of 14) were used for detailed analysis, including Lesson 2, 3, 4, 8, 9, 10,
11 and 12. Besides, Lesson 1 discussion board postings were used for analyzing participants’
demographics. Common to all of the eight selected lessons, the assignments emphasized how participants
were applying or planned to apply their new knowledge from FOR-PD lessons in their own classrooms.
Sampling
Purposive sampling of participants was used for this qualitative evaluation of FOR-PD
participants. All the postings and literacy log strategies submitted by 20 out of 62 total participants were
analyzed, based on the eight selected lessons in the two course sections, including participants with various
roles such as teachers, counselors, pre-service teachers and others. Among the 20 participants, 13 were in-
service preK-12 teacher participants with teaching experience ranging from 1 to more than 10 years. The
subjects they taught included literacy/reading and non-literacy/reading content areas in exceptional or regular classrooms. The other seven participants were school counselors, pre-service teachers, and the Florida
Literacy and Reading Excellence (FLaRE) coordinators. Specific information about the participants is presented in Table 1.
with 26 years of teaching experience in PreK-12
15 School counselor
16 Area Coordinator for FLaRE, also a reading resource teacher with 15 years of teaching experience and 3 years of reading curriculum experience
17 Full-time graduate student at UCF, majoring in varying exceptionalities (VE)
18 Future school guidance counselor
19 Associate Professor at UCF in exceptional education
20 Area Coordinator for FLaRE with lifelong experience of teaching
Data Analysis
Robert Yin's pattern matching, one of the dominant modes of case study analysis, was used as the design for this qualitative evaluation and was applied to analyze and examine the log and participants' postings. Pattern matching "compares an empirically based pattern with a predicted one (or with several alternative predictions) for dependent variables" (Yin, 1994, p. 107). The learning outcomes of FOR-PD participants were therefore matched with the requirements of the RFP from the FL DOE and the content features described in the UCF proposal for the FOR-PD program. Specifically, the literacy log was intended to motivate teacher participants to better implement reading/reading instruction strategies in their own classrooms, and the postings were intended to develop an e-community for teacher and non-teacher participants, supporting their individual professional development through interaction with FOR-PD course content and section facilitators.
Both automatic and hand coding were conducted. First, the hand-written literacy log strategies were transcribed into electronic files. Each participant's postings were compiled and downloaded from the online section into one complete file and then copied into ATLAS.ti, a software package for organizing and coding qualitative data, as an individual primary document. Based on the focus of this evaluation, three categories
addressing the specific questions drawn from the RFP and the content experts were used, including a) what
participants learned from FOR-PD lessons, b) how participants liked FOR-PD lessons, and c) how
participants implemented/planned to implement what they learned from FOR-PD lessons.
Using open coding in ATLAS.ti, the category of what participants learned was first coded into reading/reading instruction strategies, content reading strategies, reading instruction principles, reading resources, and assignments, focusing on the content of FOR-PD lessons. Meanwhile, another set of codes
based on Bloom’s taxonomy (1956) of learning outcomes in the cognitive domain, i.e., Knowledge,
Comprehension, Application, Analysis, Synthesis, and Evaluation were adopted for this category. Next,
using cross-search for both the codes based on Bloom’s learning outcomes and those indicating the
different aspects of the FOR-PD lessons, quotations from each participant depicting what participants
learned were identified and located. In the same way, quotations from each participant were also produced,
illustrating how participants liked FOR-PD lessons and how participants implemented or planned to
implement what they learned from FOR-PD lessons in terms of reading strategies, instruction principles,
assignments, and resources.
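Outside of ATLAS.ti, the cross-search step can be illustrated with a small sketch; the quotation records and code labels below are hypothetical stand-ins for the project's actual coded data:

# Each coded quotation carries a content code and a Bloom's taxonomy code.
quotations = [
    {"participant": 3, "text": "I plan to use Column Notes with my second graders.",
     "content_code": "reading strategies", "bloom_code": "Application"},
    {"participant": 7, "text": "DIBELS can identify at-risk readers early.",
     "content_code": "reading resources", "bloom_code": "Comprehension"},
    # ... remaining coded quotations ...
]

def cross_search(content_code, bloom_code):
    # Return quotations tagged with both the content code and the Bloom's code.
    return [q for q in quotations
            if q["content_code"] == content_code and q["bloom_code"] == bloom_code]

for q in cross_search("reading strategies", "Application"):
    print(q["participant"], q["text"])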
Findings
The analysis of the log and the corresponding participants’ discussion board postings revealed that
participants made significant gains in their understanding and familiarity with the reading strategies,
principles, and resources that they learned from FOR-PD lessons. Participants liked the research-based FOR-PD lessons, including their large collection of online reading resources and the challenging assignments. Furthermore, participants were implementing or planned to implement the strategies and techniques that they learned from FOR-PD in their own classroom instruction.
First, participants have learned or become more familiar with reading strategies, principles and
resources from FOR-PD, achieving all the learning outcomes, including lower-level and higher-level
learning in the cognitive domain (Bloom & Krathwohl, 1956). When participants were required to do their
literacy log or BBS posting assignments, most of them would describe and summarize the reading principles, strategies, or resources, displaying their knowledge and comprehension. Then they would
analyze the instructional situations for proximal application, considering their students' specific
characteristics. Next, participants would synthesize the strategies, principles or resources by combining the
new with the old or just integrate them into the subject content, and at last evaluate how the strategies,
principles or resources could work in different instructional situations. Table 2 consists of juxtaposed
quotations selected from the postings and the log, displaying how a participant comprehended and applied
FOR-PD resources.
Table 2: Juxtaposed quotations from one participant's discussion board posting and literacy log

Discussion board posting:

I have been an elementary teacher for 14 years, and thought I had a pretty good idea of how to teach reading. However, this course has provided me with more questions than answers. It was difficult to choose three. These links are such good resources I have spent way too much time reading them.

I teach at a Magnet school that focuses on Math, Science and technology, so my first question is #1: how I can use available technology to teach reading. In searching through the websites, I have discovered that technology "is a small piece of the pie" in teaching reading but can be a good reinforcement of skills taught. It is important to have good quality programs. Programs provided to the children should have clear instructions to navigate easily. Some programs provide a record keeping system so that the teacher can monitor each child's progress. Technology can add unique experiences for children to interact with literature.
http://www.suite101.com/article.cfm/reading/38568l
http://www.fcoe.k12.ca.us/techprof

This year my county replaced our balanced literacy testing with the DIBELS test. So my next question is #2: what is the DIBELS test? I found out that the DIBELS is given in K-3 grades three times per school year. It tests the "big three" of literacy: phonological awareness, alphabetic understanding, and automaticity/fluency. DIBELS is time effective taking 3 minutes per component, per child to score and assess. According to the article, DIBELS can identify children who are most at risk for reading difficulty, so that interventions can be planned in the early grades. URL
http://reading.uoregon.edu/big_ideas/trial_bi_index.php

My third question is #3: how can we get children to read more? I found an article entitled "Why read to Children". This article bases its research on a study by the Commission on reading which states "Reading aloud to children is the single most important activity...It is a better teaching tool than anything else in the home or classroom." The article continues to state that we must condition the child's brain to associate reading with pleasure so that we teach children to become "lifetime readers" instead of "school time readers." Besides reading aloud to children everyday, the article suggested ways to provide a print rich environment for children. One suggestion that I really like and plan to use in my classroom is using rain gutters on the walls to display books with the covers facing out.
http://www.sdcoe.net/pdop/trec/support/html/prog_4.htm

Literacy log:

What I Know:
1) Reading is making sense of written text.
2) Reading is an essential life skill.
3) Phonics and phonemic awareness are important factors in children learning to read.
4) Reading for pleasure is important.
5) Struggling readers need intensive strategies to correct problems.

What I Wanted to Know:
1) How important are standardized tests to the process of learning to read?
2) What is DIBELS?
3) How can I use technology to teach reading?
4) How can we get children to read more often?

What I Learned:
1) We must teach children to become lifetime readers instead of school time readers.
2) DIBELS can help identify "at risk" readers early so interventions can be implemented in early grades.
3) I can use technology to reinforce skills taught. It can be another medium for children to discover literature.
Second, participants liked the FOR-PD lessons, including their research-based content, large online reading resources, and motivating and challenging assignments. Most participants, whether experienced or inexperienced teachers, valued the FOR-PD lessons highly and planned to utilize more strategic instruction in their
classrooms. A few participants claimed that their attitude towards the highly challenging strategies like
SCAMPER (see Appendix 1) became more positive after using them for their FOR-PD assignments or in
their own teaching. Moreover, participants who were teaching in content areas also perceived the strategies they were using or would be using from FOR-PD positively. Following are a few quotations illustrating what experienced and inexperienced participants, teaching content areas or literacy/reading, thought of FOR-PD lessons in terms of content, resources, and assignments. A reading resource teacher from Hernando
Elementary School in Citrus County, with over 25 years of teaching experience, commented on FOR-PD
lessons: “This lesson has been a wonderful learning experience for me. I've been teaching 25 years and
didn't realize how much I did not know about reading. I also now understand why we are doing the things
we are doing in education and how these things were developed. This made everything very clear for me.
The resources that are available to us are unbelievable! I am sharing this with everyone I know. I also
believe administrators and school board members should be required to take this course. The choices they
make about educating our children should be from an informed perspective and this would provide that”. A
beginning Kindergarten teacher also liked the strategies very much and planned to utilize them in her own
classroom, “I enjoyed using the 6 Hat Strategy! I plan to use this strategy in many different areas. This
strategy is great for organizing my lessons! I think that the SCAMPER strategy is a great way to analyze a
unit or lesson. I have found that many of the things that I thought would work in the classroom have not
gone well. Using this type of strategy will be a great way to organize the entire unit". An Area Coordinator for the Florida Literacy and Reading Excellence program (FLaRE), who is also an 8th grade Language Arts teacher, became more positive towards the highly challenging strategies and assignments, e.g.,
SCAMPER, after using them for her FOR-PD assignments. She commented, “As I worked with the
(SCAMPER) strategy, I realized that it does have some merit, especially for provoking deeper thinking for
the students. I will try it when I again teach this book...after I completed it, I decided that it was good. I
am sure I will use it and share it with others now that I have it:)”
Third, participants were implementing, or planned to implement, what they learned from FOR-PD
lessons.
FOR-PD teacher participants addressed both instructional situations and instructional methods/reading strategies when they discussed how they implemented or planned to implement strategies in their teaching. Situations consisted of their students' characteristics and the advantages, constraints, and other environmental factors involved in applying the strategies. Different teacher participants tended to focus on different
strategies to meet their students’ grade levels. Following are a few quotations from participants who were
teaching different grade levels. A Pre-K ESE (exceptional student education) teacher participant used
sound cards with visual and movement information for pre-reading skills instruction: “I am a pre-k ESE
teacher and I do the letters and sing the day of the week with my students, everyday during circle time.
What I found helpful in recognizing the letters are sound cards I use that show action of each letter
sounds…The cards provides visual and movement to help remember a particular sound and letter”. In the
elementary and secondary levels, teacher participants focused more on the implementation of strategies for
fostering fluency and reading comprehension among their students. A second grade reading/literacy teacher
envisioned how she would implement the Column Notes for comprehension check: “I found the Column
Notes strategy useful and easy to understand. I can see the benefits of using this tool at any grade level.
For my second graders, I would use this as a before, during and after comprehension check for a story or
textbook chapter read. We usually do a “picture walk” of a story to get an idea of what it will be about.
After this, the students could write in the first column any questions they feel that they will have about the
story or the highlighted vocabulary words. The second column would be where they would answer the
questions that they had after reading and discussing the story as a class. The last column about “Me”
would be great for them to apply what they learned from the text to their own lives. It is a wonderful way
for them to write a real-life connection to the story, thus allowing for further comprehension. This would
also be a good activity to do with a partner. I will probably try it as a group first, and then with partners,
then individually, once I know that they understand the process behind the Column Notes”. A middle
school reading teacher used Exclusion Brainstorming for her remedial readers, grades 7-8, who were
reading on the 3rd-6th grade level. “I have never tried this (Exclusion Brainstorming) before. So, this
morning, before broaching our topic for the day which was “The History of Halloween,” I put several
words on the board around the topic. The students were less than successful in choosing the correct words
which matched the historical significance of Halloween. This was a great idea!! I then knew exactly what
I needed to cover today rather than spending (or wasting) time on concepts they already knew. We then
went on to read a collection of “scary” tales from the book, The People Could Fly and “The Tell-Tale
Heart” We described the significance of Halloween and scary stuff in history and how these beliefs have
influenced literature. (Funny…when I did this lesson, it didn’t sound so confusing!)”.
References
Bloom, B. S., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of
educational goals: Handbook I, cognitive domain. New York: D. McKay.
Gunawardena, C. N., Lowe, C., & Carabajal, K. (2000). Evaluating online learning: Models and methods.
In Society for Information Technology & Teacher Education International Conference:
Proceedings of SITE 2000, 1-3.
Lieblein, E. (2000). Critical factors for successful delivery of online programs. Internet and Higher
Education, 3(3), 161-174.
Moore, M. G. (1989). Three types of interaction. American Journal of Distance Education, 3(2). Retrieved
October 1, 2005, from http://www.ajde.com/Contents/vol3_2.htm#editorial
Nichols, W. D., Rupley, W. H., & Mergen, S. L. (1998). Improving elementary teachers' ability to
implement reading strategies in the teaching of science content. In B. D. Sturtevant, J.; Linder, P.;
Linek, W. M. (Eds.), Literacy and community (pp. 188-213). Carrollton, GA: The College Reading
Association.
Reigeluth, C. M. (1999). Instructional-design theories and models (Vol. II). Mahwah, NJ: Lawrence
Erlbaum Associates, Inc., Publishers.
Yin, R. K. (1994). Case study research: Design and methods (2nd ed.). Thousand Oaks, CA: Sage Publications.
Appendix 1: Template for the SCAMPER strategy
A Case Study of Facilitators’ Attitudes toward Effectiveness of Different
Media Used in Online Degree Programs
Ulku Yilmaz
Cengiz Hakan Aydin
Anadolu University, Turkey
Abstract
This presentation reports the results of a study in which facilitators’ attitudes toward the
effectiveness of various media used in the Information Management Associate Degree Program of Anadolu
University, Turkey, were examined. The study has shown that although facilitators indicated that textbooks should still be
used in online courses, they found textbooks not as efficient as multimedia programs and web
environments. The participant facilitators also found multimedia programs distributed on CDs more
efficient than the web environment.
Introduction
No matter what form or model it takes, every open and distance learning (ODL) initiative is
based on one or more media. The major theorists, such as Moore (1973) and Holmberg (1990), also
stressed the use of media as an absolute need in ODL. On the other hand, Clark (1983) argued that
“...media [technology] do not influence learning under any conditions” and that “...media are mere vehicles
that deliver instruction, but do not influence student achievement any more than a truck that delivers our
groceries causes changes in our nutrition” (p. 445). He criticized the meta-analysis of James Kulik and his
associates, whose general conclusion was that computer-based technologies were as effective as, and in
some cases more effective than, traditional in-class instruction in improving academic achievement (e.g.,
Kulik, Kulik & Cohen, 1980; Kulik, 1984). Clark drew attention to the instructional design of the
learning materials rather than the media themselves. Opposing Clark’s views, Kozma (1991) stated that media have
symbolic and process attributes. He indicated that how the symbols were presented had a direct bearing on
the perception and level of integration experienced by learners. Bates (1995) supported Kozma’s points and
emphasized the importance of technology decisions. Bates asserted that there are significant educational
and operational differences between various technologies, and that the appropriate choice and use of
technologies mainly depend on the context in which they are used.
On the other hand, instructor presence in ODL is more critical, complex and challenging than in
traditional educational environments due to its technology-based nature (Rudestam & Schoenholtz-Read,
2002; Spector & de la Teja, 2001). Faculty have to overcome potential barriers caused by technology, time,
and place that might influence their satisfaction, which, in turn, affects the success of the ODL
initiative. Correspondingly, Moore (2002) of the Sloan Consortium (Sloan-C) considers faculty
satisfaction to be one of the five pillars that support quality learning environments. Dziuban, Shea and
Arbaugh (2005), however, point out the shortage of empirical studies on faculty satisfaction in ODL.
Consequently, this paper summarizes the results of a study that focused on faculty satisfaction
with the technology used in an online learning program. In the following sections, first the setting of the
study, the Information Management Associate Degree Program (IMP) of Anadolu University, Turkey, is
introduced, and then other details of the study are given.
130 first year (freshmen) and 119 second year (sophomore) students. Of the students, 54 percent are male,
and the average age is 26. More than 50 percent of the students do not have jobs. They additionally have
diverse computer experience, ranging from beginner level to 15 years of professional work prior to the program.
After registering for the program, either through the Internet or through the Open Education Faculty
Offices that are spread across the country (in 83 provinces), the students receive instructions on learning
processes as well as licensed software and instructional materials. The IM Program Guide consists of
information on the curriculum, instructional activities, interaction possibilities, support systems,
instructional materials, evaluation methods, software, and so forth. The Guide also includes a brief
orientation to self-study (distance learning) methods. The students can also walk into any of the 83 offices and get
help face-to-face. Additionally, web sites designed to support the students and online tools such as
newsgroups, chat and e-mail provide instructions and information about different aspects of the program.
Furthermore, a telephone line is available to the students 7 days a week (16 hours each day). The
students especially need help with setting up the software and with studying at a distance prior to instruction.
So, the Guide, the telephone line, the Offices, and the online tools are mostly used to get help on these issues.
Moreover, as required by an agreement with Anadolu University, Microsoft provides licensed
software such as Windows, Office, FrontPage, Visio, Project, Publisher, and so forth to the students.
The students also receive instructional materials during registration. The instructional materials
are web sites and tools, original software, textbooks, and video CDs. The web sites and tools help students
learn the content. The Computer-Based Instruction Center (CBIC) of Anadolu University has produced 25
modules of online learning environments that provide interactive presentation of information, examples and
practice. The information is presented in different verbal and visual formats such as text, narration, and
animation. The majority of the practice exercises include multiple choice items with immediate feedback and links
to the related content. These environments have been designed in a way that enables self-paced learning and easy
navigation.
The online tools serve synchronous and asynchronous interactions among students, between
students and facilitators, as well as between students and organizational and technological support staff.
The students are able to interact synchronously (chat and forum) with the facilitators for 4 hours a day for
each course. They can also use asynchronous tools (e-mail) to get help from the facilitators and other staff.
There are 55 facilitators (academic advisors) employed primarily to provide the students with academic
(instructional) support. Each facilitator is an expert in one course’s content. For each course there are 5-10
facilitators. They not only answer the students’ questions but also evaluate the assignments. For every
course, additionally, there is a coordinator whose main responsibility is to help and supervise the facilitators.
The facilitators sometimes provide organizational and technical support, too. In addition, there are staff who
help solve the students’ technical and organizational problems online as well as via phone.
The students are also able to use video CDs produced by the CBIC. These CDs generally include
around 40 hours of animated demonstrations of how to use the software. The videos on these CDs are
also available online for those who have faster Internet connections.
In addition, a series of textbooks and e-books is provided to students as supplementary materials.
The textbooks are the products of a private company and can be bought in any bookstore in Turkey. Anadolu
University pays the publishing company for these textbooks. The e-books are produced by Anadolu
University.
Methodology
This study was conducted as part of a larger evaluation project that aimed to get feedback from
facilitators about the implementation of the IMP. The project consisted of the administration of a series of survey
questionnaires and semi-structured and/or unstructured interviews. One of these questionnaires helped the
researchers collect data for this study. The following section includes details about the participants, the setting and
the instrument.
Participants
Facilitators (55) employed in the IMP of Anadolu University were asked to take part in this study.
Only 2 of these facilitators did not participate, due to personal reasons. As a result, the study was conducted
with the participation of 53 online facilitators. All the participants were working as graduate assistants or as
faculty members in various colleges of Anadolu University besides working as facilitators in the IMP. The majority of
the participant facilitators were graduate assistants (31 participants - 56.4%), while the others (22 participants)
were experienced lecturers who had been teaching undergraduate level courses for a number of
years in different fields. It might be beneficial to give the audience some details about graduate assistantship in
Turkey. To start with, graduate assistantship is a profession in Turkey. In other words,
graduate assistants are employed as full-time assistant faculty members, whose main responsibilities are to assist
professors in their courses and research studies, as well as to help in the administration of departments.
Although it is not encouraged, graduate assistants sometimes also take responsibility for undergraduate
level courses owing to a shortage of professors. A large majority of the participant graduate assistants (28 out
of 31 – 90%) had been assisting professors for several years and sometimes stood in as substitutes in
lectures. Therefore, they can be considered experienced in face-to-face teaching. Moreover, of the
participants, 11 (20%) were female, and most (45.5%) were between 25 and 29 years old.
Instrumentation
The IMP administrators bring all the facilitators together at the end of each semester to learn their
thoughts about the program and the support services provided to them. It has been noticed that only a few facilitators
attend these meetings and that those who attend hesitate to express their ideas and their satisfaction with the support
services. So, a survey questionnaire was selected as the data collection method to seek input about the support
services of the IMP for online facilitators.
The questionnaire included a 5-point Likert-type scale, ranging from strongly disagree to strongly
agree (1-5), that was developed to measure facilitators’ attitudes toward each medium used in the IMP. The
scale consisted of 7 self-report items concerning the effectiveness, efficiency and attractiveness of each
medium. Another item asked the participants to what extent they thought these media should be used in future
online programs and courses. A separate scale was provided for each medium. Therefore, the participants were
expected to check the number (1-5) on each scale that best reflected their feelings about the item regarding
that medium.
Results
The reporting of results is organized into sections. The first section discusses the reliability of
the survey instrument, the second reports the results for research question one, and the third gives details
of the results for research question two.
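The CA row in Table 1 below presumably reports an internal-consistency (Cronbach's alpha) coefficient for
each medium's set of items; the paper does not state the computation, but the standard formula for a scale of
k items is

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)

where \sigma^{2}_{Y_i} is the variance of item i and \sigma^{2}_{X} is the variance of the summed total score.
Taking k = 7 (the seven effectiveness, efficiency and attractiveness items described above) is an assumption
here, since the report does not specify which items enter each coefficient.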
Table 1. Descriptive Statistics Regarding the Facilitators’ Perceptions about Media Used in IMP
(N = 53 for every item)

                                                        Media
Functions of the medium                       Textbook      CD       Web
CA                                              0.88        0.68     0.85
Helping learning                      M         3.92        4.47     4.38
                                      SD        0.96        0.57     0.74
Requiring less time                   M         3.49        4.51     4.45
                                      SD        0.99        0.67     0.57
Spending less effort                  M         3.28        4.60     4.30
                                      SD        0.93        0.57     0.75
Being entertaining and appealing      M         2.79        4.49     4.13
                                      SD        1.03        0.61     0.83
Accessing and using easily            M         4.15        4.33     4.11
                                      SD        0.93        0.81     0.75
Transferring the content accurately   M         4.17        4.00     4.06
                                      SD        0.97        0.71     0.74
Presenting the content attractively   M         3.11        4.38     3.96
                                      SD        1.05        0.66     0.92
Future implementations                M         3.98        4.47     4.39
                                      SD        1.18        0.97     1.06
As can be seen from Table 1, the participant facilitators think that a textbook is a medium
that can be accessed and used easily, that transfers the content accurately, and that helps learners learn the
subject matter. In other words, the facilitators believed that textbooks were effective and efficient media.
However, they found their attractiveness or appeal somewhat problematic. On the other hand, they
asserted that textbooks should still be used in online courses. Among all three media, multimedia programs
delivered on CD-ROMs received the highest scores on almost every item. Likewise, web materials were also found quite
effective, efficient and appealing. It was interesting to notice that the facilitators showed a more positive
attitude toward CDs than toward web materials. This result may be related to access problems:
nowadays almost all computers have CD-ROM drives, but fast Internet connections are still hard to find.
Conclusion
This descriptive study using survey data revealed that the facilitators in the IMP of Anadolu
University were satisfied with the effectiveness, efficiency and appeal of the multimedia programs
distributed on CDs and of the web materials. They also found textbooks effective and efficient, although not as
much as the other media. However, the study has shown that the facilitators did not think that textbooks were
appealing. Ironically, they indicated the necessity of textbooks in future online learning implementations.
So, it seems that textbooks are still a valued medium and will be around for a while. In order to get the
maximum benefit out of this medium, designers should find ways to make textbooks more appealing and
interesting.
As mentioned before, this study was conducted as part of a larger evaluation project that
aimed to get feedback from facilitators about the implementation of the IMP. There was a series of
questionnaires, each of which included quite a number of items on different aspects of the program. This
situation could have affected the participant facilitators during the administration of the questionnaires. So, a
separate study supported with qualitative methods might help provide better insight into how facilitators
think about the media used in an online program. These sorts of studies can help the administrators of the IMP
and of similar programs to improve their practices.
References
Bates, A. W. (1995). Technology, open learning and distance education. London: Routledge.
Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53,
445-459.
Dziuban, C., Shea, P., & Arbaugh, J. B. (2005). Faculty roles and satisfaction in asynchronous learning
networks. In S. R. Hiltz & R. Goldman (Eds.), Learning together online: Research on asynchronous
learning networks (pp. 169-190). Mahwah, NJ: Lawrence Erlbaum.
Holmberg, B. (1990). The role of media in distance education as a key academic issue. In A. W. Bates
(Ed.), Media and technology in European distance education. EADTU.
Kozma, R. B. (1991). Learning with media. Review of Educational Research, 61(2), 179-211.
Kulik, J. A. (1984). Evaluating the effects of teaching with computers. In G. Campbell & G. Fein (Eds.),
Microcomputers in early education. Reston, VA: Reston.
Kulik, J. A., Kulik, C., & Cohen, P. (1980). Effectiveness of computer-based college teaching: A meta-
analysis of findings. Review of Educational Research, 50(4), 525-544.
Moore, M. (1973). Toward a theory of independent learning and teaching. Journal of Higher Education,
44, 661-679.
Moore, J. C. (2002). Elements of quality: The Sloan-C framework. Needham, MA: Sloan-C.
Rudestam, K. E., & Schoenholtz-Read, J. (2002). Overview: The coming of age of adult online education.
In K. E. Rudestam & J. Schoenholtz-Read (Eds.), Handbook of online learning: Innovations in
higher education and corporate training (pp. 3-28). London: Sage.
Spector, J. M., & de la Teja, I. (2001). Competencies for online teaching. Retrieved May 5, 2004, from
http://ericit.org/digests/EDO-IR-2001-09.shtml
An Examination of Classroom Community Scale: Reliability and Factor
Structure
Huang-Chih Yu
Fengfeng Ke
Pennsylvania State University
Abstract
This study was intended to extend current work on the Classroom Community Scale (CCS) by
examining its internal reliability and factorial structure. Data were collected from 280 students in both
online and face-to-face classes. The results of the factor analysis indicated a satisfactory internal reliability
(Cronbach's alpha = .78) for the total scale and a three-factor structure of important attributes of
classroom community: interdependence in learning, emotional support, and sense of membership.
Introduction
Learning communities are a growing feature in the education landscape. The power of community
to support learning has been well theorized or explored by many researchers (e.g. Brown & Duguid, 2000;
Palloff & Pratt, 1999; Lave & Wenger, 1991). Various descriptive studies on the development of learning
communities in formal schooling have also been conducted.
However, most of these studies report only community development efforts rather than results.
Without a workable evaluation tool, educators have difficulty determining whether a phenomenon of
community has really emerged within their classrooms, or how this community phenomenon has developed
to support learning.
In an effort to address this problem, Rovai (2002) developed the Classroom
Community Scale, which is an instrument to assess students’ sense of community and the extent of
community development. Rovai defines “sense of community” as consisting of two components: feelings
of connectedness among community members and commonality of learning expectations and goals (2002).
The CCS contains 20 five-point Likert-scaled items, ten items each for the subscales of connectedness and
learning. Rovai (2002) has field-tested the CCS with university graduate students enrolled in e-learning
courses, reported a high internal consistency of the total scale, and described how two subscale factors were
extracted from factor analysis. Since its publication the CCS has been cited or applied in quite a few
learning community studies (e.g. Anderson, 2004; Blignaut & Trollip, 2003; Brook & Oliver, 2003).
However, it should be noted that the CCS has been developed and applied only in online learning
environments. In addition, the students involved in the CCS field-tests were all graduate students majoring in
education. Rovai (2002) admits that caution should be exercised when generalizing the CCS to different
course environments or different student groups. Therefore, it remains questionable whether the existing
scale has psychometric credentials that are sound under different learning environments, or whether the
scale contains items that reflect the characteristics of different learner groups.
Method
The present study intends to extend current work on the CCS by examining its internal reliability
and factorial structure. Data sets from both online and face-to-face learning environments, collected from a mix of
graduate and undergraduate university students of different academic majors, are used in this article. By
doing a reliability and factor analysis of the collected data, we are able to assess the CCS in the context of
multiple course environments and hence suggest directions for its future development.
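As a rough, illustrative sketch of the kind of reliability and factor analysis described above (not the authors'
actual procedure, software, or data), the following Python fragment computes Cronbach's alpha for a response
matrix and extracts a three-factor solution. The placeholder data, the variable names, and the choice of
scikit-learn's FactorAnalysis are assumptions made purely for illustration.

# Illustrative sketch only: reliability and exploratory factor analysis on a
# matrix of CCS responses (rows = students, columns = the 20 Likert items).
# The random placeholder data and the use of scikit-learn are assumptions,
# not the procedure reported in the paper.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(280, 20)).astype(float)  # hypothetical 1-5 ratings

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

print("alpha (total scale):", round(cronbach_alpha(responses), 2))

# Exploratory factor analysis with three factors, mirroring the structure
# reported in Table 1 (interdependence in learning, emotional support,
# sense of membership).
fa = FactorAnalysis(n_components=3, rotation="varimax")
fa.fit(responses)
loadings = fa.components_.T  # shape: (n_items, n_factors)
for item_number, row in enumerate(loadings, start=1):
    print(f"item {item_number:2d}:", np.round(row, 2))

A dedicated psychometrics package, a different rotation, or a formal criterion for the number of factors could
equally be used; the sketch only shows the general workflow of checking internal consistency and then
inspecting the loading matrix.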
Participants
In this study, the participant population consists of a total of 280 university students who were enrolled
in five undergraduate and graduate courses in the education, engineering, and business schools in the spring
semester of 2005. Among them, 135 were enrolled in two face-to-face courses while 145 were enrolled in three
Angel-based online courses. Situated in different learning environments and having different educational
backgrounds, the participants vary in age (from 20 years to 50 years), gender, and ethnicity.
Setting
The treatments used in this study comprise five education, engineering, and business graduate and
undergraduate courses offered by a major state university in the midwestern United States. All courses are
regular credit courses, with three of them delivered via the Angel e-learning system. Instructor-student
ratios range from a low of 1:10 to a high of 1:300. Full-time experienced faculty members teach the
courses. It was observed that the instructional designs and presentation styles of the five courses differ.
Among these five courses, the two graduate e-learning courses from the education school are teamwork and
dialogue oriented; the one undergraduate e-learning course from the engineering school is highly structured and
lecture-based; and the remaining two undergraduate face-to-face courses from the business school are more
balanced between lecture and teamwork interactions.
TABLE 1.

                                                          Factors
                                               Interdependence  Emotional  Sense of
Items                                          in Learning      Support    Membership
19. I feel confident that others will
    support me                                       .87            .10       -.15
18. I feel that my educational needs are
    not being met                                    .83           -.56        .04
10. I feel reluctant to speak openly                 .83           -.15        .05
20. I feel that this course does not
    promote a desire to learn                        .83           -.16        .01
13. I feel that I can rely on others in
    this course                                      .81            .11       -.22
16. I feel that I am given ample
    opportunities to learn                           .81            .06       -.25
14. I feel that other students do not
    help me learn                                    .71            .07        .23
11. I trust others in this course                    .66            .28       -.16
 7. I feel that this course is like a family         .59            .33       -.08
12. I feel that this course results in only
    modest learning                                  .53           -.37       -.20
17. I feel uncertain about others in this
    course                                           .09            .01        .71
 9. I feel isolated in this course                   .02            .18        .70
 8. I feel uneasy exposing gaps in my
    understanding                                   -.34            .15        .69
 4. I feel that it is hard to get help when
    I have a question                                .06            .30        .62
Eigenvalue                                          6.15           3.71       1.35
% of Variance Explained                            30.75          18.57       6.75
Cronbach’s Alpha                                     .90            .77        .70
Note: Item 6 did not account for salient factor loadings on interpretable factors.
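The "% of Variance Explained" row appears consistent with the common convention of dividing each
eigenvalue by the number of items analyzed (20 for the full CCS); this is an inference from the numbers
rather than a statement made by the authors:

\frac{6.15}{20}\times 100 = 30.75\%, \qquad \frac{3.71}{20}\times 100 \approx 18.6\%, \qquad \frac{1.35}{20}\times 100 = 6.75\%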
Discussion
The study results indicated that the CCS has a satisfactory internal consistency for the whole scale, but
not for the two subscales (connectedness and learning). Further, the factor analysis demonstrated a three-factor
structure rather than the two-factor one suggested by Rovai (2002). Rovai’s field studies suggested
that connectedness and learning can be identified as two independent factors, but the results of this study
showed that these two attributes could be integrated into one dimension.
As Table 1 displays, items 19, 10, 18, 20, 16, 13, 14, 11, 7, and 12 account for a salient factor
that is defined as interdependence in learning, which refers to the combination of trust, shared needs (in
this case, learning), and exchange of influence. McMillan and Chavis (1986) define psychological
sense of community as including four main elements: membership, influence, integration and fulfillment of
needs, and shared emotional connection. In this study, the factor of interdependence in learning can be
interpreted as the dynamics between two sense-of-community elements – influence and needs. Items 3, 1,
2, 5, and 15 account for another factor, called emotional support. In this study, emotional support means
not only emotional connection as described by McMillan and Chavis (1986), but also mental support
between peers (e.g., peer encouragement, caring). Finally, items 17, 9, 8, and 4 represent the third factor –
sense of membership, which means that students in a classroom community should first have “emotional
safety,” a willingness to reveal how one really feels, and then develop a “sense of belonging.” In McMillan
and Chavis’s theory, both emotional safety and sense of belonging are indices of sense of membership.
In conclusion, we believe that the Classroom Community Scale can be used as a useful evaluation
tool by educational practitioners in future learning community development in both online and offline
learning settings. The whole scale has a satisfactory internal consistency, and its factor structure concurs with
the constructs defined by previous community theory. However, it should be noted that one item (item 6)
did not account for any salient factor and hence needs to be dropped. Additionally, we suggest that more items
accounting for emotional support and sense of membership be added in the future development of
the CCS.
References
Anderson, B. (2004). Dimensions of learning and support in an online community. Open Learning, 19(2),
183-190.
Blignaut, S., & Trollip, S. R. (2003). Developing a taxonomy of faculty participation in asynchronous
learning environments: An exploratory investigation. Computers & Education, 41(2), 149-172.
Brook, C. & Oliver, R. (2003). Online learning communities: Investigating a design framework. Australian
Journal of Educational Technology, 19(2), 139-160.
Brown, J. S. & Duguid, P. (2000). The social life of information. Boston: Harvard Business School Press.
Lave, J. & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge:
Cambridge University Press.
McMillan, D. W. & Chavis, D. M. (1986). Sense of community: A definition and theory. American
Journal of Community Psychology, 14(1), 6-23.
Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). New York: McGraw-Hill.
Palloff, R. & Pratt, K. (1999). Building learning communities in cyberspace: Effective strategies for the
online classroom. San Francisco: Jossey-Bass.
Rovai, A. P. (2002). Development of an instrument to measure classroom community. Internet and Higher
Education, 5, 197-211.
Index
A F
Adams, Hope 393 Fellows, Geoff 288
Akdemir, Omur. 1 Freehling, Seth 37
Atkinson, Robert 7 Friesen, Steve 105
Avgerinou, Maria 18
Aydin, Cengiz 29, 439, 496
G
B Green, Marybeth 162
Baek, Eun-Ok 37
Bakar, Aysegul 54 H
Baran, Bahar 46, 54 Hanlon, Kathleen 18
Baumbach, Donna 488 Hay, Lyn 288
Beabout, Brian 270 Hedberg, John 322
Belland, Brian 151, 380 Hew, Khe 115
Benton, Denise 472 Hinojosa, Juan 374, 418
Blackman, Jay 380 Hsieh, Yi-Chuan 331
Brantley-Dias, Laurie 61 Hu, Haihong 174, 184
Brown, Lynette 71 Huang, Xiaoxia 71
Bulu, Saniye 78 Hung, David 322
Bulu, Sanser 78
I
C
Iliff, John 132
Cagiltay, Kursat 46, 54 Ionas, Gelu 105
Çalışkan, Hasan 99
Çalışkan, Hasan 91
Camin, Denise 151
Carroll, Margaret 18 J
Cernusca, Dan 105
Chang, Shujen 113 Jiang, Bai-Chuan 454
Chen, Huei-Yu 249 Jo, Il-hyun 316
Cheung, Wing 115 Jo, Il-Hyun 213
Ching, Yu-Hui 227 Johnson, Tristan 71
Choi, Hee 123 Jonassen, David 105
Choy, Doris 132
Cifuentes, Lauren 162 K
Cochenour, John 255
Connolly, Patrick 151 Kang, Haijun 220
Coulthard, Glen 151 Ke, Fengfeng 227, 501
Kealy, William 410
Keller, John 236, 249
D Khalil, Mohammed 71
Deimann, Markus 236 Kilic, Eylem 54
Dwyer, Francis 227 Kim, Aaron 244
Kim, Kioh 255
Kim, KyoungNa 270
E Kinuthia, Wanjira 61
Klein, James 445
Eastmond, Daniel 137 Koszalka, Tiffany 266
Ennis, Eleanor 339 Koszalka, Tiffany. 1
Ertmer, Peg 380 Kratcoski, Annette 422, 428
Ertmer, Peggy 144, 151
Eslami-Rasekh, Zohreh 331
Eustace, Ken 288 L
Land, Susan 270
Lee, Hye-Jung 277
Lee, Hyeon Woo 316
Lee, Mark 288 Shoffner, Mary 61
Lee, Miyoung 71 Simons, Krista 380
Lee, Youngmin 71 Simonson, Michael 137
Lei, Kimfong 151 Slate, John 399
Li, Lan 309 Smith, Brian 270
Lim, Kyu Yon 316 Spears, Cameron 410
Lim, Wei-Ying 322 Spelman, Maureen 18
Lin, Huifen 227 Steckelberg, Allen 309
Lin, Yi Mei 428 Sullivan, Howard 445
Liu, Chia-Ning 331 Sullivan, Michael 374, 418
Liu, Zhu 236 Swan, Karen 422, 428
M T
Merrill-Lusk, Mary 7 Tao, Yedong 488
Miyamoto, Masaru 343 Tasci, Deniz 439
Mohammed, Melissa 339 Tutty, Jeremy 445
Mong, Christopher 151
Motes, Gregory 410
Mukerji, Kerstin 266 U
Unger, Darlene 422
N
Nakatani, Momoko 343 V
Newby, Timothy 367 van ‘t Hooft, Mark 422, 428
Nkonge, Bessie 349
W
O
Wang, Ling 454
O’Connor, Debra 71 Ward, Rosalie 461
O’Neill, Eunhee 356 Westhoff, Guy 255
Ottenbreit-Leftwich, Anne 144, 367 Williams, Douglas 472
Ozaydemir, Nebiye 99 Woo, Jeong-Won 277
P X
Pan, Cheng-Chang 374, 418 Xiao, Judy 132
Park, Ji-Hye 123
Park, Sung 380
Park, Sunghyun 270 Y
Pedersen, Susan 472
Yang, Harrison 480
Yao, Yuanming 488
R Yeo, Jennifer 322
Yilmaz, Ulku 496
Reigeluth, Charles 387 Yonemura, Shunichi 343
Rezabek, Landra 255 York, Cindy 144, 367
Richardson, Jennifer 151, 367 Yu, Huang-Chih 501
S Z
Schaffer, Scott 380 Zygouris-Coe, Vassiliki 488
Schenker, Jason 428
Schlosser, Charles 137
Shen, Zuejun 393