Effects of Computer Troubleshooting on Elementary Students' Problem Solving Skills
ScholarWorks at WMU
8-2003
Part of the Elementary Education Commons, and the Instructional Media Design Commons
Recommended Citation
Ottenbreit, Anne Todd, "Effects of Computer Troubleshooting on Elementary Students' Problem Solving
Skills" (2003). Master's Theses. 3372.
https://scholarworks.wmich.edu/masters_theses/3372
by Anne Todd Ottenbreit
A Thesis
Submitted to the
Faculty of The Graduate College
in partial fulfillment of the
requirements for the
Degree of Master of Arts
Department of Educational Studies
The lack of problem solving skills exhibited by students has generated concerns
at national and state levels of education (Coleman, King, and Ruth, 2001). If the
transfer and could be easily incorporated into the educational technology curriculum. The
and 'practice and drill' software. The proposed curriculum would be a new method to
meet national technology and math standards of education. Positive outcomes of the
The process of my master's degree was far more difficult than I had
and very appreciated all the way through the process. I have benefited educationally
from having Dr. Poole as my advisor and chair of my committee. I am also grateful to
Dr. Bosco, who was especially helpful in challenging my thoughts and ideas in order
to produce a better product. I am also grateful to Dr. Leneway who was especially
helpful and flexible throughout the entire process.
I would also like to thank both of my parents, who offered love and support
for her endless proofreading and assistance with my master's research project. My
father encouraged me to pursue my topic of interest and assisted in the design process
with great passion. My grandparents, sister, brother and the rest of my entire family
I would also like to thank Dr. Newlin-Haus for all of her assistance with my
research design, statistical help and instruction on research methods. The HSIRB
sub-committee was extremely instrumental in shaping my research design. Mary
curriculum. Western Michigan University was extremely helpful and provided three
TABLE OF CONTENTS
ACKNOWLEDGMENTS ...................................................................................... ii
CHAPTER
Introduction ............................................................................................. 1
Background ..............................................................................................7
Summary ..................................................................................................7
Introduction ............................................................................................9
Introduction ...................................................................................14
Mathematics..................................................................................15
Introduction............................................................................................22
Introduction...................................................................................22
Curriculum ....................................................................................24
A+ Curriculum ..............................................................................30
Subjects..................................................................................................33
Variables................................................................................................ 35
Dependent Variables..................................................................... 35
Treatment .........................................................................35
Attendance ............................................................ 35
Gender................................................................... 36
Testing Procedures............................................................ 38
Survey ...............................................................................47
Troubleshooting Activity..................................................49
Data Analysis.........................................................................................50
Hypothesis 1 .................................................................................50
Hypothesis 2 .................................................................................51
Hypothesis 3 .................................................................................51
Hypothesis 4 .................................................................................52
Hypothesis 5 .................................................................................52
Hypothesis 6 .................................................................................52
Hypothesis 7 .................................................................................53
Summary ................................................................................................53
FINDINGS ..............................................................................................................55
Introduction............................................................................................55
Female ...........................................................................................58
Student 2 Experimental.....................................................60
Student 4 Experimental.....................................................64
Student 6 Experimental.....................................................68
Male .............................................................................................. 70
Student 8 Experimental.....................................................72
Student 9 Control.............................................................. 74
Student 10 Experimental...................................................76
Student 11 Control............................................................78
Student 12 Experimental...................................................80
Hypothesis 1 ...................................................................................81
Troubleshooting Activity..................................................82
Station #1 ....................................................................83
Station #2 ....................................................................84
Station #3 ....................................................................85
Station #4 ....................................................................85
Station #5 ....................................................................87
Station #6 ....................................................................87
Station #7 ....................................................................88
Hypothesis 2 ....................................................................................91
Hypothesis 3 ....................................................................................98
Survey/Group Interview....................................................99
Hypothesis 4 ..................................................................................101
Hypothesis 5 ..................................................................................102
Hypothesis 7 ..................................................................................120
Hypothesis 1 ...................................................................136
Hypothesis 3 ...................................................................139
Hypothesis 4 ...................................................................139
Hypothesis 5 ...................................................................140
Hypothesis 6 ...................................................................141
Hypothesis 7 ...................................................................143
Conclusions ..........................................................................................144
BIBLIOGRAPHY ................................................................................................159
APPENDICES
H. Interview.............................................................................................. 180
LIST OF TABLES
LIST OF FIGURES
CHAPTER 1
Introduction
The lack of higher level thinking skills used by students has become an area of
concern at national and state levels of education (Coleman, King, and Ruth, 2001).
ability in elementary students (MacPherson, 1998). The purpose of the research study
was to determine whether computer troubleshooting training conducted by the
researcher had the ability to affect elementary students' problem solving ability.
and 'practice and drill' software. Computer troubleshooting and technology have
lacked integration at the elementary level (Poris, 2000). Evidence acquired from this
research will lend credence to the possible incorporation of this type of training
program.
General Statement of the Problem
The development of problem solving skills is essential for the future. The importance of problem solving is illustrated in
the quote, "Give a man a fish; you have fed him for today. Teach a man to fish, and
you have fed him for a lifetime" (Author unknown). By teaching students how to
solve problems as opposed to supplying them with content knowledge, students will
be able to successfully solve life problems, and therefore be successful in life (Casey
Research Hypothesis
and team worksheets were evaluated to establish whether students were able to solve
This hypothesis was evaluated through the comparison of the POPS Methods
Used section pre-test and post-test results. The hands-on problem solving test was
What aspect of problem solving is the most difficult for elementary students?
The most difficult procedure in problem solving for elementary students will
be identified through a review of literature and articles. Data was collected through the Profiles of
Problem Solving test and the surveys/interviews in order to analyze the most difficult
procedure. The standardized problem solving test, POPS, divided the evaluation into
five categories, which were compared to find the weakest area. Data was also collected from the
The IOWA Basic Skills Test and the Group Interview were used to assess
mathematical problem solving ability. Comparisons between the pre-test and
post-test IOWA scores of each group were analyzed. Responses from the group
interview were also analyzed.
Gender differences were compared through the POPS total scores, individual
categories within the POPS test and the hands-on problem solving test. Comparisons
between genders and between the experimental and control groups of genders were
examined.
Does teacher-rated problem solving ability impact the problem solving ability
Students in the experimental group and control group were matched based on
teacher-rated problem solving ability. When students were divided into teacher-rated
problem solving ability groups, the groups were compared through the POPS total scores,
individual categories within the POPS test and the hands-on problem solving test.
will demonstrate greater improvements in problem solving ability than students who
The results from the POPS test, the hands-on problem solving test and the
experimental group achieved higher results in the post assessment than the students in
the control group. Data was also collected and analyzed from each section of the
POPS test.
LOGO programming have the ability to develop problem solving skills and improve
academic achievement (Kurland, 1986). This project used computer repair and
through computer troubleshooting and repair training. With this research, elementary
schools may incorporate the training program into the educational technology
program in order to enhance the learning. This program has the potential to inform
problem solving skills. The training session, if proved effective, could provide a new
At the time of this study, no research on the proposed topic had been
found. Computer troubleshooting and repair had been attempted at the high school
level with great success, but no relationship to problem solving skills had been tested
or proposed. Given today's technical focus, students need to have the
ability to solve everyday problems with technology. The researcher believes that
two separate parts of the curriculum; the students would be improving their problem
solving skills, while preparing for the future and learning an authentic, meaningful
lesson. Therefore, the research would establish baseline data for instructional
Summary
The lack of problem solving skills exhibited by students has generated concerns
at national and state levels of education (Coleman, King, and Ruth, 2001). If the
transfer and could be easily incorporated into the educational technology curriculum. The
REVIEW OF LITERATURE
Introduction
brings into play your inventive faculties, and if you solve it by your
own means, you may experience the tension and enjoy the triumph of
discovery. Such experiences at a susceptible age may create a taste for
mental work and leave their imprint on mind and character for a
lifetime. Problem
solving skills allow students the opportunity to solve problems that arise in everyday
curriculum because of their application towards success in life (Jonassen, 2000; Lee
individual in society and educational institutions are responsible for this preparation
(Lee, 1996).
every individual. Coleman, King, and Ruth state "by not challenging students, nor
encouraging them to use higher order thinking skills, educators underestimate their
students' abilities and delay meaningful grade-level work, as well [as] deprive them of a
significant environment for learning" (2001, pp. 9-10). Currently, students are not
learning to read and scientific discovery. Most schools, however, make little attempt
to provide students with the assistance they need to learn a broad range of problem
solving strategies. Instead, most schools tend to use formal training in problem
solving. The method most teachers use to integrate problem solving into the
curriculum has been mathematics. The National Council of Teachers of
Mathematics (1989) presents problem solving as one of the most vital skills for
students. The New Jersey Board of Education believes "problem posing and problem
solving involve examining situations that arise in mathematics and other disciplines and
will come to realize the potential usefulness of mathematics in their lives" (as
reported in Poris, 2000, p.1). Problem solving in mathematics should not be limited
to traditional word problems, but should be taught through methods of inquiry and
application in order to expose the students to the multiple facets of problem solving.
Mathematical problem solving follows the same requirements as basic problem
solving: understanding the problem, devising a strategy or plan, carrying out the plan,
and reflecting on the process. As reported in
multiple studies, when students are able to perfect this process in mathematics, they
will be able to apply the same process to other problems with success (Poris, 2000).
parents, educators, business leaders and university professors. The frameworks listed
2003).
problems that interest, challenge, and engage them in thinking about important
mathematics. The NCTM declares problem solving as a process that should develop
from mathematics and provide a framework in which concepts and skills are learned
(NCTM, 2000). The NCTM website presents various problem solving activities that
are educationally appropriate for fifth grade students and document the processes the
students use when attempting to solve a problem. The problem solving methods that will be utilized in
this research project consist of four basic steps: (a) understanding the problem; (b)
planning a solution; (c) solving the problem and (d) looking back.
Figure 2.1
The Problem Solving Processes Used as Standard for the Purposes of this Research
Study
Strategies may vary in name, however, most fall into one of the following
basic categories: (a) compute or simplify, (b) use a formula, (c) make a model or
diagram, (d) make a table, chart or list, (e) guess, check and revise, (f) consider a
simpler case, (g) eliminate and (h) look for patterns (Math Counts, 1999).
problems. Whether the problem is solved with mental abilities or external physical
Bodner and Domin, 2000). The ancient Chinese philosopher Lao Tzu illustrates the
influence of hands-on learning on the learning process through the statement, "What I
Ill-structured problems occur on a regular basis and are the most difficult to prepare students to solve. The solutions are
neither easy nor predictable, requiring the individual to use multiple processes in
order to resolve them. The best way to prepare students for ill-structured problems is
to equip them with problem solving skills through the practice of well-structured
problems. Students need to utilize, implement and apply problem solving strategies
and content knowledge in order to solve familiar everyday problems (Howard, McGee
& Shin, 2001). Real world situations allow students to develop a profound
(Howard, McGee & Shin, 2001). Through the provision of real-world learning
valuing and often a desire to learn more" (Howard, McGee & Shin, 2001, p. 52).
Introduction
abilities in problem solving, and in specific subject areas. Problem solving has been
applied across multiple subject areas.
Scientific Inquiry
studies, students were required to evaluate the problem, brainstorm possible solutions
and solve the problem. Through questioning, immediate feedback and investigation,
students are able to form their own knowledge on the topic by solving the problem
Mathematics
multiple levels. When mathematical problem solving was tested for transfer effects
on other subject areas, computer programming and strategy games were found to be
effective.
Reading Recovery
Reading Recovery is a program designed to provide students with the ability to solve their own reading difficulties through a
heuristic similar to problem solving. Studies using Reading Recovery have shown
success in reading with the problem solving heuristic (Wayne & Johnstone, 1997).
1992). Studies have also shown that computer problem-based multimedia software
increases understanding and the use of problem-based concepts (Sherwood, 2002).
Technology Education
Technology education has become increasingly prominent in today's schools, mainly due to its ability to integrate into the educational
curriculum.
troubleshooting curriculum used in the research project was designed to meet the
ISTE Profiles for Technology Literate Students and the Michigan Curriculum
solutions by using resources and processes to create, maintain, and improve products,
students. The organization divided the standards by grade levels. The performance
objectives for students include: (1) basic operations and concepts, (2) social, ethical,
and human issues, (3) technology productivity tools, (4) technology communications
tools, (5) technology research tools and (6) technology problem-solving and
decision-making tools. Many of the performance objectives at the third through fifth
grade level demanded problem solving ability, indicating the importance of problem
solving. However, the majority of studies reviewed that argued against positive effects of
technology (Goldman, Cole & Syer, 1999; Chaika, 1999; Wenglinsky, 1998) lacked a focus on appropriate
educational applications and uses within the curriculum. Previously, computers and
graphical image producers. However, computers are now being seen as powerful
tools that can enhance and assist the learning process (Poris, 2000).
females and males. Computer achievement, attitudes, and anxiety were multiple areas
gender benefits more from technology (Burge, 2001; Hackbarth, 2002; Fey, 2001;
(Coleman, King, Ruth, 2001, p.10). Technology also has the ability to facilitate
questioning, feedback, reflection and revision for students when they learn
scaffolding environment where the students are teaching one another (Driscoll,
2002).
students are excited to use the technology, and therefore, look forward to learning
with technical equipment (Waetjen, 1993). Students are able to demonstrate more
effort and process material at a more meaningful level when they are interested and
believe that they have the ability to solve the problem. Computer repair and
troubleshooting allow students to interact with the information and receive immediate feedback (Jonassen,
2000).
become synonymous with problem solving due to the constant problems associated
problem solving. Both troubleshooting and problem solving require documenting the
the development of the individual's own strategies. In most technical problems, the
initial problem and desired result are easily established, but the solutions to achieve
the end result are often difficult and numerous (Lee, 1996). Students frequently
overlook the important steps in problem solving. Technical problems often require
these important steps, such as multiple solutions and complete analysis of the
problem (Lavonen, Meisalo & Lattu, 2001). Technology troubleshooting requires the
ability to diagnose problems and to test out the possible solutions. Teaching problem
solving skills through technology education will enable the student to achieve
troubleshooting. Jonassen believes " ... troubleshooting is among the most common
forms of everyday problem solving" and through a computer troubleshooting
curriculum, students could learn these two skills at once (2000, p.73).
CHAPTER III
METHODOLOGY
Introduction
design. The whole study lasted two school weeks, including all testing periods. The
effect on problem solving skills. Each group included three boys and three girls.
The pre-assessment included two standardized tests, a hands-on problem solving
test, and a survey collecting information from the students
on problem solving skills, attitudes and math abilities. Following the two days of
pre-testing, the experimental group attended the training sessions, which were held
for forty-five minutes in the morning before school over the course
of five days. The control group received no training. Following the training, the
students from the control and experimental group were post-tested. The
post-assessment included two standardized tests, the hands-on problem solving test and a
group interview modeled from the survey. The pre-tests and post-tests were
compared through statistical analysis, graphs and tables, as well as focusing on each
Research Setting
Introduction
will be analyzed at Western Michigan University; the location of the actual study is at
Graduate Assistant EDT 347, Master's Student), who has used the research in
conjunction with her master's thesis and Sharon Ottenbreit (Dearborn Public
compliance through the Dearborn Public Schools. The other expert mentioned later
on, Joel Ottenbreit (Ann Arbor Public Schools), has knowledge of the A+ curriculum
processes.
status community. The area consists mainly of white individuals; however, a large
majority of the students within the white demographic are of Arabic descent.
Figure 3.1
Dearborn Demographics
income for families in Dearborn is $53,060. According to the census survey taken in
1999, twelve percent of the families in Dearborn are below the poverty level (United
School Information
Dearborn Public School District. The Dearborn School district is a large district
kindergarten through fifth grade. Four percent of students who attend Haigh
Student Ethnicity
Curriculum
All fifth grade students at Haigh currently use one standard mathematics
curriculum and one standard technology curriculum. The technology curriculum was
The mathematics curriculum currently uses Mathematics Plus as the standard format
Technology Curriculum
regarding the strengths of the current technology education curriculum. More than
majority (87%) of the students responded that they enjoyed using computers a lot.
According to the teachers, the usage of the computer labs greatly increased from the
original survey in 1992 (45%) to the most current survey in 2001 (80%). Students use
school computers for educational purposes such as drawing, painting, writing stories
and reports, educational programs, encyclopedia work, the Internet, games, and for
Math Curriculum
problems based on the fifth grade math book MATHEMATICS PLUS, by Harcourt
Brace and Company. The current math problem solving curriculum is comprised of
multi-step problems, relevant and irrelevant information and evaluating answers for
reasonableness.
KNOWLEDGE
• Verify students' ability to start and quit applications
• Verify students' ability to log onto file server
• Verify students' ability to appropriately care for disks/CD's
• Reinforce students' ability to perform a warm start
• Reinforce students' ability to check cords and cables
• Develop students' ability to select hardware/software applications for tasks
• Develop students' ability to describe how technology meets human needs in the home,
school, community and workplace
• Develop students' ability to describe how people create, use and control technology
• Develop students' ability to identify technology related careers
• Develop students' ability to describe advances in technology and their impact on society
• Develop students' ability to identify the computer hardware components
• Explore students' ability to use multimedia software
• Verify students' ability to understand what the CPU, monitor, keyboard, mouse and data
storage drive are
• Reinforce students' ability to use the printer
• Reinforce students' awareness of the network versus the stand alone computer
• Develop students' ability to understand/use special keys (ESC, CTRL, etc.)
• Develop students' ability to use touch typing method
• Develop students' ability to know and use icons
• Master students' ability to understand and use menus, function keys and buttons
PROBLEM SOLVING
• Develop students' ability to identify problems; find ways in which computers can solve
problems
APPLICATION
Word Processing
• Develop students' ability to manipulate font (size, style, etc.)
• Reinforce students' ability to print
• Reinforce students' ability to use spell check, and thesaurus
Internet
• Explore students' ability to use a browser
• Explore students' ability to enter a site location
• Explore students' ability to retrieve electronic information
• Explore students' ability to cite internet references
• Explore students' ability to demonstrate responsible use of the internet by adhering to the
district's Internet Usage Policy
• Explore students' ability to use a search engine
GENERAL SKILLS
• Develop keyboarding skills
• Verify students' ability to point/click
• Master/Verify students' ability to click and drag
• Develop students' ability to insert/delete
• Develop students' ability to save
• Develop students' ability to copy, cut and paste
SOCIAL AND ETHICAL ISSUES
• Develop students' ability to describe the impact of technology on daily lives
• Develop students' ability to identify ways various technology is used in the home,
school, community and workplace
• Develop students' ability to respect privacy and ownership of individual or organization
information or product
• Develop students' ability to articulate that individuals are responsible for their
technological actions and decisions
• Introduce students' ability to adhere to copyright, patent, and freedom of information
laws related to using technology
• Develop students' ability to describe how technology impacts information, information
access, analysis, organization and utilization
students must learn. First, students learned how to use a heuristic, which is a guide
for thinking. The textbook promotes the following process: (a) understand the
problem, (b) plan a solution, (c) solve the problem, and (d) look back. The
textbook also presents strategies students may use: (a) use tables, charts, and
graphs; (b) make a list; (c) guess and check; (d) find a pattern; (e) draw and use
pictures; (f) make and use models; (g) write a number sentence/equation; (h) work
backward; and (i) solve a simpler problem. The textbook provides opportunities for students to
apply the heuristic guide they have learned. The Mathematics Plus textbook includes
a problem of the day in every lesson, asking students to apply different skills for
Research Design
The research design measured students' ability to problem
solve and academic achievement prior to and following the computer troubleshooting
training, using the Profiles of Problem-Solving Standardized Test (POPS), the IOWA
Basic Skills Math Test, the hands-on problem solving test and a group interview
following the training sessions.
The students' abilities to solve computer problems were also analyzed. This study
was conducted using six students for the experimental group and six students for the
control group, in a pre-test/post-test research design with a control and experimental
group. The experimental group
received the treatment and both groups were pre-tested prior to the training, and
post-tested after the training was complete. The entire project protocol took place at Haigh
Elementary School over the course of two weeks. The researcher chose to perform
the study over the course of two weeks in order to reduce the external validity issues
associated with the effects of information they gained from school. Subject
recruitment started after the HSIRB approval, and baseline data
collection was initiated on May 19th, 2003. A timeline of the training sessions is
Figure 3.4
Week 1:
Day 1: Pre-Testing
Day 2: Pre-Testing/Getting to Know You and the Computer
Day 3: Lesson 1: Outer Hardware, Intro to Hardware on the Inside
Day 4: Lesson 2: Hardware on the Inside
Day 5: Lesson 3: Storage, Files and Folders, The Windows Desktop
Week 2:
Day 1: Lesson 4: Knowing Your System, Programs, Operating Systems, Computer Care and
Safety
Day 2: Lesson 5: Troubleshooting Real Problems
Day 3: Post-Testing
Day 4: Post-Testing
The primary purpose of the computer training sessions was to provide students
with computer troubleshooting knowledge and skills.
The experimental group received the training for forty-five minutes, over the
course of two weeks on May 21st, 22nd, 23rd, 27th and 28th. The curriculum was
A+ Curriculum
maintenance and basic networking. A+ certification provides the perfect outline for
a computer troubleshooting curriculum (CompTIA, 2002).
The training sessions were designed to mimic the CompTIA manual, but
the program was modified to meet the needs of elementary students. The training
sessions taught students how to troubleshoot basic problems associated with
hardware and software, and other functions associated with the computer. These
learning objectives assisted in the design of the training sessions and are listed
in Figure 3.5.
Figure 3.5
• Recognize and be able to state the name and purpose of each hardware element as
listed below
Motherboard
Power Supply
Processor /CPU
Memory
Storage devices
Monitor
Modem
Peripheral
BIOS
CMOS
LCD (portable systems)
Ports
PDA (Personal Digital Assistant)
Kids Domain.Com offers a large variety of child friendly activities and lesson
plans. The explanation, organization and presentation of the material was targeted
toward children. Material for the training sessions was adapted from the Kids Domain website. The researcher
contacted the company for permission to use the material, and was granted permission
by the company through email. The email documentation can be found in Appendix
V.
Instructional Method
The training sessions were taught through authentic learning
situations. These authentic learning situations supplied students with the opportunity
to apply their new skills and solutions, while receiving immediate feedback. The
students were given handouts and other supplementary learning tools to enhance
learning within the curriculum. The students were instructed through individual, team
and whole-group formats; problems and different solutions were presented to the whole group. Teams of two
students were implemented for all hardware and software installations in order to
facilitate questioning and answering between pairs. This configuration was able to
fully utilize the equipment to produce the maximum learning experience for the
students.
Subjects
permission slip and parent consent form. Students who had not completed these
documents were not eligible to participate. The students who volunteered to
participate in the training were selected using specific criteria. The first
disaggregate was gender, due to the vast differences in computer and technical ability
found in previous research in relation to gender (Fey, 2001; Frantom, Green &
Hoffman, 2002; Suomala & Alajaaski, 2002). The researcher first selected three girls
and three boys for the experimental group and three girls and three boys for the
control group. The second disaggregate was based on the students' school attendance
record over the past year. Students with a good attendance record were the most
desired since absences and tardiness could have affected the end results of the project.
The third disaggregate was the level of problem solving abilities. The researcher had
the fifth grade teachers rank the students on their problem solving ability: low,
medium, and high. The researcher did not attempt to collect any particular scores;
rather, the focus of the level of problem solving ability was used to match students
between the experimental and control groups. To ensure the lack of favoritism or
perception of favoritism in selecting students for the experimental versus the control
group, the matched subjects were randomly assigned to groups. Though recruiting a
true random sample was extremely unlikely, this procedure achieved the closest
possible approximation with the available students.
The student subjects were not individually selected, as all of the students in the fifth
grade were invited to volunteer; no participation was required. Students involved in the
control group were selected based on gender, attendance and similar problem solving
skills compared to students in the experimental group in order to decrease the possible
threats to the validity of the data.
Variables
Dependent Variables
Treatment
The training sessions were videotaped to verify that the researcher
exhibited no favoritism or other forms of bias during the sessions. The videotape
was viewed and evaluated at a later time by an A+ expert, Joel Ottenbreit. The expert
reviewed the sessions for evidence of bias.
Attendance
Attendance was a large factor in the selection of subjects. The training sessions
contained so much information that it was essential that the students attend all
training sessions and be punctual. The teachers were asked to rate the students based on
attendance and tardiness over the past semester. All students were rated based on
good attendance and number of tardies. The students rated with a high number of
tardies and absences were dismissed from the study, in order to decrease as many
variables as possible.
The researcher had the fifth grade teachers rate the students' problem solving
abilities as (a) high, (b) medium, or (c) low. Once subjects with multiple tardies and
absences had been dismissed from the project and the research subjects had been
selected, the remaining students were matched by their ability ratings.
Gender
The researcher selected students based on gender due to the large differences
between boys and girls in learning and computer activities. The researcher was able
to select three girls and three boys for each group in order to balance the groups for
comparison purposes. The information was collected from the permission slips
returned by the students who wanted to participate in the study. The researcher
randomly selected three pairs of matched problem solving ability girls and three pairs
of matched problem solving ability boys and placed one student from each pair in the
experimental group and the other in the control group.
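As an illustration of this matched-pair assignment, the sketch below uses hypothetical student labels and ratings (the study itself performed the matching by hand from permission slips and teacher ratings); only the overall procedure, pairing by gender and rated ability and then randomly splitting each pair, follows the thesis.

import random

# Hypothetical volunteers: (label, gender, teacher-rated problem solving ability).
volunteers = [
    ("S1", "F", "high"),   ("S2", "F", "high"),
    ("S3", "F", "medium"), ("S4", "F", "medium"),
    ("S5", "F", "low"),    ("S6", "F", "low"),
    ("S7", "M", "high"),   ("S8", "M", "high"),
    ("S9", "M", "medium"), ("S10", "M", "medium"),
    ("S11", "M", "low"),   ("S12", "M", "low"),
]

# Group volunteers into matched pairs by (gender, rated ability).
pairs = {}
for label, gender, ability in volunteers:
    pairs.setdefault((gender, ability), []).append(label)

# Randomly place one member of each matched pair in each group.
experimental, control = [], []
for members in pairs.values():
    random.shuffle(members)
    experimental.append(members[0])
    control.append(members[1])

print("Experimental group:", experimental)
print("Control group:", control)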
Figure 3.6
POPS Test Results Compared Between the POPS Administrative Test and the Computer Troubleshooting Study Students: mean scores from the POPS Administrative Test compared with the computer troubleshooting study students' mean pre-test scores.
Compared with the POPS Administrative Test, performed by the testing
company with 371 subjects, the subjects involved in this research project scored lower in the Correctness of
Answer, Methods Used and Extracting Information sections. However, they scored
higher, on average, in the Accuracy and Quality of Explanation sections, than their
peers involved in the POPS Administrative Test. However, most differences were not
substantial, and therefore, the students chosen were representative of their peers based on
these measures.
There were multiple independent variables which affected the results of this
project. For each method of assessment, the students received a coded test cover sheet
indicating the student's name, code name and method of assessment (See Appendix
r). These measures were for organizational purposes only. The tests and other forms
of assessment were identifiable only by code numbers. The master code list was only
Testing Procedures
Before every test or training, students were read the student assent form,
indicating they would receive no extra credit, and even if they agreed to participate
they could change their minds at any time throughout the testing or training (See
Appendix B). They were also reminded that they were volunteers and were free to
stop participating whenever they chose without any penalties for quitting.
Once the method of assessment was completed, the students were instructed to
notify the researcher. The researcher would then place the answer sheet and test
booklet into a manila folder. The students who finished were given the next
assessment.
The students completed the IOWA Basic Skills Math Problem Solving and
Data Interpretation twenty-six item section of the IOWA test. The mathematical
problem solving test took approximately thirty minutes
to complete and was conducted in Room 4 of Haigh Elementary School under the
supervision of a Dearborn Public School teacher. The teacher, as well as the
researcher, was available to help read the items and answer questions. The researcher
read the assent form to the students before testing. To ensure confidentiality, a cover
sheet was included listing the student's name and ID number. The student's converted
ID number was the only form of identification on all assessments. The converted ID
number was randomly assigned by the researcher. The code sheet containing the
student's name, student identification number and the test score will be kept in a
locked file cabinet in a university office that only researchers will have access to. The
researcher explained each test's instructions, asked the students to complete the
standardized tests and raise their hands when finished. As each individual completed
the first test, the researcher distributed the second standardized test, POPS, with the
code sheet and explained the instructions individually. Once the tests were
completed, the assessment was placed in the corresponding folder for organizational
purposes. The second problem solving test, POPS, took approximately thirty minutes
to complete and was administered in the same manner on the same day. Due to time
constraints, the students were able to take two days for pre-testing and two days for
post-testing. The tests were collected in the same manner as the first standardized
test. Throughout the testing period, students were randomly asked to take the
hands-on problem solving test, which was videotaped in the back of the classroom. Once
the students had completed all three tests, the students were given a self-assessment
survey. However, due to the time constraints of two days for pre-assessment and two
days for post-assessment, all students were not able to finish the survey.
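As an illustration of the coding scheme described above, the sketch below generates a master code list mapping student names to randomly assigned ID numbers (the names are invented); only the master list links names to codes, so assessments marked with the code alone remain confidential.

import random

# Hypothetical participant names; the real list came from permission slips.
students = ["Student A", "Student B", "Student C", "Student D"]

# Draw unique random ID numbers, one per student.
ids = random.sample(range(1000, 9999), k=len(students))

# The master code list is the only record linking names to IDs;
# assessments themselves are labeled with the ID number alone.
master_code_list = dict(zip(students, ids))

for name, code in master_code_list.items():
    print(f"{name} -> {code}")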
ETS Services
The researcher used two of the five evaluation materials from the ETS Test
Collection. The Educational Testing Service Test Collection Database allows access
to copies of a large number of tests and related materials.
The IOWA Test of Basic Skills Math Problem Solving and Data Interpretation
section was administered first; the experimental group and control group both were
pre-tested before school in Room 4.
The students were given instructions from the IOWA Test of Basic Skills Teacher's
Manual. The test took approximately thirty minutes for each student to complete,
choice standardized testing answer sheet. The answer sheets were collected and
scored by the researcher using the IOWA Test of Basic Skills answer key. The data
addition, the data will be part of the formative evaluation process to assess
The researcher chose to use the IOWA Basic Skills Math Test to measure the
students' mathematical ability, due to the recommendation obtained
from the Mental Measurements Yearbook Review Online (Figure 3.8). The
researcher only used the Math Problem Solving and Data Interpretation section of the
test, in order to keep testing time to a minimum. Brief information concerning the
IOWA test is located below, but a full description, including reviews, is located in
Appendix M.
Figure 3.7
PLEASE READ THESE TERMS OF USE CAREFULLY BEFORE USING THE TEST
COLLECTION DATABASE. BY USING THE DATABASE, YOU AGREE TO THESE
TERMS OF USE. IF YOU DO NOT AGREE TO THESE TERMS OF USE, PLEASE DO
NOT USE THE DATABASE.
"The ETS Test Collection provides microfiche copies of certain unpublished test as a service to
educators and psychologists. It is hoped that these materials will provide users with creative ideas
for the development of their own instruments, or, in some instances, with measures of attributes for
which no published tests are available.
The materials included on the microfiche may be reproduced by the purchaser for his own use until
otherwise notified by ETS or the author. Permission to use these materials in any other manner must
be obtained directly from the author. This includes modifying or adapting the materials, and selling
or distributing them to others. Any copyright notice or credit lines must be reproduced exactly as
provided on the original.
Typically, the tests included in this service have not been subjected to the intensive investigation
usually associated with commercially published tests. As a consequence, inclusion of a test does not
imply any judgment by ETS of the quality or usefulness of the instrument. The purchaser must
assume full responsibility for controlling access to these materials, the manner in which they are
used, and the interpretation of data derived from their application.
It is recommended that access to these microfiche be limited to students conducting research, staff
members of professionally recognized educational and psychological institutions or organizations,
and individuals who are members of the American Educational Research Association, the American
Psychological Association, the National Council on Measurement in Education, or the Association
of Measurement and Evaluation in Guidance. The qualifications of others not in these categories
should receive careful consideration.
Finally, the purchaser is urged to provide information about his use of these materials directly to the
authors. Many cooperating authors are interested in collecting data on their instruments, which will
make them more useful to others. Therefore, it is to the advantage of everyone concerned - authors,
present users, and users in the future - that purchasers recognize their professional responsibility to
initiate such communication. The address of the author of each instrument as of the date on which
the series is released is listed on this notice that appears first on each download test."
(http://testcollection.ets.org/cgi/swebmnu.exe?ini=TESTCOLL&act=3&lang=&uid=public&idck=&eid=&tid=8955229401-0)
The second test was the Profiles of Problem Solving (POPS)
Standardized Test. Students began this test after completing the IOWA Basic Math
Skills Test and were given individualized instructions for the test from the researcher.
The 6-question test took approximately 20-25 minutes for each student to complete
and was given in the same room as the IOWA Basic Math Skills. The students
recorded their answers on an answer sheet, explaining their answers with drawing,
words and number sentences. The answer sheets were collected and scored by the
researcher.
Figure 3.8
A. Purpose
"To provide a comprehensive assessment of student progress in the basic skills."
B. Population
Grades K.1-1.5, K.8-1.9, 1.7-2.6, 2.5-3.5, 3, 4, 5, 6, 7, 8-9; Levels 5, 6, 7, 8, 9, 10, 11, 12,
13, 14.
C. Scores
Vocabulary, Listening, Language, Language Total, Mathematics, Core Total, Word Analysis
(optional), Mathematics Advanced Skills, Mathematics Total, Reading Advanced Skills,
Reading Total, Reading, Listening Language, Mathematics Concepts, Mathematics
Problems, Mathematics Computation [optional], Social Studies, Science, Sources of
Information, Composite, Language Advanced Skills, Mathematics Advanced Skills, Survey
Battery Total, Reading Comprehension, Spelling, Capitalization, Punctuation, Usage and
Expression, Mathematics Concepts and Estimation, Mathematics Problem Solving and Data
Interpretation, Mathematics Total, Maps and Diagrams, Reference Materials, Sources of
Information Total, Composite.
D. Time
(130-310) minutes for Complete Battery; (100) minutes for Survey Battery
E. Comments
Part of Riverside's Integrated Assessment System; Braille and large-print editions available
The researcher chose the test due to the recommendation obtained from the
Mental Measurements Yearbook Review Online (Figure 3.9). Once the testing
agency was contacted in order to purchase the test, the agency informed the researcher
that the name of the test had been changed from the Surveys of Problem Solving (SPRS) to the
The POPS test divided each student evaluation into five separate categories.
The first category, Correctness of Answer, was a measure of whether the answer was
correct. The Methods Used category measured the approach used, focusing more on
the plan rather than the calculations. The more systematic the plan the student
developed, or if the student used a pattern, the higher the score. The Accuracy
category measured the ability to calculate, focusing more on the mathematical aspect
of the solution. The Extracting Information category measured the extent to
which the student understood the problem, relevant facts and relationships between
variables. Lastly, the Quality of Explanation category measured the student's ability
to communicate the problem solving process. Each category was present in multiple
questions and graded separately for each student. Every category was divided into
three possible levels, and each student was graded on the pre-test and post-test using
those levels.
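To make the scoring comparison concrete, the sketch below tabulates per-category and total gains for one student using invented point values (not data from this study); the category abbreviations match those used later in Chapter IV (COA, MU, A, EI, QE).

# POPS categories: Correctness of Answer (COA), Methods Used (MU),
# Accuracy (A), Extracting Information (EI), Quality of Explanation (QE).
CATEGORIES = ["COA", "MU", "A", "EI", "QE"]

# Hypothetical pre-test and post-test points for one student.
pre = {"COA": 3, "MU": 2, "A": 4, "EI": 2, "QE": 1}
post = {"COA": 4, "MU": 3, "A": 4, "EI": 4, "QE": 2}

# Per-category gain and total gain, as compared in the analysis chapters.
for category in CATEGORIES:
    gain = post[category] - pre[category]
    print(f"{category}: {pre[category]} -> {post[category]} (gain {gain:+d})")
print(f"Total: {sum(pre.values())} -> {sum(post.values())} "
      f"(gain {sum(post.values()) - sum(pre.values()):+d})")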
Brief information concerning the POPS test is located below (See Figure 3.9)
Throughout each testing period, students were randomly asked to take the
hands-on problem solving test, which was videotaped in the back of the classroom.
The researcher allowed students ten minutes to complete two activities. The hands-on
problem solving test was videotaped in order to provide evidence of problem solving
methods the students used. The researcher reviewed the hands-on problem solving
test at a later time, evaluating through the use of the hands-on problem solving rubric
(See Appendix J). The hands-on problem solving test was based on problem
solving activities using tangrams.
Figure 3.9
A. Purpose
"An assessment of mathematical problem solving designed for children in upper primary school".
B. Population
Grades 4-6.
C. Administration
Group
D. Time
[32] 40 minutes
The researcher established two activities for the hands-on test: an easy and a difficult
tangram problem. The students had a maximum of five minutes to complete each
problem to the best of their ability. The post-test each student was given utilized the
same tangram set as the pre-test. The book and tangrams used can be found in
Appendix O. Dr. Poole, Sharon Ottenbreit and the three fifth grade teachers assessed
the tangram problems, in order to assure the test was at an appropriate level. The test
assessed the different methods students used to solve hands-on problems requiring
manipulation, the amount of time taken to solve the problem and the number of
attempts made by the student. The researcher compared data on the methods students
used to solve problems before, as opposed to after, the training. The researcher also
compared data between the control and experimental groups to see if there was any
difference.
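The three measures captured by the rubric (number of attempts, completion time, and whether the problem was solved) lend themselves to a simple per-student record; the sketch below, with invented values, shows one way the group comparison could be tabulated.

from dataclasses import dataclass

@dataclass
class HandsOnResult:
    """One student's result on a tangram problem (pre- or post-test)."""
    attempts: int   # number of attempts at the problem
    seconds: int    # time taken, up to the five-minute (300 second) cap
    solved: bool    # whether the problem was solved correctly

def summarize(results):
    """Mean attempts, mean time, and solve rate for a group of results."""
    n = len(results)
    return (sum(r.attempts for r in results) / n,
            sum(r.seconds for r in results) / n,
            sum(r.solved for r in results) / n)

# Hypothetical pre-test and post-test results for one group on the easy problem.
pre_results = [HandsOnResult(3, 240, False), HandsOnResult(2, 180, True)]
post_results = [HandsOnResult(2, 150, True), HandsOnResult(1, 90, True)]

print("Pre  (attempts, seconds, solve rate):", summarize(pre_results))
print("Post (attempts, seconds, solve rate):", summarize(post_results))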
Survey
The survey asked students to assess their
own problem solving abilities and the methods they consciously apply to problems. The
survey provided information on how students solve problems prior to the training.
The surveys also provided a comparison for the group interviews, which followed the
training, serving as a post-test. However, due to time constraints, all of the students
could not complete the surveys. The data was part of the process to better design the
technology programs. The survey has been modified from "Student Thinking About
The survey was a last priority for pre-assessment and was given to the students
if they were able to complete all the tests within the first two days of the project.
Therefore, not all students were able to complete the surveys. The survey consisted of
ten questions. The first seven questions used the Likert scale to evaluate how
students felt about problem solving and their problem solving ability. The last three
questions were short response answers concerning methods they use to solve
problems. Surveys included the cover sheet consisting of the student's name and
assigned ID number; only the randomly assigned ID number appeared on the actual survey.
Only the researchers had access to the name/ID list. When finished with the survey,
the students returned the survey to the researcher. The sample survey, as well as the
Group Interview
constraints. The interview was originally planned as a form of post assessment to the
survey, but since all students did not complete the survey, answers were combined
from the survey and interview to create a large database of information. Students
from the control and experimental group were both present in the same room, at the
same time. Each student was supplied with an interview sheet, pencil and clipboard
so the interview could be conducted in a circle, on a floor rug. The researcher read
each question aloud, and after every child was finished, the students shared their
responses with the group. The group interview was videotaped in order to collect all
responses; the researcher transcribed the videotape at a later time. As in the survey, the first seven questions
were based on a rating Likert scale. The last three questions were short response
Troubleshooting Activity
The troubleshooting activity provided an additional source of
data. In the troubleshooting activity, the students worked in teams of two on the
stations. There were seven stations total for the students to complete. At each
station, there was a different problem the students needed to identify, solve/fix and
check to see if they accomplished the solution. The students contacted the researcher
to verify the completion of a station and the station was then prepared for the next set
of students. Each station was equipped with a worksheet for students to record their
answers. These worksheets were collected for data. Examples of the worksheets, as
well as observations from the instructor, helped analyze the level of troubleshooting skills.
Data Analysis
The data from the standardized test scores and hands-on problem solving
time scores were analyzed along with other factoring variables. The results of the test
scores formed a two-by-two design that was analyzed with a Paired-Sample T-Test.
The SPSS program was used to compute the statistics, which
were used to draw inferences from the collected data. However, since the samples were
so small, the results were inconclusive. Comparisons between group scores and
student responses were analyzed for disparities and congruencies between their
perceptions and scores concerning problem solving. Variables measured through the
observation, survey and interview data were also analyzed in conjunction with data
from the rest of the database. The ordinal data collected from the hands-on problem
solving test scores was analyzed. Data was compared between the control group
and the experimental group. The surveys and interviews
were used to gather information on student problem solving methods and processes.
Through the addition of extensive analysis, the researcher expected to be able to make
meaningful comparisons between the groups.
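The thesis ran this analysis in SPSS; as a minimal equivalent sketch, a paired-sample t-test in Python using scipy, with invented pre-test and post-test scores standing in for the study's data.

from scipy import stats

# Hypothetical pre-test and post-test total POPS scores for one group,
# paired by student (same order in both lists).
pre_scores = [12, 15, 9, 14, 11, 13]
post_scores = [14, 16, 10, 17, 12, 15]

t_statistic, p_value = stats.ttest_rel(pre_scores, post_scores)
print(f"paired t = {t_statistic:.3f}, p = {p_value:.3f}")

# With only six students per group, as in this study, statistical power is
# very low, so even sizable mean gains may not reach significance.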
Hypothesis 1
The first hypothesis was that elementary students would acquire the ability to
troubleshoot basic computer problems. In order to evaluate this
hypothesis, three separate collections of data were analyzed. The first data collected
and analyzed for hypothesis one was the Troubleshooting Activity. The researcher
divided the students into teams of two and documented the teams as they
attempted to solve the different computer station problems. The researcher
documented whether students were able to solve the common computer problems at
each station.
Hypothesis 2
compared to students who did not participate. In order to evaluate this hypothesis, two
separate collections of data were analyzed. The first data collected and analyzed was
the POPS - Profiles of Problem Solving Methods Used section. The researcher
compared pre-test and post-test scores to establish the improvement of each student,
as well as average improvements for each group. The researcher also analyzed the
hands-on problem solving test, including the number of attempts, the completion time of the
easy problem and the number of students in each group able to solve the difficult
problem.
Hypothesis 3
The research assessed the most difficult procedure in problem solving for
elementary students by using findings from the Profiles of Problem Solving test.
Each category of the problem solving process was separately analyzed by observing
pre-test scores of the students. The second source of findings used to discover the
most difficult problem solving procedure was the survey/group interview. The
researcher grouped the responses into categories based on key phrases and evaluated
the frequency of each category.
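The thesis grouped these answers by hand; as a hypothetical sketch of the same key-phrase grouping (both the responses and the category phrases below are invented), one way to tally responses per category:

# Hypothetical short survey/interview responses.
responses = [
    "I read the problem again",
    "I guess and check",
    "I ask the teacher for help",
    "I check my answer",
]

# Hypothetical categories, each defined by key phrases.
categories = {
    "re-reading": ["read"],
    "guess and check": ["guess"],
    "seeking help": ["help", "teacher"],
    "checking work": ["check my", "look back"],
}

# Count how many responses fall into each category.
counts = {name: 0 for name in categories}
for response in responses:
    for name, phrases in categories.items():
        if any(phrase in response.lower() for phrase in phrases):
            counts[name] += 1

print(counts)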
Hypothesis 4
Information was collected from the IOWA Basic Skills math problem solving
and data interpretation test and the group interview, in order to analyze whether
mathematical ability would be affected by the training sessions. The IOWA test
results were analyzed for the average scores of each group. The group interview
responses were also analyzed.
Hypothesis 5
Gender differences were evaluated through the POPS
test and the hands-on problem solving test. The POPS pre-test and post-test total
scores were compared between genders, as well as each category of the POPS test.
Additional results were collected from the hands-on problem solving test. The
researcher compared the completion time for the easy problem from the pre-test to the
post-test. The findings for the number of attempts to solve each problem were also
studied from the hands-on problem solving test to analyze the effects of gender on the
Hypothesis 6
Students grouped by teacher-rated problem
solving abilities were compared to evaluate if the variable was a significant factor.
Average improvements in total POPS score for each group were analyzed. The
hands-on problem solving test was analyzed for improvement in the completion time of
the easy problem and the increase in the average number of attempts between the
pre-test and the post-test. The number of students able to solve the difficult problem was
also analyzed.
Hypothesis 7
Information from the Profiles of Problem Solving test, the hands-on problem
solving test and the survey/interview were all used to investigate the final hypothesis.
The mean total POPS score was averaged for each group, comparing the pre-test total
score to the post-test total score. The hands-on problem solving test results were
analyzed for the completion time of the easy problem, the average number of attempts
in both problems from the pre-test to the post-test, and the percentage of students able
to solve the difficult problem.
The last collection of data analyzed to address the hypothesis was the final group
interview.
Summary
CHAPTER IV
FINDINGS
Introduction
The subjects received computer troubleshooting training
over the course of two weeks in order to increase their knowledge of computer
troubleshooting, thereby improving their ability to solve problems. The prediction for
the research project was that over the course of training, the subjects would improve
in their ability to solve computer problems. It was further predicted that as the
subjects began to improve in their ability to solve computer problems, they would
also begin to improve in general problem solving ability. Finally, it was predicted
that, through the comparison of pre and post assessments, students would show evidence of improved mathematical problem solving achievement.
Each student's results are described in a brief profile below. The specific data
and information pertinent to each hypothesis are described later in the chapter. Each profile includes background information and a short summary of the student's results. Each student's summary includes charts of their results on each form of assessment. The first chart describes basic information about the student and their results on the IOWA Basic Skills math test. The second and third tables show the student's results on the two parts of the Hands-On Problem Solving Test.
Table 4.1b
Pre: Attempts. The number of times the student attempted the problem during the pre-test.
Post: Attempts. The number of times the student attempted the problem during the post-test.
Pre: Time. How long it took the student to complete the problem during the pre-test.
Post: Time. How long it took the student to complete the problem during the post-test.
Pre: Solution. Whether the student solved the problem, solved it incorrectly, or did not solve it during the pre-test.
Post: Solution. Whether the student solved the problem or did not solve it during the post-test.
Table 4.1c
Pre: Attempts. The number of times the student attempted the problem again during the pre-test.
Post: Attempts. The number of times the student attempted the problem again during the post-test.
Pre: Time. How long it took the student to complete the problem during the pre-test.
Post: Time. How long it took the student to complete the problem during the post-test.
Pre: Solution. Whether the student solved the problem, solved it incorrectly, or did not solve it during the pre-test.
Post: Solution. Whether the student solved the problem, solved it incorrectly, or did not solve it during the post-test.
The final table shows the results of the POPS, Profiles of Problem Solving test. The test is broken up into five separate sections: Correctness of Answer (COA), Methods Used (MU), Accuracy (A), Extracting Information (EI) and Quality of Explanation (QE). Each of these sections evaluated a different part of problem solving ability, and scores are reported for both the pre-test and the post-test.
Table 4.1d
A: Accuracy
For each section (COA, MU, A, EI and QE), the table reports the number of correct points on the pre-test and on the post-test. Total: Pre and Total: Post report the number of correct points on all of the pre-tests and all of the post-tests.
Female
Student 1 Control
Subject #1 was a female student in the control group, who was rated by her
teacher as a high ability problem solver. The student had good attendance throughout
the school year. The student was matched up with student number 6 in the
experimental group. The student was shy, quiet and reserved during all periods of
assessment. The student stayed constant in the pre and post assessments of the easy
hands-on problem solving test, solving both correctly. She also maintained the same
number of attempts and the amount of time to complete the difficult hands-on problem solving test; however, she solved the problem correctly during the post-test.
She was the only student to dramatically improve on the IOWA test. The student's
results on the POPS post-test increased in the Correctness of Answer, Accuracy and
Extracting Information sections, improving her total POPS score by four points.
Table 4.2a
Student Group Gender Teacher-Rated Problem Solving Ability IOWA Pre IOWA Post IOWA Difference
1 C F High 14 21 7
Table 4.2b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.2c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.2d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
2 4 4 3 4 4 6 2 2 14 18
Student 2 Experimental
Subject #2 was a female student in the experimental group, who was rated by
her teacher as a high ability problem solver. The student had good attendance
throughout the school year. The student was matched up with student number 5 in
the control group. The student was shy, quiet and polite during all periods of
activities. The student showed knowledge and understanding of the topic as the
training sessions proceeded. The student's results showed slight differences between
the pre and post easy hands-on problem solving test, solving both correctly. She
increased the number of attempts on the difficult problem during the post-test,
however, she was unable to solve the problem during either test. She was consistent
with all of the subjects, improving two points on the IOWA test. The student's
results on the POPS post-test increased in the Correctness of Answer, Methods Used, Extracting Information and Quality of Explanation sections, improving her total score by ten points. This student's improvement in the POPS test was the most
dramatic of all of the subjects. Below is documentation of her scores throughout the
project.
Table 4.3a
Student Group Gender Teacher-Rated Problem Solving Ability IOWA Pre IOWA Post IOWA Difference
2 E F High 21 23 2
Table 4.3b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.3c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.3d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
8 11 8 12 8 8 6 8 5 6 35 45
Anecdotal Data:
When asked: Do you think that learning how to troubleshoot a computer helped you with your problem solving skills?
Her Response:
"Yes. Because you had to figure out what the problem is and you have to think of the
Student 3 Control
Subject #3 was a female student in the control group, who was rated by her
teacher as a medium ability problem solver. The student had good attendance
throughout the school year. The student was matched up with student number 4 in
the experimental group. The student was shy, quiet and reserved during all periods of
assessment. The student quickly solved the easy hands-on problem solving post-test, improving her completion time by 30 seconds. During both parts of the post-test for the hands-on problem solving test, the student had fewer attempts on the problem. She also was able to solve the difficult hands-on problem solving test during the post-test. Her IOWA score improved two points, which was a typical result. The student's results on the POPS post-test increased in the Correctness of Answer, Methods Used and Extracting Information sections, improving her total POPS score by six points.
Table 4.4a
Student Group Gender Teacher-Rated Problem Solving Ability IOWA Pre IOWA Post IOWA Difference
3 C F Medium 22 24 2
Table 4.4b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.4c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.4d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
5 9 7 8 6 6 5 6 6 6 29 35
Student 4 Experimental
Subject #4 was a female student in the experimental group, who was rated by
her teacher as a medium-ability problem solver. The student had good attendance
throughout the school year. The student was matched up with student number 3 in
the control group. The student was outgoing, methodical and confident during all
training sessions. The student was occasionally too eager to participate in the hands-on segment of the training and constantly interrupted the instructor to ask questions.
The student stayed constant in both parts of the pre- and post- hands-on problem
solving test, solving both correctly. She also maintained similar number of attempts
and time on the both parts of the pre- and post- hands-on problem solving tests,
however, she took more time to solve the difficult part during the post-test. She did
not improve on the IOWA test. The student's results on the POPS test increased in the Methods Used, Accuracy, Extracting Information and Quality of Explanation sections, while decreasing her score on the Correctness of Answer section. She improved her total POPS score by nine points. Below is documentation of her scores throughout the project.
Table 4.5a
Student Group Gender Teacher-Rated Problem Solving Ability IOWA Pre IOWA Post IOWA Difference
4 E F Medium 20 20 0
Table 4.5b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.5c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.5d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
6 5 4 7 5 6 3 6 3 6 21 30
Anecdotal Data:
When asked: Do you think that learning how to troubleshoot a computer helped you with your problem solving skills?
Her response:
"Yes. Because you learned how to fix things easier when we did the computer. I
think yes because we actually learned like, cause you didn't help us that much. You
just kind of took apart the computer and we had to think of all the parts that were
missing and stuff. We did it in like art. The printer wasn't working; it was the same problem here. So we pressed the button and it worked."
Student 5 Control
Subject #5 was a female student in the control group, who was rated by her
teacher as a high-ability problem solver. The student had good attendance throughout
the school year. The student was matched up with student number 2 in the experimental group. The student was shy, quiet and reserved during all periods of assessment.
The student stayed constant in both sections of the pre- and post- hands-on problem
solving test. She did use a multiple attempt approach in the easy section of the post-test of the
hands-on problem solving test, decreasing her time by one minute and fifteen
seconds. Her IOWA score improved one point, which was a typical result. The
student's results on the POPS test increased in the Correctness of Answer and Accuracy sections, while decreasing in the Quality of Explanation section, improving her total POPS score by one point. Below is documentation of her scores throughout the
project.
Table 4.6a
Student Group Gender Problem Solving Rating by Teachers IOWA Pre IOWA Post IOWA Difference
5 C F High 21 22 1
Table 4.6b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.6c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.6d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
5 6 6 6 5 8 6 6 5 2 27 28
Student 6 Experimental
Subject #6 was a female student in the experimental group, who was rated by
her teacher as a high-ability problem solver. The student had good attendance
throughout the school year. The student was matched up with student number 1 in
the control group. The student was shy, quiet and reserved during all periods of
assessment. During the training sessions, the student would not volunteer at first.
The instructor would ask student #6 for an answer, and usually the student would
know the answer. The student was meek, but willing to take part in the hands-on
aspect of the training, being easily pushed aside by other students. The student was
one of the most improved students in the pre- and post- hands-on problem solving
test. On both tests, she was able to improve her time and solved the difficult section
during the post-test. She improved two points on the IOWA test, which was a typical
result of the subjects. The student's results on the POPS test increased in the Correctness of Answer, Methods Used and Accuracy sections, improving her total POPS score by six points.
Table 4.7a
Student Group Gender Problem Solving Rating by Teachers IOWA Pre IOWA Post IOWA Difference
6 E F High 20 22 2
Table 4.7b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.7c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.7d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
7 11 6 8 7 8 8 8 5 4 33 39
Anecdotal Data:
When asked: Do you think that learning how to troubleshoot a computer helped you with your problem solving skills?
Her response:
Male
Student 7 Control
Subject #7 was a male student in the control group, who was rated by his
teacher as a low-ability problem solver. The student had good attendance throughout
the school year. The student was matched up with student number 12 in the
experimental group. The student was quiet, reserved and rushed through all periods
of assessment. The student's results were similar in the pre- and post- hands-on
problem solving test. He felt pressured by the other student completing before him
and gave up on the difficult section of the test. He had more attempts during the pre-tests of both parts than in the post-tests. He notified the instructor he had finished,
when he had never completed the difficult problem. He improved on the IOWA test
by three points, which was a typical score with the subjects. The student's results on
the POPS test decreased in the Correctness of Answer, Methods Used, Accuracy and
Quality of Explanation sections, decreasing his total POPS score by six points. The
researcher believes the reason for the decrease is due to rushed efforts. Below is documentation of his scores throughout the project.
Table 4.8a
Student Group Gender Problem Solving Rating by Teachers IOWA Pre IOWA Post IOWA Difference
7 C M Low 14 17 3
Table 4.8b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.8c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.8d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
2 3 4 3 5 2 5 2 17 11
Student 8 Experimental
Subject #8 was a male student in the experimental group, who was rated by
his teacher as a high-ability problem solver. The student had good attendance
throughout the school year. The student was matched up with student number 9 in the control group. The student was interested, quiet and well-behaved during
all periods of training and assessment. The student asked intelligent questions and improved in both sections of the post- hands-on problem solving test. He was able to solve the
problem in the difficult section of the post-test, which he was not able to successfully
complete in the pre-test. His completion time on both the easy section and the
difficult section also improved in the post-test. His score on the IOWA test did not
change. The student's results on the POPS test increased in the Correctness of Answer, Methods Used, Accuracy and Extracting Information sections, improving his total POPS score by nine points.
Table 4.9a
Student Group Gender Problem Solving Rating by Teachers IOWA Pre IOWA Post IOWA Difference
8 E M High 23 23 0
Table 4.9b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.9c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.9d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
3 6 5 8 6 8 5 6 4 4 23 32
Anecdotal Data:
When asked: Do you think that learning how to troubleshoot a computer helped you with your problem solving skills?
His response:
"Yes I think troubleshooting will make a difference because finding out what's wrong
with a computer is a lot like finding out what the solution is in a question. I had to
figure out what was wrong with the computer. And it was a lot like trying to figure [out the answer to a question]."
Student 9 Control
Subject #9 was a male student in the control group, who was rated by his
teacher as a high-ability problem solver. The student had good attendance throughout
the school year. The student was matched up with student number 8 in the
experimental group. The student was energetic, rambunctious and rushed during all
periods of assessment. The student had a good attitude and sought out attention in
many different forms. The student solved the easy section faster in post- hands-on
problem solving test than in the pre-test. The student solved the difficult section
problem correctly in the pre-test, but was unable to solve the problem in the post-test.
He received the same score on the IOWA pre-test as he did on the IOWA post-test.
The student's results on the POPS test decreased in every section, decreasing the total
POPS score by 8 points total. Below is documentation of his scores throughout the
project.
Table 4.10a
Student Group Gender Problem Solving Rating by Teachers IOWA Pre IOWA Post IOWA Difference
9 C M High 20 20 0
Table 4.10b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.10c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.10d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
9 8 8 7 8 6 6 5 4 35 27
Student 10 Experimental
Subject #10 was a male student in the experimental group, who was rated by
his teacher as a high-ability problem solver. The student had good attendance
throughout the school year. The student was matched up with student number 11 in
the control group. The student was unmotivated, quiet and reserved during all
periods of assessment and training. The student stayed consistent in many aspects of
both sections of the pre- and post- hands-on problem solving test. He maintained
similar attempts, times and solved the easy section correctly in the pre and post-tests.
However, he solved the difficult section during the pre-test, but was unable to
complete the problem during the post-test. There was noise and distractions during
his hands-on problem solving post-test. He improved his IOWA score by 3 points,
which is a typical improvement. The student's results on the POPS test increased in
the Methods Used and the Quality of Explanation sections, while decreasing in the Correctness of Answer section. The student improved his total POPS score by one point.
Table 4.11a
Student Group Gender Problem Solving Rating by Teachers IOWA Pre IOWA Post IOWA Difference
10 E M High 16 19 3
Table 4.11b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.11c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.11d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
5 4 4 5 4 4 7 7 2 3 22 23
Student 11 Control
Subject #11 was a male student in the control group, who was rated by his
teacher as a high-ability problem solver. The student had good attendance throughout
the school year. The student was matched up with student number 10 in the
experimental group. The student was outgoing and well-behaved during all periods
of assessment. He was the first to finish every test. The student took less time to
complete the easy section of the hands-on problem solving test. He was unable to
solve the difficult problem in the pre- and post- hands-on problem solving test. He
improved his IOWA score by one point, which is typical of all the subjects. The
student's results on the POPS test increased in the Correctness of Answer and
Extracting Information sections, improving his total POPS score by five points.
Table 4.12a
Student Group Gender Problem Solving Rating by Teachers IOWA Pre IOWA Post IOWA Difference
11 C M High 22 23 1
Table 4.12b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.12c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.12d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
7 10 7 7 8 8 6 8 3 3 31 36
Student 12 Experimental
Subject #12 was a male student in the experimental group, who was rated by
his teacher as a low-ability problem solver. The student had good attendance
throughout the school year. The student was matched up with student number 7 in
the control group. The student was quiet and reserved during all periods of
assessment and training. The student was motivated to begin the training and was
excited by the hands-on aspect of the training. The student was classified as Learning
Disabled. The student improved his time in the easy section of the hands-on problem
solving test. He was also able to solve the difficult problem correctly during the
post-test, as opposed to the pre-test. He improved his score on the IOWA test by
three points, which is typical of the students within this study. The student's results
on the POPS test increased in the Methods Used and Accuracy sections, but his overall POPS total score decreased by one point. Below is documentation of his scores throughout the project.
Table 4.13a
Student Group Gender Problem Solving Rating by Teachers IOWA Pre IOWA Post IOWA Difference
12 E M Low 16 19 3
Table 4.13b
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
.5 Solved Correctly Solved Correctly
Table 4.13c
Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
Table 4.13d
A: Accuracy
COA: Pre COA: Post MU: Pre MU: Post A: Pre A: Post EI: Pre EI: Post QE: Pre QE: Post Total: Pre Total: Post
5 2 3 5 3 4 6 5 18 17
Hypothesis 1
The first hypothesis was evaluated through the final troubleshooting activity. At each station, there was a separate computer problem the students were required to solve.
The students were separated into teams of two and were given a sheet for each
station. The student would then assess the problem, fix the problem and describe the
solution on the worksheet. The students solved all problems they encountered (See
Table 4.14).
Table 4.14
Station #1
At station #1, titled "Printer on the Mac", the printer was the main source of
the problem. The specific problems with the printer were: the printer was lacking
paper, the power was turned off, the power cable was not connected to the printer, the
cable from computer to the printer was not connected and the power cord was not
plugged into the power strip. Teams wrote the following responses:
Team #1: "Plug in printer and put in paper."
Team #2: "First it wasn't plugged in and the USB wasn't plugged in. There was no [paper]."
Team #3: "What's wrong with your printer is the plug wasn't in and there was no [paper]."
All three teams solved all the problems successfully without guidance from
the instructor. All teams printed a document after correcting the problems, and presented the printed page to the instructor.
Station #2
At station #2, titled "MAC Number Two", the students were required to install
a program. However, the mouse and keyboard were both unplugged. The students
needed to first correct this problem, and then move on to install the program. Each
group received a different program to install because the removal of each program
would have taken too much time. Teams wrote the following responses:
Team #1: "Plug in any plugs, put in CD and pushed yes, You put the CD in and
Team #2: "One problem we had was the mouse was not plugged in. We first went to
installer then we pushed continue and it installed them. We restarted the computer."
Station #3
At station #3, titled "Open Box with a Black Screen", the students were
required to fix minor problems with the monitor. The monitor power cable was
unplugged, the power button on the monitor was turned off, and the monitor cable
was not plugged into the system tower. Teams wrote the following responses:
Team #1: "Monitor won't tum on because it had no power and it was not plugged in
Team #2: "The monitor is not working. It is not working because the [monitor] is
Team #3: "[The problem is the] monitor won't tum on because the power cable isn't
All three teams solved all the problems correctly without guidance or
clarification from the instructor. All teams successfully turned on the monitor after
correcting the problems, and presented the lighted screen to the instructor.
Station #4
At station #4, titled "What's Wrong with this Box?", the students were
required to look at a powerless system tower, with the cover taken off. They were
obligated to look over the entire system to find which components were missing or
unattached. The mouse trackball was removed, the mouse was unplugged from the
system tower, one of the RAM memory pieces was removed, the monitor was
unplugged from the system tower, the power button was removed and the sound cord was unplugged from the sound card. Teams wrote the following responses:
Team #1: "The sound cable is not plugged in. The ram is missing. The trackball for
the mouse isn't in. Plug in [the] mouse, keyboard and monitor. The power cable is [not plugged in]."
Team #2: "The ram is missing (1). The wire is not plugged in. The p5 ( the internal
power cord) wire is not plugged in. The [monitor] is not plugged into the power
tower. The mouse is not plugged in. The trackball is missing. The power button is [missing]."
Team #3: "Sound cable. More ram. Trackball. Plug in mouse and keyboard and [monitor]."
Students were also asked how the system would be affected if the computer were turned on. Team #2 did not respond to this question.
All three teams solved all the problems without guidance or clarification from
the instructor. All teams successfully reassembled the computer by asking the
instructor for each part missing. The students physically installed the missing RAM and plugged the sound cord into the sound card, as well as attached all missing peripheral devices.
Station #5
At station #5, titled "EXAMPLE BOX", students were just provided with an
example computer set-up, in case they wished to use it for an example. Team #1 was
the only team to visit this station to look over the example.
Station #6
At station #6, titled "Laptop Trauma", students were required to perform two
tasks. First, students needed to find a missing file titled "Lost Dog", by using the
search function within the Windows Operating System. The file was taken off the
Recent Documents menu to ensure the students were using the search function.
Team #2: "Yes we found it. We went to search and pushed files and folders then we
picture on the desktop. Students needed to use the properties menu by right clicking
on the desktop and selecting properties. Within the Display Properties, students
Team # 1: "You right click anywhere then you click properties. Then you go to
Team #2: "Yes. We right clicked then we went to properties. Then we clicked on
All three teams found the missing file and changed the desktop background without guidance from the instructor. All teams used the search function within the Windows Operating System to find the missing file. All teams changed the desktop background through the Display Properties menu.
Station #7
At station #7, titled "Trouble with Laptop Printing", students were asked if
they could print from a laptop by simply plugging the printer into the laptop. There was no printer software installed on the laptop, so the laptop would not have been
able to print from the printer. Teams wrote the following responses:
Team #2: "No because after you hooked it up to the laptop you need to install it."
Team #3: Students visited this station and successfully solved the problem, but did
not respond.
All three teams solved all the problems correctly without guidance or
clarification from the instructor. The students all solved this problem together.
Out of all the problems students attempted during the final troubleshooting
activity, students solved every problem successfully without aid from the instructor.
The students followed the steps of the problem solving process. The teams identified the common computer problems, and wrote the problems out on their team worksheet. They then proceeded to devise a plan and carry it out, reattaching devices or installing components. They were able to look back and verify their answers by accomplishing the task and receiving feedback from the fixed machine. The computer was able to provide automatic feedback as to whether the problem was solved. Students solved all common computer problems presented by the researcher.
Group Interview
During the group interview, one student reported having solved a computer problem for a teacher within one day of the troubleshooting activity. The students were asked if they
believed they could solve common computer problems. One student in the
experimental group commented "we [solved a common computer problem] in like art.
The printer wasn't working; it was the same problem here. So we pressed the
[power] button and [the printer] worked." The students in the experimental group
told the control group about the different computer troubleshooting stations they had completed:
RESEARCHER Why don't you guys try to tell them a little bit about what we
did?
#4 (experimental) The first couple days we were just studying like what parts of
the computers were the computers. We took apart the
computers and um ... and put them back together. Then like the
last day, she took apart a computer and we had to put it back
together with all the parts.
#6 (experimental) We had to go to like stations and we had to figure out what it
was and fix it.
RESEARCHER You had to fix it. You had to figure out what it was first and
then you had to fix it. So you had to identify the problem.
#2 (experimental) We had to install and uninstall a program.
RESEARCHER Did the people who went through the computer troubleshooting,
did you guys have fun doing that?
All Yeah.
RESEARCHER Do you think that if your teacher had a problem with the
computer that you could fix it?
All Yeah.
RESEARCHER So now you can help your teacher out in lab?
#4 (experimental) We did it in like art. The printer wasn't working; it was the
same problem here. So we pressed the button and it worked.
RESEARCHER Alright there you go. That's fantastic guys! Thanks a lot. I
really appreciate it.
Students were enthusiastic during group discussions, and were anxious to solve the problems, getting their hands on the hardware and the computers. Students approached all of the problems with confidence.
Hypothesis 2
Students who participated in the troubleshooting training will show greater improvement in general problem solving ability than students in the control group.
Within the POPS test, there were five categories to assess the different
elements of problem solving. According to the POPS teacher's manual, the method
used category contained activities related to problem solving strategies such as generalizing. The control group table (Table 4.16) shows the difference between pre-
and post scores (See Figure 4.1). The experimental group table (Table 4.17) shows
the improvement between pre and post scores (See Figure 4.2).
Table 4.16
Student Methods Used Pre-Test Score Methods Used Post-Test Score Methods Used Score Difference
#1 4 4 0
#3 7 8 1
#5 6 6 0
#7 4 3 -1
#9 8 7 -1
#11 7 7 0
AVERAGE 6.00 5.83 -0.17
Figure 4.1
Methods Used Pre-Test and Post-Test Scores for the Control Group
Table 4.17
Student Methods Used Pre-Test Score Methods Used Post-Test Score Methods Used Score Difference
#2 8 12 4
#4 4 7 3
#6 6 8 2
#8 5 8 3
#10 4 5 1
#12 3 5 2
AVERAGE 5 7.5 2.5
Figure 4.2
Methods Used Pre-Test and Post-Test Scores for the Experimental Group
Only one student, student #3, in the control group improved in the methods
used category. All other control group students either remained constant or decreased
their score. The experimental students all improved their score in the methods used category. On average, the experimental students improved 2.5 points in the methods used category between the pre- and post-tests. Students in the control group, on average, received 0.17 fewer points on the post-test than on the pre-test.
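These group averages follow directly from the per-student scores in Tables 4.16 and 4.17. As an illustration only (not part of the original analysis), the short sketch below reproduces the computation; the variable and function names are invented for this sketch.

# Illustrative sketch: reproduce the Methods Used group averages
# from Tables 4.16 and 4.17. Score lists are copied from the tables.
control_mu = {"pre": [4, 7, 6, 4, 8, 7], "post": [4, 8, 6, 3, 7, 7]}
experimental_mu = {"pre": [8, 4, 6, 5, 4, 3], "post": [12, 7, 8, 8, 5, 5]}

def mean_change(scores):
    # Average per-student change from pre-test to post-test.
    diffs = [post - pre for pre, post in zip(scores["pre"], scores["post"])]
    return sum(diffs) / len(diffs)

print(mean_change(control_mu))       # about -0.17: slight decline
print(mean_change(experimental_mu))  # 2.5: clear improvement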
To evaluate the methods students used, the hands-on problem solving activity
was videotaped and assessed at a later time. The number of attempts for each
problem during the pre-tests and the post-tests were analyzed, as well as the time taken to solve each problem, which was also recorded from the videotape.
Figure 4.3
Average Number of Attempts on the Pre-Test and Post-Test Problems for the Control Group and the Experimental Group
The control group on average (See Figure 4.3) showed a 49% decrease in
number of attempts from the easy problem pre-test to the easy problem post-test. The
control group on average also decreased in number of attempts by 55% from the
difficult problem pre-test to the difficult problem post-test. The experimental group
on average increased the number of attempts by 13% from the easy problem pre-test
to the easy problem post-test. The experimental group on average also increased in
number of attempts on the difficult problem from the pre-test to the post-test by 23%.
All students solved the easy problem in less than the specified five-minute
time period; therefore time comparisons can be made between groups (See Table
4.18).
Figure 4.4
Time Results from the Easy Problem in the Hands-on Pre- and Post-Test
Table 4.18
Time Results from the Easy Problem in the Hands-on Pre- and Post-Test
The experimental group's average improvement in time was 0.042 minutes, amounting to 2.8 seconds. Due to the small numbers, the difference is not significant.
The difficult problem was much more complicated and many students were
unable to solve the problem. The following table shows students who completed the
difficult problem during either the pre-test, post-test or both as indicated below (See
Table 4.19).
Table 4.19
Control Students' Pre-Test and Post-Test Ability to Solve the Difficult Problem in the
Hands-On Problem Solving Test
Table 4.20
Experimental Students' Pre-Test and Post-Test Ability to Solve the Difficult Problem
in the Hands-On Problem Solving Test
Figure 4.5
Comparing the Number of Students in Each Group Who Solved Correctly, Solved Incorrectly, or Did Not Solve the Difficult Problem in the Hands-On Problem Solving Test
Figure 4.6
Comparing the Number of Students in Each Group Able to Correctly Solve the
Difficult Problem in the Hands-On Problem Solving Test
Four students in the experimental group solved the difficult problem correctly
during the post-test, whereas only two students in the control group solved the
difficult problem during the post-test. The results are not significant because two students who solved the problem correctly during the pre-test were unable to solve the problem during the post-test. Each group contained one student who solved the problem correctly during the pre-test, but not during the post-test. The control group therefore showed a net gain of only one student able to solve the difficult problem.
Hypothesis 3
The most difficult part of problem solving was analyzed through a review of
literature and articles. Data was also collected through the Profiles of Problem
Solving test and the surveys/interviews. The standardized problem solving test,
POPS, divided the evaluation into five separate categories: Correctness of Answer, Methods Used, Accuracy, Extracting Information and Quality of Explanation. Each category was present in multiple questions and graded separately for each student.
Every category was divided into three possible levels of achievement: beginning,
developing and advanced. Each student was graded on the pre-test and post-test.
Table 4.21
Each Student's Pre-Test Score on the POPS Test Graded on Beginning, Developing or Advanced Levels of Achievement
The most difficult categories for the students were those in which more students were in the beginning or developing stages.
Survey/Group Interview
During the survey/group interview, students were asked, "What is the hardest
part about solving a problem?" Written responses and verbally expressed opinions
were both recorded and organized (Table 4.22), showing the different difficulties students reported: identifying important information, understanding what the question is looking for, what to do with the
information, looking back, not enough information and the type of strategy or plan to
use.
Table 4.22
Category titles were chosen to simplify the presentation of the data. Behind each title were key responses or key
words used to define the category. Behind identifying important information, key
phrases such as, "I don't know what to do with the given information", "what
info[rmation] is needed to solve the problem", and "finding all the information"
defined the category. Behind understanding the question, key phrases such as, "don't
understand it", "don't know what the question is asking you to do", and "what the
problem is looking for" defined the category. Behind looking back, key phrases such
as, "knowing if your done" and "finding out the answer" defined the category.
Behind lack of information, key phrases such as "not enough information" defined
the category. Behind method or strategy, key phrases such as "making a plan" and "how [to solve the problem]" defined the category.
Hypothesis 4
The training sessions will have an effect on students' mathematical problem solving ability.
IOWA Test
Students in the experimental group and the control group both were pre-tested
and post-tested using the IOWA Basic Skills math problem solving and data
interpretation 26-item section of the IOWA test. The students were given the same section as both the pre-test and the post-test.
Figure 4.7
IOWA Scores Compared Between Mean Group Scores on the Pre-Test versus the
Post-Test
Average Control: Pre-Test 19.83, Post-Test 22.17. Average Experimental: Pre-Test 20.3, Post-Test 22.
The mean scores of both groups produced similar results. Students in the control group improved 2.34 points and the experimental group improved 1.7 points from the pre-test to the post-test. The results were nearly identical and therefore inconclusive.
Group Interview
Students made references to the requirement to "figure out what the problem is," which parallels the first step of the problem solving process, requiring students to figure out what the problem is and brainstorm what the solution could be (Paris, 2000).
(Paris, 2000). Another student stated that, "finding out [what's] wrong with a
computer is a lot like finding out what the solution is in a question." Another student
wrote, "I think [computer troubleshooting] helped [my problem solving skills] by
learning strategies like in math," directly showing the similarity between computer
Hypothesis 5
There will be no effect on problem solving ability when gender is taken into
consideration.
The existence of the gender gap in computer usage has been shown through previous research (Burge, 2001). Students were separated by gender in multiple ways, and data was
collected, separated and analyzed by gender. The POPS pre-tests and post-tests total
scores were compared, as well as all category pre-tests and post-tests. The hands-on problem solving test results were also separated and analyzed by gender.
Data analyzed from the POPS total score pre-test and post-test showed
females improving from the pre-test with an average score of 26.5 points to the post-test with an average score of 32.5 points. The males did not improve and retained a
constant average score of 24.3 points through the pre- and post-test.
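These gender means can be reproduced from the per-student total POPS scores reported in the profiles above. The sketch below is illustrative only; the score lists are transcribed from the student tables (students 1 through 6 are female, 7 through 12 male), and the helper name is invented for this sketch.

# Illustrative sketch: gender means of total POPS scores,
# transcribed from the student profile tables.
female_totals = {"pre": [14, 35, 29, 21, 27, 33], "post": [18, 45, 35, 30, 28, 39]}
male_totals = {"pre": [17, 23, 35, 22, 31, 18], "post": [11, 32, 27, 23, 36, 17]}

def mean(xs):
    return sum(xs) / len(xs)

print(mean(female_totals["pre"]), mean(female_totals["post"]))  # 26.5 -> 32.5
print(mean(male_totals["pre"]), mean(male_totals["post"]))      # 24.33 -> 24.33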
Table 4.23
Mean of POPS Pre-Test vs. Post-Test Total Score Comparing Females vs. Males
Data analyzed from the POPS Methods Used section and the Extracting
Information section showed an average female improvement from the pre-test to the
post-test. On average, males also improved; however, the results were not as
significant. The females increased their average score by 22% on the methods used
section, and the males increased their average methods used score by 11%. The females also increased their average extracting information score, while the males increased their average extracting information score by only 3%.
Figure 4.8
Mean of POPS Pre-Test vs. Post-Test Total Score Comparing Females vs. Males
Table 4.24
Comparison of the Methods Used and Extracting Information Categories of the POPS
Test Between Mean Gender Score
Gender differences were also exhibited within groups. The female control
group improved 14% in their total POPS score, while the male control group
decreased their total POPS score by 11%. The female experimental group improved
their total POPS score by 22%, while the male experimental group improved their total POPS score by 10%.
Figure 4.9
Comparison of the Methods Used and Extracting Information Categories of the POPS Test Between Mean Gender Score
Table 4.25
Gender Comparisons Divided by Group of Total POPS Score on the Pre- and Post
Tests
Figure 4.10
Percentage of Improvement in Total POPS Score Between the Pre-Test and Post-Test: Female Control 0.14, Female Experimental 0.22, Average Female 0.18, Male Control -0.11, Male Experimental 0.12, Average Male 0.005
On average, the females increased their total POPS scores by 18% between
the pre-test and the post-test, whereas the males were only able to increase their total
POPS score by 0.5% on average. When gender was analyzed by groups, females in
the experimental group improved on a larger scale than the females in the control
group. Likewise, in the male gender, when separated by groups, males in the
experimental group improved, whereas males in the control group decreased in their total POPS score.
within the POPS test. Females outscored the males in number of points improved in each category (See Figures 4.11, 4.12 and 4.13).
Figure 4.11
Difference Between POPS Pre-Assessment and Post-Assessment Score on the Correctness of Answer Category Separated by Gender
Figure 4.12
Difference Between POPS Pre-Assessment and Post-Assessment Score on the
Accuracy Category Separated by Gender
Figure 4.13
Difference Between POPS Pre-Assessment and Post-Assessment Score on the Quality of Explanation Category Separated by Gender
Throughout the POPS test categories, females improved more than the males
(See Figure 4.11, 4.12, 4.13). Within the experimental group, the female
experimental subjects improved more than the male experimental subjects. The
female experimental subjects improved in their total POPS score from the pre-test to
the post-test by 22%, while the male experimental subjects only improved by 10%.
Both genders in the experimental group outscored their peers in the control group,
showing the training made an impact regardless of gender. Females in the control group and the experimental group both improved their total POPS score on average, although the experimental group improved 8% more than the control group.
Males in the control group decreased their total POPS score, while the males in the
experimental group achieved improvements in their total POPS score. The males in
the experimental group improved 20% more than their male counterparts in the
control group.
The hands-on problem solving test was designed to examine the difference in improvement between the genders as well.
Figure 4.14
Experimental Males versus Control Males Time to Complete the Easy Problem in the
Hands-On Pre-Test and Post-Test
Males in the experimental group improved their time at a greater interval than
their counterparts in the control group. Although both groups improved in time, the control males improved an average of approximately six seconds between the pre-test and post-test, whereas the experimental males improved an average of thirty seconds.
Figure 4.15
Experimental Females versus Control Females Time to Complete the Easy Problem
in the Hands-On Pre-Test and Post-Test
However, the females in the experimental group did not improve as much as
the females in the control group. The control group improved almost thirty seconds,
whereas the experimental females only improved twelve seconds. The comparison of time is therefore inconclusive; with such small numbers, the fluctuation of the data suggests the females in the experimental group received no additional growth in completion time from the training.
Time was not the only measure of improvement in the hands-on problem
solving test. The number of attempts was an additional measure of assessment used.
Figure 4.16
Difference in Number of Attempts Between the Easy Problem in the Hands-On Pre-Test and Post-Tests Separated by Groups and Gender
Figure 4.17
Difference in Number of Attempts between Difficult Pre-Tests and Post-Tests Separated by Groups and
Genders
All groups decreased their number of attempts from the pre-test to the post-test, except the male experimental group. In both the easy problem and the difficult problem, the male students in the experimental group were the only students to increase their attempts. All other groups decreased their attempts between the pre-test and the post-test.
Overall, the data resulting from the hands-on problem solving test shows the
experimental males improving more than their male peers in the control group and the
females in both groups. The male experimental group also used more attempts to
solve the difficult problem than the males in the control group. Overall, the most
improvement in time completion on the easy problem was shown by the female
control group, improving by more than 30 seconds on average. The female groups
also saw a large decrease in number of attempts through both problems. The hands-on problem solving test did not provide substantial evidence to prove the theory either way.
Using all forms of assessment, the POPS test produced the most consistent and obvious results. The male results were much less stable than their female counterparts. However, using the POPS tests, gender was analyzed and resulted in two main findings. Females scored higher on the pre-tests and post-tests than the males; however, relative to their same-gender peers in the control group, males in the experimental group improved at a greater interval.
Hypothesis 6
Students rated as high problem solving ability by their teachers will improve
their problem solving ability within the experimental group as opposed to the control
group.
Students in the experimental group and control group were matched by
how their teachers rated their problem solving ability. When students were divided
into teacher-rated problem solving groups, the results presented a different angle on
improvements. Overall, students in the low problem solving group showed little
increase in scores, and in numerous cases, decreased their score in the post
assessment.
The hypothesis was analyzed using data from the POPS test and the hands-on problem solving test to compare the teacher-rated ability groups.
The POPS test was the first form of assessment used to see whether there was
any difference between the teacher-rated problem solving ability groups. The total
POPS pre-test score and post-test score were analyzed, as well as each individual category.
The results were divided by the teacher-rated problem solving ability level
and provided interesting results. The researcher found the low problem solving
ability group often decreased their scores. The two students rated as low problem solvers both rushed through all post-assessments, which could be a possible explanation for the decrease. The graph below (Figure 4.18) shows the average increase of the high and medium experimental group, which improves at a greater interval than the average of the high and medium control group. The experimental group also outperformed the control group overall.
Figure 4.18
POPS Total Score Pre-Test and Post-Test Divided by Teacher Rated Problem Solving Ability and Groups
Figure 4.19
POPS Total Score Pre-Test and Post-Test High Teacher Rated Problem Solving
Ability Separated by Groups
By analyzing group by group, the experimental group showed stronger results than the control group. The high problem solving ability control group increased at a very minimal level. The experimental high problem solving ability group increased at a much greater interval (See Figure 4.19).
Figure 4.20
POPS Total Score Pre-Test and Post-Test Medium Teacher Rated Problem Solving
Ability Separated by Groups
The difference between the control and experimental medium problem solving
ability subjects was not as extreme as the high problem solving ability groups.
However, the experimental medium problem solving subject improved by nine points,
or 30%, while the control subject improved by six points, or only 18%.
Figure 4.21
POPS Total Score Pre-Test and Post-Test Low Teacher Rated Problem Solving
Ability Separated by Groups
The low problem solving ability group results also presented better results in
the experimental subject as opposed to the control subject. The low problem solving ability experimental student's score decreased from a total pre-test score of 18 to a total post-test score of 17, while the control student's score decreased from a 17 to a total post-test
score of 11. Therefore, the experimental student did not produce as large of a drop as
the control student in the low teacher rated problem solving ability group.
Students were separated into different problem solving ability groups for
comparison purposes, to create equality among the sample groups. However, the analyzed data showed that the high and medium experimental groups improved over the high and medium control groups. The low experimental subject also
produced more improvements on scores than the low control subject; however, since
the data was collected from only 2 students, the results are unreliable.
When separated by problem solving ability level, the data results in the hands-on problem solving test remained similar to the overall results of the study. Students in all groups improved their time to complete the easy problem at nearly the exact same rate.
Figure 4.22
Time Completion Compared by Problem Solving Ability for the Easy Problem in the
Hands-On Problem Solving Pre-Test versus Post-Test
Table 4.26
Time Completion Compared by Problem Solving Ability for the Easy Problem in the
Hands-On Problem Solving Pre-Test versus Post-Test
The time improvement was constant between all groups. However, the number of attempts
students used to solve the problems varied greatly. Students in the high experimental
group were the only subjects to increase their average number of attempts from the
pre-test to the post-test, while still matching the time improvement rate of the other
groups.
All other groups, besides the high experimental group, decreased their number of attempts. The high experimental group maintained the same time improvement while using more attempts in their post-test than in their pre-test. The high control group, medium group and low group all decreased in their number of attempts by an average of .5 attempts or more.
Figure 4.23
Number of Attempts on the Easy Problem in the Hands-On Test Comparing Pre-Test versus Post-Test Scores Across Teacher Rated Problem Solving Ability Groups
Table 4.27
Results varied more on the difficult problem in the hands-on problem solving test. The percentage of students within their teacher-rated problem solving ability groups who solved the difficult problem in the hands-on pre-test versus the post-test is shown above (See Table 4.27).
improvement from any groups because the groups contained such small numbers.
However, students in the medium and low problem solving groups had more success
with solving the hands-on problem solving test. While only 38% of all the high
problem solving ability students solved the difficult problem in less than five minutes
during the post-test, 75% of the students in the low and medium problem solving
ability groups solved the difficult problem. The low and medium problem solving
ability groups could have benefited from the hands-on manipulation of solving the
problem.
When separated by problem solving ability level, the data results remained
similar to the overall results of the study. Overall, the students in the experimental
groups outperformed the students in the control groups at all levels of problem
solving ability. The experimental students rated high and medium showed more improvement than their control group counterparts.
Hypothesis 7
The results from the numerous assessments show the experimental group
improving more than the control group. Information was taken from the POPS test,
the hands-on problem solving test and the group interview in order to evaluate
whether students in the experimental group achieved higher results in the post-assessments. The POPS test scores were analyzed for the overall improvement of each group. The categories which were the most focused on in problem solving were the methods used section and extracting information. Although all categories are useful in problem solving, the total score was also analyzed.
Figure 4.24
Difference in Total POPS Score Between the Pre-Test and Post-Test of Average
Experimental Group versus Average Control Group
As seen in the information provided above (See Figure 4.24), the total POPS
score, which analyzed the overall improvement of the students' ability to solve
problems, improved more in the experimental group, on average, as opposed to
the control group's average score. Both groups began relatively at the same level.
The control group started out with an average pre-test score of 25.5, while the
experimental group started out with an average pre-test score of 25.333. However,
the improvements did not remain constant between the groups. The average experimental group post-test score reached 31.0 points, improving an average of 5.67 points per student. The average control group post-test score reached 25.83 points, improving an average of .33 points per student. Therefore, the experimental group was able to improve an average of 5.3 points per student more than the control group.
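These group figures again follow from the per-student total scores in the profiles. The sketch below is illustrative only; the score lists are transcribed from the student tables and the helper name is invented for this sketch.

# Illustrative sketch: average total POPS gain per student for each group,
# using the total scores from the student profile tables.
control = {"pre": [14, 29, 27, 17, 35, 31], "post": [18, 35, 28, 11, 27, 36]}
experimental = {"pre": [35, 21, 33, 23, 22, 18], "post": [45, 30, 39, 32, 23, 17]}

def avg_gain(group):
    # Average improvement per student from pre-test to post-test.
    n = len(group["pre"])
    return (sum(group["post"]) - sum(group["pre"])) / n

print(avg_gain(control))       # about 0.33 points per student
print(avg_gain(experimental))  # about 5.67 points per student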
In each category of the POPS test, the experimental students improved at a greater rate than the control students. The most
important categories to the project were the methods used and extracting information.
The accuracy, correctness of answer and quality of explanation were less important to
the researcher and were not the focus of the project. The first category analyzed was
the methods used category. The methods used category was analyzed earlier in
hypothesis two, showing the large difference between the experimental group's
average improvement and the control group's average improvement between pre-test
and post-test (See Figure 4.1 and 4.2). The students in the experimental group
improved an average of 2.5 points by increasing their average pre-test score of 5.0
points to an average post-test score of 7.5 points. The students in the control group
actually dropped their average score 0.17 points, descending from an average pre-test
score of 6.0 points to an average post-test score of 5.83 points. Therefore, the
experimental group showed an average improvement of approximately 33%, while the control group's average declined slightly.
The extracting information category was another focal point of the project,
because finding important information was indicated as one of the most difficult parts of problem solving for students. The extracting information results were not as evident as the methods used results; however, slight differences were present (See Figure 4.25).
Figure 4.25
Comparing the Difference in POPS Extracting Information Section Pre-Test and Post-Test Between Groups
Students in the experimental group achieved an average pre-test score of 5.83 and an average post-test score of 6.67, improving an average of .84 points. Students in the control group achieved an average pre-test score of 4.67 and an average post-test score of 5.33, improving an average of .66 points. Therefore, there was little difference between groups, but due to the small scores, the averages within the category are inconclusive.
The other categories were also compared between groups and the results are shown below (See Table 4.28).
Table 4.28
All three remaining categories were not as important to the project as the
methods used and extracting information. The correctness of answer and accuracy
categories were more mathematically centered than the other categories, and the
quality of explanation was based on the student's ability to explain their answer,
and without specific training, it is difficult for students to improve in this category. In the quality of explanation category, students in the experimental group nevertheless improved their scores on the post-test, while students in the control group actually showed a decrease in scores on the post-test. While both groups improved their scores on the correctness of answers category, students in the control group improved their scores at a greater rate in that category. Overall, the students in the experimental group showed greater improvement in the total POPS problem solving test score, methods used score, extracting information score, accuracy score and quality of explanation score than the students in the control group. The results obtained from the POPS test show the experimental
group was able to produce more improvements in general problem solving than the
control group.
The hands-on problem solving test did not offer many results, showing similar
improvements for both groups. The easy problem was solved by all students in both groups. As discussed in hypothesis two, students achieved similar results. Overall, the hands-on problem
solving test provided very similar results in all aspects (See Table 4.31).
Table 4.29
Results from Easy Problem in Hands-on Problem Solving Test Comparing between Groups

Figure 4.26
Comparing Time Improvements to Complete the Easy Hands-on Problem from Pre-Test to Post-Test of the Control Group versus the Experimental Group
Although a small sample group was used, differences arose between the
experimental group and control group pertaining to solving the difficult problem in
the hands-on problem solving test. During the pre-test, only 33.3% of the
experimental group solved the difficult problem, while 66.7% solved the difficult
problem in the post-test. The control group produced less dramatic results of 16.7%
of the group solving the difficult problem in the pre-test and 33.3% of the students
solving the problem in the post-test. Therefore, more students in the experimental group solved the problem, showing a greater improvement than the
control group.
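With six students per group, these percentages correspond to whole-student counts (2 of 6 and 4 of 6 in the experimental group; 1 of 6 and 2 of 6 in the control group). A minimal sketch of the calculation, using the counts reported above:

    # Solve rates for the difficult hands-on problem, out of 6 students per group.
    solvers = {
        "experimental": {"pre": 2, "post": 4},
        "control":      {"pre": 1, "post": 2},
    }
    GROUP_SIZE = 6

    for group, counts in solvers.items():
        pre_pct = 100 * counts["pre"] / GROUP_SIZE
        post_pct = 100 * counts["post"] / GROUP_SIZE
        print(f"{group}: {pre_pct:.1f}% solved in pre-test, {post_pct:.1f}% in post-test")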
Figure 4.27
Comparing the Percentage of Students in the Control Group versus the Experimental Group Able to Solve the Difficult Problem in the Pre-Test and Post-Test
The number of attempts was the last section of the hands-on problem solving
test to analyze and again the experimental group increased the number of attempts in
the easy problem and the difficult problem from the pre-test to the post-test (See
Table 4.32).
The experimental group increased from an average of 2.17 attempts in the pre-test to an average of 2.5 attempts in the post-test on the easy problem in the hands-on problem solving test. The control group decreased from an average of 2.67 attempts in the pre-test to an average of 1.33 attempts in the post-test on the easy problem in the hands-on problem solving test.
The experimental group increased from an average of 2.17 attempts in the pre-test to an average of 2.83 attempts in the post-test on the difficult problem in the hands-on problem solving test, while the control group decreased from an average of 4.83 attempts in the pre-test to an average of 2.67 attempts in the post-test on the difficult problem.
Overall, the experimental group increased their attempts during the post-test while decreasing the amount of time taken to complete the test, compared to the control group. The control group used fewer attempts while decreasing the amount of time taken to complete the problem, compared to the experimental group. The most dramatic results were shown in the percentage of students who completed the difficult problem during the post-test: more students in the experimental group solved the difficult problem during the post-test than students in the control group.
The group interview was conducted after the training session, on the last day
of post-testing. All students from the experimental group and control group
participated in the group interview. The researcher first had the students write
responses to the questions on a sheet of paper, and then verbally discussed each
question as a group. One of the questions presented to the students during the group
interview was "Do you think learning how to troubleshoot a computer helped you
with your problem solving skills? Why or why not?" Students were first asked to
respond to the question with a written reaction, and then they were asked to verbalize
any additional answers as a group. All the students in the experimental group chose to respond to the question in writing.
Student #2 in the experimental group wrote: "Yes. Because you had to figure out what the problem is and you have to think of the solution of the problem."
Student #4 in the experimental group wrote: "Yes Because you learned how to fix [...]."
Student #6 in the experimental group wrote: "I think it helped by learning strategies like in math."
Student #8 in the experimental group wrote: "Yes I think troubleshooting will make a difference because finding out [what's] wrong with a computer is a lot like finding [...]."
Student #12 in the experimental group wrote: "No not really because I was just learning [to] put [together] and take apart [a computer]."
Five out of six students in the experimental group felt the computer troubleshooting training helped their problem solving skills. The students made references to "figure out" problems, "fix things" and "finding out" information, which are key elements and steps in the problem solving process. They also mentioned "finding out what the solution is", and one student even compared the troubleshooting strategies to strategies used in math. The one student who felt the training did not help his problem solving skills was teacher-rated as a low problem solving ability student, who may still be functioning at a concrete level of thinking. The group then continued by discussing the question. The excerpt from the transcribed conversation (Table 4.31)
showed how students verbally responded to one of the questions in the written
interview "Do you think learning how to troubleshoot a computer helped you with
your problem solving skills? Why or why not?" The researcher began the question
discussion by prompting the students with the question and asking for their opinions.
Students believed that the computer troubleshooting sessions had an effect
on their problem solving skills. They drew references to the similarities between computer troubleshooting and general problem solving, pointing to having to "figure out" problems and "finding out what the solution is", comparing the solving of computer problems to the solving of general problems. Overall, the students' responses supported a connection between the two processes.
Table 4.31
Group Interview Discussion Excerpt
#7 (control) No.
#4 (experimental) I think yes because we actually learned like, cause you didn't
help us that much. You just kind of took apart the computer
and we had to think of all the parts that were missing and stuff.
#8 (experimental) I had to figure out what was wrong with the computer. And it was a lot like trying to figure out the problem.
#2 (experimental) Yes because we had to figure out what the problem was.
#12 (experimental) No not really because it was just taking apart the computer.
#6 (experimental) I think that it would help with like strategies and stuff because
like we had to use different strategies.
RESEARCHER OK so different strategies that you had to use. Why don't you guys try to tell them a little bit about what we did?
#4 (experimental) The first couple days we were just studying like what parts of the computers were the computers. We took apart the computers and um... and put them back together. Then like the last day, she took apart a computer and we had to put it back together with all the parts.
#6 (experimental) We had to go to like stations and we had to figure out what it was and fix it.
RESEARCHER You had to fix it. You had to figure out what it was first and then you had to fix it. So you had to identify the problem.
#2 (experimental) We had to install and uninstall a program.
RESEARCHER Did the people who went through the computer troubleshooting, did you guys have fun doing that?
All Yeah.
RESEARCHER Do you think that if your teacher had a problem with the computer that you could fix it?
All Yeah.
RESEARCHER So now you can help your teacher out in lab?
#4 (experimental) We did it in like art. The printer wasn't working; it was the same problem here. So we pressed the button and it worked.
RESEARCHER Alright there you go. That's fantastic guys! Thanks a lot. I really appreciate it.
CHAPTER V

Introduction

Computer troubleshooting training has implications for the elementary curriculum, and such training can also prepare students to assist in computer labs. The similar processes in computer troubleshooting and general problem solving include identifying the problem, devising a solution and fixing the problem successfully. The researcher believes the most difficult part of the problem solving process for elementary students is identifying the problem. Based on the evidence found in this study, elementary students are capable of learning to troubleshoot common computer problems, and troubleshooting a computer has the potential to improve problem solving ability in elementary students.
Summary of the Study
Students' lack of problem solving skills has generated concerns at national and state levels of education (Coleman et al., 2001; Jonassen, 2000). Problem solving skills are essential for a
student's future. Providing students with the skills to solve problems as opposed to
merely supplying them with content knowledge enables the students to transfer the
content knowledge to various situations requiring problem solving (Casey & Tucker,
1994). The research project explored the problem solving requirements necessary in
computer repair and troubleshooting, and their effect on the academic achievement
and academic problem solving of elementary students. Computer repair methods and troubleshooting techniques require the use of problem solving strategies. The study proposed to increase problem solving abilities and academic achievement by incorporating computer troubleshooting into the educational technology curriculum. With the findings from this research study, schools could
incorporate the computer repair and troubleshooting training program into the
curriculum. This could serve to enhance the problem solving learning experience and help meet national technology standards of education.
The study used a pre-test/post-test control group design. The whole study lasted two school weeks, including all testing periods. The pre-assessment included two standardized tests, the
hands-on problem solving test, and a survey collecting information from the students
on problem solving skills, attitudes and math abilities. Following the two days of pre-testing, the experimental group received computer troubleshooting training sessions, which were held for forty-five minutes in the morning before school over the course
of five days. The control group received no training. Following the training, the
students from the control and experimental groups were post-tested. The post-assessment included two standardized tests, the hands-on problem solving test and a
group interview modeled from the survey. The pre-tests and post-tests were
compared through statistical analysis, graphs and tables, as well as focusing on each hypothesis individually.

Hypothesis 1

The researcher used the findings from the troubleshooting activity as a means to assess the students' ability to troubleshoot computers. The final activity consisted of six stations, each presenting a different common computer problem for the students to assess and solve. The
students worked in teams of two and solved all six problems, with the exception of
one team who, due to time constraints, was only able to solve five of the six problems. The teams recorded on their worksheets the identified problem and the measures they used to fix the problem. Through the computer training sessions, students demonstrated the ability to solve common computer problems by reattaching power cords and peripheral devices, installing software programs, and replacing missing or disconnected components.
The findings from the group interview were used to assess whether the students had learned to troubleshoot computers. During the group interview, students in the experimental group explained the characteristics and
requirements of the computer training sessions to the students in the control group.
Students discussed taking apart computers, fixing the problems and installing
programs. When asked if they would be able to assist their teacher with a computer problem in the future, all of the students responded positively.
Findings from these two methods of assessment would suggest that students can learn to troubleshoot and repair common computer problems.

Hypothesis 2
The researcher used findings from the Methods Used section of the POPS test
as a means to assess problem solving methods. Within the Methods Used section, only one student in the control group improved their score, by one point, and the average score for the group decreased 0.167 points between the pre-test and post-test. The students in the experimental group all improved by one point or more, creating an average improvement of 2.5 points. The second source of findings used to assess whether the experimental group improved their problem solving methods was the hands-on
problem solving test. The average number of attempts for the easy problem and the
difficult problem was analyzed for each group. The experimental group used more
attempts on average in the post-test than in the pre-test, while the control group
achieved opposite results. The time necessary to complete the easy problem was also
compared using average pre-test and post-test times of both groups. The
improvement time for the control group and the experimental group were very
similar, varying by only 2.8 seconds. The last set of findings in the hands-on problem
solving test analyzed for the second hypothesis, were the number of students in each
group able to solve the difficult problem. Four students in the experimental group
solved the difficult problem correctly, while only two students in the control group did so.
Findings from the Methods Used section of the POPS would suggest that students can improve their problem solving methods by learning to troubleshoot and repair computers. Findings from the hands-on problem solving test were found to be minimal and insignificant.

Hypothesis 3

The researcher assessed the most difficult procedure in problem solving for
elementary students by using findings from the Profiles of Problem Solving test.
Since the POPS test assessed students on different elements of problem solving, each category was analyzed separately using the pre-test scores of the students. The most difficult procedures of problem solving, in order of difficulty, were correctness of answer, extracting information and methods used.
The second source of findings used to discover the most difficult problem
solving procedure was the survey/group interview. When asked what the most difficult part of solving a problem was, out of five procedures, eight out of the twelve students responded with "understanding the question", making it the most common response.
Findings from the POPS test would suggest that students have the most
difficulty with correctness of answer, extracting information and methods used in the problem solving process. Findings from the group interview would suggest that students have the most difficulty with understanding the question.

Hypothesis 4
Information was collected from the IOWA Basic Skills math problem solving
and data interpretation test and the group interview, in order to analyze whether
mathematical ability would be affected by the training sessions. The IOWA test
results indicated that the average scores of the control group and the experimental group showed little difference. During the group interview, however, students made references to figuring out the problem and finding a solution to the problem. One student also compared the troubleshooting strategies to strategies in math.
Little evidence was found to support the improvement of math skills within the experimental group.

Hypothesis 5

Findings were collected from the Profiles of Problem Solving test and the hands-on problem solving test. The POPS pre-test and post-test total
scores were compared between genders, as well as each category of the POPS test.
The females improved from a mean total score of 26.5 to 32.5, while the males
retained a constant total mean score of 24.3 between the pre-test and the post-test.
When separated by gender and group, the females in the experimental group
improved by an average of 22% between the pre-test and the post-test, while the
females in the control group improved by an average of 14%. The males in the
experimental group improved by an average of 12%, while the males in the control
group decreased their score by an average of 11%. Throughout the POPS categories,
the females outperformed the males in overall scores and in the improvement of scores between the pre-test and the post-test.
Additional results were collected from the hands-on problem solving test. The
researcher compared the completion time for the easy problem from the pre-test to the
post-test. The males in the experimental group improved their completion time of the
easy problem by a greater percentage than the males in the control group. However,
the females in the control group demonstrated greater improvement than the females
The findings for the number of attempts to solve each problem were also studied from the hands-on problem solving test to analyze the effects of gender on the results of the study. The males in the experimental group were the only group to have increased their average number of attempts in the post-test as compared to their pre-test average attempts.
The findings from the Profiles of Problem Solving test would suggest that
there is little or no difference between gender-based results. Findings from the hands-on problem solving test presented conflicting data: the female control group and the
male experimental group were able to improve in multiple areas. The findings were
inconclusive.
Hypothesis 6

Students rated as having high, medium and low problem solving abilities were compared to evaluate if the variable was a significant factor.
The students in the high and medium experimental group improved their total POPS
score by an average of eight points, while the high and medium control group
improved their total POPS score by an average of three points. The low problem solving group decreased their average total POPS score from the pre-test to the post-test. When the ability groups were analyzed by treatment group, the experimental
group's improvement was more significant between the pre-test and post-test than the
control group.
The hands-on problem solving test exhibited continuity across the ability
problem solving groups. Each level of high, medium and low students demonstrated
a similar improvement in time completion of the easy problem. High ability level
students in the experimental group were the only students to increase their average
number of attempts between the pre-test and the post-test. The number of students
able to solve the difficult problem was also analyzed, but results were inconclusive due to the small sample size.
When comparisons of the high, medium and low problem solving ability
students were made, the findings from the POPS test would suggest that regardless of
problem solving ability levels, students can improve their problem solving methods
by learning to troubleshoot and repair computers. The findings also indicated that the
high and medium ability level students in the experimental group were able to
improve their scores more than the low ability student. Findings from the hands-on problem solving test were minimal and inconclusive.

Hypothesis 7

Information from the Profiles of Problem Solving test, the hands-on problem
solving test and the survey/interview were all used to investigate the final hypothesis.
The mean total POPS score was averaged for each group, comparing the pre-test total
score to the post-test total score. The findings demonstrated a 25% increase for the experimental group, which reached an average post-test score of 31 points. All individual sections of the POPS test favored the experimental group, except the correctness of answer section. Both groups demonstrated similar improvements in the time completion of the easy problem within the hands-on problem solving test. The experimental group increased their average number of attempts by 20% from the pre-test to the post-test, while the control group achieved opposite results, decreasing in the average number of attempts by 5%. Only 33% of the control group was able to solve the difficult problem, while 66% of the experimental group solved it.
The last collection of data analyzed to address the hypothesis was the final
group interview. When students were asked if they felt computer troubleshooting
could make an impact on problem solving skills, five out of the six students in the experimental group responded affirmatively. References to figuring out problems and finding solutions were some of the additional
comments given by students concerning the relationship between the two processes.
The findings from the Profiles of Problem Solving test would suggest that
students could improve their problem solving skills by learning to troubleshoot and
repair computers. Findings from the hands-on problem solving test were found to be
minimal and insignificant. The findings from the group interview would suggest that the computer troubleshooting training helped the students' problem solving skills.

Conclusions

Hypothesis 1

The students in the experimental group received training sessions on computer components and repair. The training sessions included the installation of programs, the identification of hardware components, and the repair of
other common computer problems. Once the students received all the training sessions, they were assessed on troubleshooting common computer problems: identifying why the problem exists and how to fix the problem. The final activity consisted of six problems, presenting a different problem at each of the six stations. The teams
needed to study and solve the problem at each station, while recording their answers
on the team worksheet. All three teams were able to solve all six problems, with the
exception of one team who, due to time constraints, was only able to attempt five of
the six stations. The elementary students exhibited their troubleshooting ability by
attempted, students identified the problem, devised a solution and fixed the problem
The group interview was also used to assess whether the elementary students could troubleshoot common computer problems. One student described applying the skills outside the training sessions: "We did it [solved a printer problem] in like art. The printer wasn't working; it was the same problem [we had during the training]. So we pressed the button and it worked." Students in the experimental group stated that they had gained the ability to take computers apart, fix problems and install programs, and described how they attempted common computer problems. The group interview confirmed that students were comfortable with their new ability to solve computer problems. When the students were asked if they could assist their teacher with a computer problem in the future, all of the students responded positively.

Hypothesis 2
The Methods Used section of the POPS test and the hands-on problem solving test were used in order to assess whether the experimental students improved their
problem solving methods. Most control students' scores either remained constant or
decreased in their score between the pre-test and the post-test results, while all the students in the experimental group improved their scores, producing an average improvement of 2.5 points, or 33%, between the pre-test and the post-test results. The experimental group showed a clear improvement in the test results after the training sessions. The computer training
sessions made a difference in the problem solving methods used by students in the
experimental group.
Overall, the hands-on problem solving test was not a significant factor
towards improving the problem solving methods. The number of attempts varied
widely between the groups. The experimental group demonstrated an increase in the
average number of attempts between the pre-tests and post-tests, while the control
group showed a significant decrease in the average number of attempts. On average,
the control group improved by completing the easy problem in less time as compared to the experimental group, but the results were minimal and inconclusive. Four of the
students in the experimental group solved the difficult problem in the post-test, while
only two students in the control group solved the difficult problem in the post-test.
However, these results were also minimal, and due to the small number of students, the data is considered insignificant. The researcher believes the findings from the hands-on problem solving test were minimal and do not support the hypothesis.
Hypothesis 3
The most difficult procedure in problem solving for elementary students will be identified. During the survey and group interview, students were asked to name the most difficult part of solving a problem. Eight out of the twelve students responded with
the category "understanding the question". The students classified under this
category used key phrases such as "I don't understand [the question]", "I don't know
what the question is asking you to do", and "what the problem is looking for". Based
on these responses, students had difficulty determining what they needed to do to solve the problem. The POPS test was also used to identify the most difficult procedures in problem solving. Each component of the problem solving process was analyzed by observing the pre-test scores of students. In the POPS test, the categories consisted of correctness of answer, methods used, accuracy, extracting information and quality
of explanation. The two categories most similar to "understanding the question" were
methods used and extracting the information. Findings from the POPS test results
indicated that the most difficult procedures for the students, in order of difficulty, were correctness of answer, extracting information and methods used. Both sources suggest that students had the most difficulty with understanding and setting up the problem in order to solve it.
Hypothesis 4
The information collected from the IOWA Basic Skills math problem solving and data interpretation test showed little difference between the groups. The group interview, however, suggested a connection between mathematics and computer troubleshooting. Students made verbal and written references to figuring out the problem and finding a solution to the problem. One student also made the connection between troubleshooting strategies and math strategies, writing, "I think [computer troubleshooting] helped [my problem solving skills] by learning strategies like in math." Little evidence, however, was found to support an improvement in math skills as a result of the computer troubleshooting. The findings do not suggest that students improved their math abilities by learning to troubleshoot and repair computers.
Hypothesis 5
Gender will not impact the problem solving ability of elementary students
The existence of the gender gap in computer usage has been shown through
multiple studies indicating that females lack positive educational experiences with
computers (Burge, 2001). The major form of assessment used in analyzing possible
gender differences was the Profiles of Problem Solving test. Females in the
experimental group improved by 22% and the females in the control group only
improved by 14%. Likewise, the males in the experimental group improved by 12%,
whereas males in the control group decreased by 11% in their total POPS score. It is
difficult to conclude whether males or females were impacted more through the
training sessions, although it is possible to conclude that the training sessions made an impact on both genders.
The hands-on problem solving test provided additional findings, but the
results were difficult to evaluate. The males in the experimental group improved more than
the males in the control group regarding the completion time of the easy problem
from the pre-test to the post-test. However, the females in the control group
improved more than the females in the experimental group regarding the completion
time of the easy problem from the pre-test to the post-test. The number of attempts to
solve each problem was another source of data evaluated to analyze the effects of
gender on the results of the study. The males in the experimental group were the only
group to have increased their average number of attempts in the post-test as compared
to their pre-test average attempts. Due to the small sample and the nature of the test, the findings drawn from the hands-on problem solving test were inconclusive.
Hypothesis 6
The students rated as the high, medium and low problem solving ability levels
by the teachers in the experimental group improved their total POPS score from the
pre-test to the post-test more than the control group with similar ability levels.
Therefore, the students in the experimental group were able to outperform every
student with equivalent ability levels in the control group, demonstrating the positive
educational impact of the training session. Based on the POPS data, the students
rated with high and medium ability levels were able to achieve better results than the students rated with low ability levels.
Both students rated as low ability problem solvers achieved lower total scores
on their POPS post-test than on the pre-test. However, the scores of the student in
the experimental group identified as a low ability problem solver decreased less than
the control student classified as a low ability problem solver. Interestingly enough,
both students in the low problem solving group stated in the group interview that they did not feel the training helped their problem solving skills. Both students felt computer troubleshooting would have no impact, stating that "because I was just learning [to] put [together] and take apart [a
computer]". The researcher hypothesizes that the students rated as low level problem
solvers by their teachers may only have been able to function at a concrete level of learning.
In the hands-on problem solving test, students at each ability level demonstrated a similar improvement in the time completion of the easy problem from the pre-test to the post-test. More of the low and medium problem solving ability students were able to solve
the difficult problem in the hands-on problem solving test. The researcher
hypothesizes that students rated as having low and medium ability levels may have
excelled on this portion due to the hands-on manipulation of the test. The hands-on
problem solving test produced results, but the conclusions were questionable due to the small sample size.
Based on the findings, students rated as having high, medium and low
problem solving ability by their teachers improved their problem solving ability
within the experimental group as opposed to the control group. However, evaluating
which ability group improved their problem solving ability the most was difficult to determine.

Hypothesis 7

Students who learn to troubleshoot and repair computers will demonstrate greater improvements in problem solving ability than students who do not receive the training. Comparing the POPS pre-test total score to the post-test total score, the experimental group improved by 20%,
while the control group only improved 3%. The students in the experimental group
demonstrated a larger improvement in all sections of the POPS test, except the
correctness of answer section. A paired samples t-test was run on the total POPS score data, and a significant difference existed between the experimental group's scores and the control group's scores (See Appendix S). The students in the experimental group demonstrated greater improvement in problem solving than the students in the control group.
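For readers who wish to reproduce this kind of analysis, the sketch below shows how a paired samples t-test could be run with modern tooling. The individual score vectors are hypothetical placeholders (the thesis reports only the summary statistics in Appendix S), so the printed values will not match the study's.

    # Hedged sketch of a paired samples t-test on total POPS scores.
    # The pre/post vectors below are hypothetical (n = 12, as in the study).
    from scipy import stats

    pre  = [25, 18, 30, 22, 27, 35, 20, 24, 31, 19, 28, 26]  # hypothetical scores
    post = [28, 20, 36, 25, 30, 41, 19, 27, 38, 21, 33, 23]  # hypothetical scores

    t_stat, p_value = stats.ttest_rel(pre, post)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")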
In the hands-on problem solving test, the experimental group increased their
average number of attempts during both problems from the pre-test to the post-test,
while the control group decreased in their average number of attempts. The difficult
problem was solved by 66% of the experimental group, while only 33% of the control
group was able to solve the difficult problem. The findings from the hands-on problem solving test were inconclusive, because while the experimental group contained more students who solved the difficult problem, the numbers were too small to be significant.
In the group interview, students were asked whether they felt the computer troubleshooting made an impact on their problem solving skills. Five of the
six students in the experimental group believed that the computer troubleshooting helped their problem solving skills. Based on these findings, the computer troubleshooting training made a difference in the general problem solving skills of the students, regardless of their problem solving abilities.
The Human Subjects Institutional Review Board was helpful throughout the approval process. The Board offered to meet with the researcher on several different occasions, and revisions were made to the study, in order for the students to gain more hands-on experience with the internal hardware components.
Haigh Elementary School provided an excellent setting for the research project. The building and resources supported the research study. The principal, as well as several parents, stated that the study was a "wonderful educational opportunity for the children". The overall response to the project was positive. The training room, Room 4 at Haigh Elementary School, was ideal and provided a chalkboard, desks, paper, pencils and space for the sessions. Students were able to work hands-on with the computers because of the additional computers provided by the University and the school.
The curriculum and the instructor were additional positive factors in the study.
The instructor was A+ certified and a certified elementary educator, meeting the instructional needs of the
students.
The computer troubleshooting training program has multiple implications for the elementary curriculum, as well as for providing assistance to teachers in the computer labs. However, there were many limitations to the study.
The sample size of this study was created for convenience purposes. With any
study, the more subjects that participate, the greater the ability to generalize and the
more credible the statistical tests are in determining the findings. Since the student
sample in this study was small, only twelve students, the findings could not be
generalized widely. The study did not allow for a completely random sample, and
therefore, cannot be generalized to the entire fifth grade population in Dearborn. The
location, demographics of the students, and other variables also limited the
implications of the research. By limiting the current study to only six students in each
group, the researcher was able to provide ample amounts of hands-on, individualized
instruction, as well as manage the large amounts of data collected from each student.
The researcher also encountered difficulties gaining access to students, but was
grateful to the Dearborn Public Schools for allowing the research project to take
place. The time available to assemble and prepare the sample groups was limited
based on the researcher's need to complete the study before the end of the school
year. Future studies should consider implementing a sample size of one hundred
twenty subjects, more time to conduct the training, and more resources for the
The matching of students based on the disaggregates within each group was performed using gender, attendance records and problem solving ability. Gender was not found to be a significant
variable in the findings of the study. Attendance, used to exclude participants from
the research project, was not a factor in the study. The researcher assigned students
as high, medium or low problem solving ability status, based on ratings from their
classroom teachers. The sample was taken from three separate fifth grade
classrooms. Since there was no set definition of high, medium or low ability,
teachers created the ratings based on their own classroom observations and judgments. Future studies could administer a standardized pre-test to address this problem and to control this variable. Through pre-testing, findings from the research could be focused more on directly comparing total scores, rather than comparing the improvement in points for each form of assessment. The researcher would recommend using pre-test results, rather than teacher ratings alone, to categorize students.
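One way future studies could operationalize this kind of matching is a stratified assignment that balances the groups on the recorded disaggregates. The sketch below is illustrative only and is not the procedure the researcher actually used; the roster and ratings are hypothetical.

    # Illustrative only: split a roster into matched control/experimental groups,
    # pairing students by (gender, teacher-rated ability). Roster is hypothetical.
    import itertools
    import random

    students = [
        {"id": i, "gender": g, "ability": a}
        for i, (g, a) in enumerate([
            ("F", "high"), ("F", "high"), ("M", "high"), ("M", "high"),
            ("F", "medium"), ("F", "medium"), ("M", "medium"), ("M", "medium"),
            ("F", "low"), ("F", "low"), ("M", "low"), ("M", "low"),
        ], start=1)
    ]

    key = lambda s: (s["gender"], s["ability"])
    control, experimental = [], []
    for _, block in itertools.groupby(sorted(students, key=key), key=key):
        block = list(block)
        random.shuffle(block)                 # randomize within each matched block
        half = len(block) // 2
        control.extend(block[:half])
        experimental.extend(block[half:])

    print("control:", sorted(s["id"] for s in control))
    print("experimental:", sorted(s["id"] for s in experimental))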
Problem solving skills were difficult to monitor, assess and evaluate. The
most common form of analysis requires subjects to verbally state every step they use
during the problem solving process and the reasoning for those processes. However,
due to time constraints and the difficulty elementary students have in articulating their thought processes, this method was not used. The researcher believed a hands-on problem solving test would be a more valid and effective measure of problem solving skills.
The researcher was unable to locate many problem solving tests, and the testing
methods retrieved from ETS (Educational Testing Service) were not endorsed.
The researcher-designed curriculum presented additional limitations. Because the curriculum was designed and taught by the
researcher, certain biases of the researcher were likely to be a part of the study. In
future studies, care should be taken to eliminate all possible researcher biases. The
researcher feels that curriculum revisions based on suggestions, and videotaping of the training sessions, would help reduce these biases in future studies.
A reliable, cost-efficient hands-on problem solving test was the most difficult assessment to obtain. The only tests located were puzzle tasks involving the use of tangrams. Hands-on intelligence quotient tests were available,
but due to the cost of the tests and lack of funding, alternate tests were developed by
the researcher. The researcher would recommend that additional effort be made to find a more appropriate hands-on problem solving test for future research.
The IOWA Basic Skills test and the Profiles of Problem Solving test were
retrieved from ETS (Educational Testing Service), but were not endorsed. The researcher attempted to select tests that had been used in multiple dissertations, theses
and research projects, to improve the credibility of the study. The researcher would
recommend that additional effort be made to find more appropriate standardized tests for future research.
The researcher was also under time constraints, such as the end of the school
year, HSIRB deadlines and University deadlines for the project. Future research
projects should expand the time frame to four weeks. If students could be provided a
longer training period and more tailored instruction, the students may be able to demonstrate greater improvements in problem solving.
Final Conclusions
Based on the evidence presented in this study, elementary students are capable
of learning how to troubleshoot common computer problems. The most difficult part of the problem solving process for students was "understanding the question" or the problem. When students are provided with a
concrete hands-on problem, it is much easier for the student to establish a problem
and develop strategies and methods to solve the problem. The computer becomes a
valuable tool for the student to manipulate, simplifying the establishment of the
problem and the problem solving procedure. The computer troubleshooting process
also provides students with immediate feedback as to whether they successfully
solved the problem. The researcher believes there is a definite relationship between
computer troubleshooting skills and general problem solving skills. The similarities
of identifying the problem, devising a solution and fixing the problem successfully
exist in both computer troubleshooting and other forms of problem solving. The researcher believes that, with further research, computer troubleshooting has the potential to enhance the problem solving skills of elementary students.

BIBLIOGRAPHY
Burge, K. (2001). UCI Computer Arts: Building Gender Equity while Meeting ISTE NETS. Paper presented at Building the Future, NECC 2001, Chicago, IL.
Chang, S., Hanson, B., & Harris, D. (2001). A Comparison of the Standardization
and IRT Methods of Adjusting Pretest Item Statistics Using Realistic Data. Paper
presented at the Annual Meeting of the American Educational Research Association, Seattle, WA.
Chapman, B. & Allen, R. (1994). Teaching Problem Solving Skills Using Cognitive
Simulations in a PC-Environment. Journal of Interactive Instruction Development,
Spring, 24-30.
Coleman, C., King, J., Ruth, M., & Stary, E. (2001). Developing Higher-Order
Thinking Skills through the Use of Technology. Master of Arts Action Research
Project, Saint Xavier University and Skylight Professional Development Field-Based
Master's Program.
Du, Y. (2002). Sampling Theory and Confidence Intervals for Effect Sizes: Using
ESCI to Illustrate "Bouncing" Confidence Intervals. Paper presented at the Annual
Meeting of the Southwest Educational Research Association, Austin, TX.
Goldman, S., Cole, K. & Syer, C. (1999). The Technology/Content Dilemma. Paper
presented at the Secretary's Conference on Educational Technology-1999. Retrieved
April 2, 2002 from
http://www.ed.gov/Technology/TechConf/1999/whitepapers/paper4.html
Howard, B., McGee, S., & Shin, N. (2001). The triarchic theory of intelligence and
computer-based inquiry learning. Educational Technology Research and
Development, 49 (4), 49-69.
Jereb, J. (1996). The Technical Problem and Its Didactic Function. Paper
presented at the Jerusalem International Science and Technology Education
Conference on Technology for a Changing Future: Theory, Policy and Practice.
Kirkpatrick, H. & Cuban, L. (1998). Computers make kids smarter - right? Technos
Quarterly, 7 (2).
Krebs, D. & Clark, B. (2000). Camp Invention: connects to classrooms. Gifted Child
Today, 23 (3), 28-32, 52.
Lavonen, J., Meisalo, V. & Lattu, M. (2001). Problem Solving with an Icon
Oriented Programming Tool: A Case Study in Technology Education. Journal of
Technology Education, 12 (2), 21-34.
McGrath, D. (1988). Programming and problem solving: Will two languages do it?
Journal of Educational Computing Research, 4 (4), 467-484.
Williams, D., Liu, M., & Benton, D. (2001). Analysis of Navigation in a Problem
Based Learning Environment. Association for the Advancement of Computing in
Education.
Wilson, P., Fernandez, M., & Hadaway, N. (1993). Research Ideas for the Classroom: High School Mathematics. New York: Macmillan.
Appendix A
This letter will serve as confirmation that your research project entitled "Effects of Computer Troubleshooting on Elementary Students' Problem Solving Skills" has been approved under the full category of review by the Human Subjects Institutional Review
approved under the full category of review by the Human Subjects Institutional Review
Board. The conditions and duration of this approval are specified in the Policies of
Western Michigan University. You may now begin to implement the research as
described in the application.
Please note that you may only conduct this research exactly in the form it was approved. You must seek specific board approval for any changes in this project. You must also seek reapproval if the project extends beyond the termination date noted below. In addition, if there are any unanticipated adverse reactions or unanticipated events associated with the conduct of this research, you should immediately suspend the project and contact the Chair of the HSIRB for consultation.
The Board wishes you success in the pursuit of your research goals.
Appendix B
You have been invited to participate in a research project entitled "Computer Troubleshooting and Repair and Effects on Elementary Problem Solving." The purpose of the study is to see if the training with the Computer Troubleshooting and Repair program will help you with your problem solving skills.
Before the training starts, you will be tested using 2 standardized tests, 1 hands-on
problem solving activity and a survey to tell how you feel about problem solving. You
will also be tested after the program using the 2 standardized tests again, the hands-on
problem solving test and an interview, by Mrs. Ottenbreit, to see if you improved in
problem solving. Even if you agree today to participate by signing this form, you can
change your mind at any time once we begin training or at any time during the training.
You are volunteers and you are free to stop participating whenever you want, and there will be no penalty if you stop.
Your name will not be on any of the forms or videotapes. The researcher will use
a code number instead. The researcher will keep a list of names and code numbers that
will be destroyed once the researchers have recorded the important information. If you
have any questions or concerns about this study, you may contact Anne Ottenbreit at 730-3130.
Appendix C
[HSIRB approval stamp: APR 16 2003]
Dear Parents,
The fifth grade students at Haigh Elementary School have the opportunity to participate
in a research project conducted by a graduate assistant in educational technology from
Western Michigan University during the month of May. This research project will
include 12 5th grade students who volunteer to be in the study. However, only six
students will be selected for the experimental group, which are the students who will be
assigned to receive the training, and six students will be selected to participate in the control group, which will not participate in the training sessions. It is important to have
some students who are trained and some who are not trained so we can assess the value
of the training. A selection process will be used to choose the six students to participate
in the project if more than 12 students volunteer. The selection process will be performed by the researcher, based on the following criteria: gender, attendance records, and
problem solving ability. The control group will be an important part of the project,
measuring against the experimental group to see if the training program can make a
difference.
The researcher believes the training sessions will have an effect on the students' problem
solving skills and math ability. These children will learn about the different parts of a computer and their functions, and will be taught to troubleshoot if the computer is not
working. They will be using a hands-on approach and working with an actual computer.
The children will be instructed by the Western Michigan University graduate assistant
and supervised by Mrs. Ottenbreit. It will require that these students attend a two-week
course, five days a week from May 19th through May 30th, 2003. The computer training
will be done in Room 4, Mrs. Ottenbreit's room, from 8:00 A.M. until 8:45 A.M. Prior to
and at the end of the training sessions, these children will be tested to see if the computer
training has improved their problem solving skills as opposed to those that did not receive
the training. Your child must be able to attend all ten sessions.
All students who volunteer for the study will be provided with a Creative Computer
Night, making their own animated/narrated story on CD. Each student will be able to
keep the CD. This should be a wonderful opportunity and learning experience for your
child!
The 12 students who participate in the study will be assessed on their problem solving skills and math problem solving skills. The pre- and post-tests will take about one hour
and will be done in a relaxed and positive testing climate. The testing results will be kept
confidential and the students' names will not be used in the research report. The results
of the training sessions and the project will be shared with the classroom teachers,
however, no individual student results will be released to the teacher.
If you have any questions, please contact Mrs. Ottenbreit at 730-3130. Please return the
permission slip below by Friday, May 16th, 2003. Thank you.
Sincerely,
---------------------------------------------------------------------------------------------------
Problem Solving Research Project
Western Michigan University
Student
Date
Appendix D
Your child has been invited to participate in a research project entitled "Effects of Computer
Troubleshooting and Repair on Elementary Problem Solving Skills." The purpose of the study is
to determine the usefulness of computer troubleshooting and repair curriculum in preparing
elementary students in problem solving skill development. This project is being conducted to
fulfill Anne Ottenbreit's thesis requirement.
Your permission for your child to participate in this project means that your child will be
administered the IOWA Basic Math Skills Test and the Profiles of Problem Solving
Standardized Test. The testing will take place during May and will involve about one hour. Your
child will also be administered a hands-on problem solving test twice, as a pre- and post-test.
The process will be videotaped in order to document problem solving skills. All tests will be
conducted in a positive testing environment by Sharon Ottenbreit. Your child will also be taken
through computer training sessions, which will last 2 weeks starting on May 19th. Your child
will be free at any time -- even during the test administration -- to choose not to participate. If
your child refuses or quits, there will be no negative effect on his/her school programming. The
test results will be used to establish a baseline data collection providing the researcher with
information on current levels of problem solving. The results will help with subject selection for
the control group and experimental group. Although there may be no immediate benefits to your
child for participating, there may eventually be benefits to the school district and subsequently to
students in technology education programs. The researcher believes the training sessions may
have a positive effect on problem solving skills. If the results of the actual project are found to
be useful, then current technology education program could be modified to include repair and
troubleshooting within the curriculum.
All test data and information will remain confidential. That means that your child's name will be
omitted from all test forms and a code number will be attached. The principal investigator will
keep a separate master list with the names of the children and the corresponding code numbers.
If the researchers find that these two tests are useful for planning your child's programming, they
will share the results with your child's teacher, unless specified otherwise. Once the data are
collected and analyzed, the master list will be destroyed. All other forms will be retained for at
least three years in a locked file in the principal investigator's office. No names will be used if
the results are published or reported at a professional meeting.
The only risks anticipated are minor discomforts typically experienced by children when they are
being tested (e.g., boredom, mild stress owing to the testing situation). All of the usual methods
employed during standardized testing to minimize discomforts will be employed in this study.
The other possible risks are those involved in computer repair (e.g., minor scrapes, and in
extreme rare cases, electrostatic shock). All computer safety methods will be exhibited by the
instructors. More information can be provided, if requested, on the exact safety measures taken by the researcher.
Appendix E
Risks to Subjects
Potential risks to the subjects are as follows:
1. Subjects could miss academic learning time in the classroom in order to
participate in brief testing.
2. Students could experience frustration or boredom during the testing and/or
training.
3. Subjects could feel uncomfortable about videotaping in the classroom.
4. Students could experience minor scrapes due to the hands-on nature of the
program.
5. Students, in extremely rare cases, could receive an electrostatic shock.
Protection of Subjects
Possible risks to subjects are extremely limited due to the precautionary safety procedures. Student risks could include slight boredom from the tests. Other risks could
include minor scrapes from computer edges and in severe cases, slight electric static
shocks. However, the instructor has carefully designed the curriculum to reflect safety
precautions expressed in the A+ technician requirements. The instructor is a certified
elementary educator and a certified computer technician who has built her own computer, and
is able to monitor safety of the small experimental group. There will also be an
additional elementary school teacher to assist in safety features during the training
sessions and testing sessions.
In order to minimize the potential risks listed above, the following precautionary measures will be taken:
1. The researcher will attempt to limit testing time, or utilize a period in the day
when students would be conducting non-learning activities. The training will be
taking place before school, so training will not use academic learning time in the
classroom.
2. The subjects will be reminded that the test needs to be completed to the best of
their ability. Teacher will reiterate: "This is not a grade, I just want to find out
what you already know, so I don't teach you something you already know."
The teacher will also monitor subjects for signs of boredom or frustration during
test taking procedures. Students experiencing these problematic feelings will be
allowed to take a break and try again. If this does not succeed, students may stop
taking the test. The instructor will monitor students experiencing boredom or
frustration during the training procedures. The difficulty of the program will be
adapted accordingly. The students are broken up into pairs and this is easily
manageable, plus the grouping may keep otherwise bored/frustrated students
engaged.
3. The videotape will be set up prior to taping so as not to disrupt the training. If
the student continues to feel uncomfortable, the instructor will move the video
camera to a hidden location.
4. Students will be instructed on the first day how to handle the computer
equipment. Every time before they are allowed to interact with the machine, they
will be reminded of the metal parts and how to safely manipulate the equipment.
The instructor will model careful manipulation of the materials, to illustrate to the
students how to correctly handle the equipment. Band-aids and first aid kits will be kept on hand at all times.
5. Students will be instructed on the first day how to handle the equipment. Students
will remember to ground themselves, which requires touching a table or an object which is not electrostatically charged, and to put on the ESD strap, which grounds them to a table, before working on the computer. There is no supposed risk if the
computer is not plugged in. The instructor will remove all power cords prior to
the beginning of the training and place them in a safe container in the room.
When students reach the point in the training where they attempt to troubleshoot the hardware problems, the instructor will then place the power cord
back into the computer, model the safety techniques, ensure the students complete
the same procedure and then work together. In case of extreme rare
circumstances, a cellular phone will be kept on person at all times, as well as
student emergency slips, in case of emergency.
The instructor is a certified computer technician through CompTIA A+ certification
training and testing. The A+ curriculum learning objectives included:
• This domain requires the knowledge of safety and preventive maintenance. With regard to safety,
it includes the potential hazards to personnel and equipment when working with lasers, high
voltage equipment, ESD, and items that require special disposal procedures that comply with
environmental guidelines. With regard to preventive maintenance, this includes knowledge of
preventive maintenance products, procedures, environmental hazards, and precautions when
working on microcomputer systems.
• Identify the purpose of various types of preventive maintenance products and procedures and
when to use and perform them: Liquid cleaning compounds; Types of materials to clean contacts
and connections; Non-static vacuums (chassis, power supplies, fans).
• Identify issues, procedures and devices for protection within the computing environment, including people, hardware and the surrounding workspace: UPS (Uninterruptible Power Supply) and suppressors; Determining the signs of power issues; Proper methods of storage of components for future use; Potential hazards and proper safety procedures relating to lasers, high-voltage equipment, power supplies, and CRTs. Special disposal procedures that comply with environmental guidelines: Batteries; CRTs; Toner kits/cartridges; Chemical solvents and cans; MSDS (Material Safety Data Sheet). ESD (Electrostatic Discharge) precautions and procedures: What ESD can do, how it may be apparent, or hidden; Common ESD protection devices; Situations that could present a danger or hazard.
(http://www.comptia.org/certification/A/default.asp)
Appendix F
Student Name:
Code Number:
Circle Test:
Survey
Interview
Appendix G
Survey
ID Number: ________

Date: ________
Directions: Below are some statements about your own thinking about problem solving.
There are no right or wrong answers. Use the following scale and write the
number which best describes how much the statement is like you. Please
answer honestly and do not skip any statements.
1 = Not me at all; 2 = Very little like me; 3 = A little like me; 4 = Kind of like me; 5 = A lot like me; 6 = Describes me perfectly
__ 1. Before trying to solve a problem I try to compare it to one that I have solved
before.
__ 5. Before trying to solve a problem I say the information over again in my own
words.
__ 7. When I am stuck on a problem I ask myself, "Did I look at all of the important
information in the question?"
Questions 8, 9 and 10 will help me figure out how you solve problems. Answer them in
your own words.
10. When you don't know what the solution is, what can you do? ________
Appendix H
Interview
ID Number:
INTERVIEW FORMAT
1 = Not me at all; 2 = Very little like me; 3 = A little like me; 4 = Kind of like me; 5 = A lot like me; 6 = Describes me perfectly
__ 1. Before trying to solve a problem I try to compare it to one that I have solved
before.
__ 5. Before trying to solve a problem I say the information over again in my own
words.
__ 7. When I am stuck on a problem I ask myself, "Did I look at all of the important
information in the question?"
Appendix I

Code Sheet
Access Allowed to Dr. Howard Poole, Anne Ottenbreit and Sharon Ottenbreit
Students will be placed into the slots as they turn in their permission slips to participate in
the project. The control group will be the last six, and will be decided after the testing.
The testing information will be kept in a separate database, assigning information to the
student identification numbers only.
Student Number   Group
10               E
6                E
8                E
4                E
2                E
12               E
7                C
5                C
1                C
11               C
9                C
3                C
Appendix J
AMOUNT OF TIME

How long did the student take to fully solve the problem? (Measured in minutes; students receive 5 minutes for each problem.)

0:15  0:30  0:45  1:00
1:15  1:30  1:45  2:00
2:15  2:30  2:45  3:00
3:15  3:30  3:45  4:00
4:15  4:30  4:45  5:00

Did not solve at all __

Did not solve correctly __

Solved correctly __

METHODS
Appendix K
RESEARCHER (To the group): The first question is: [reads question]. So... if that's kind of like you, go ahead and fill it out like it says on the top. Do you guys want to fill out the first seven yourself and then we'll talk about the last few together?
(To another student) You're all done... I have one more thing for you.
(To the group) When you're done, can you go ahead and flip it over so I know you're all done?
[Talk about bathroom...]
Can you wait just a couple more minutes?
[Student 11 walks in.]
RESEARCHER: You made it just in time.
2: Why did you come in now?
11: I forgot.
The noise made the talking between the students difficult.
RESEARCHER (to the second graders): Second graders, can you do me a big favor?
Can you keep your voices down because we are trying to do a videotape back here? So
can you do whatever work you are supposed to do, you're supposed to be studying your
spelling words? Thank you.
RESEARCHER (to the group): So turn it over now. Number 8, if you turn to the first page, the back of the first page. It's right on the top line there. Go ahead and write down a few things. Then we're going to share them as a group. Go ahead and flip it over when you're done and then we'll talk about it later. So what are some different ways that you guys try to solve problems?
4: I look for all of the things I need and then I try and figure out the problem.
RESEARCHER: You guys can just put your pens and stuff down. We're just talking right now.
8: I try and find out what strategies to use.
RESEARCHER: Ok, what kind of different strategies do you use?
8: Like um whether to multiply, or add ...
RESEARCHER: Ok so you're trying to extract out all the information.
2: I try and find out all the information that I need and I sometimes ...
4: I think yes because we actually learned like, cause you didn't help us that much. You
just kind of took apart the computer and we had to think of all the parts that were missing
and stuff.
8: I had to figure out what was wrong with the computer. And it was a lot like trying to
figure out the problem.
2: Yes because we had to figure out what the problem was. [reading]
12: No not really because it was just taking apart the computer.
6: I think that it would help with like strategies and stuff because like we had to use different strategies. [reading]
RESEARCHER: OK so different strategies that you had to use. Why don't you guys try
to tell them a little bit about what we did?
4: The first couple days we were just studying like what parts of the computers were the
computers. We took apart the computers and um ... and put them back together. Then
like the last day, she took apart a computer and we had to put it back together with all the
parts.
6: We had to go to like stations and we had to figure out what it was and fix it.
RESEARCHER: You had to fix it. You had to figure out what it was first and then you had to fix it. So you had to identify the problem.
2: We had to install and uninstall a program.
RESEARCHER: Did the people who went through the computer troubleshooting, did
you guys have fun doing that?
All: Yeah.
RESEARCHER: Do you think that if your teacher had a problem with the computer that
you could fix it?
All: Yeah.
RESEARCHER: So now you can help your teacher out in lab?
4: We did it in like art. The printer wasn't working; it was the same problem as here. So we pressed the button and it worked.
RESEARCHER: Alright there you go. That's fantastic guys! Thanks a lot. I really
appreciate it.
Appendix L
Appendix M
Purpose
"To provide a comprehensive assessment of student progress in the basic skills."
Population
Grades K.1-1.5, K.8-1.9, 1.7-2.6, 2.5-3.5, 3, 4, 5, 6, 7, 8-9 (Levels 5, 6, 7, 8, 9, 10, 11, 12, 13, 14).
Scores
Vocabulary, Listening, Language, Language Total, Mathematics, Core Total, Word Analysis (optional),
Mathematics Advanced Skills, Mathematics Total, Reading Advanced Skills, Reading Total, Reading,
Listening Language, Mathematics Concepts, Mathematics Problems, Mathematics Computation [optional],
Social Studies, Science, Sources of Information, Composite, Language Advanced Skills, Mathematics
Advanced Skills, Survey Battery Total, Reading Comprehension, Spelling, Capitalization, Punctuation, Usage
and Expression, Mathematics Concepts and Estimation, Mathematics Problem Solving and Data Interpretation,
Mathematics Total, Maps and Diagrams, Reference Materials, Sources of Information Total, Composite.
Time
(130-310) minutes for Complete Battery; (100) minutes for Survey Battery
Comments
Part of Riverside's Integrated Assessment System; Braille and large-print editions available
Appendix N
Purpose
"An assessment of mathematical problem solving designed for children in upper primary school".
Population
Grades 4-6.
Price
1993 price data: $75 per manual and photocopiable masters.
Administration
Group
Scores
5: Correctness of Answer, Method Used, Accuracy, Extracting Information, Quality of Explanation.
Manual
Manual, 1993, 64 pages
Time
32-40 minutes
Comments
Appendix O
Appendix P
Week 1:
Day 1: Pre-Testing
Day 2: Pre-Testing/Getting to Know You and the Computer
Day 3: Lesson 1: Outer Hardware, Intro to Hardware on the Inside
Day 4: Lesson 2: Hardware on the Inside
Day 5: Lesson 3: Storage, Files and Folders, The Windows Desktop
Week 2:
Day 1: Lesson 4: Knowing Your System, Programs, Operating Systems, Computer Care
and Safety
Day 2: Lesson 5: Troubleshooting Real Problems
Day 3: Post-Testing
Day 4: Post-Testing
Appendix Q
I. Researcher informs students and parents of project through the initial letter.
III. Researcher agrees to answer any questions parents or students may have.
A. Researcher provides a copy to sign and a copy for the parent to keep.
V. If parent does not wish to sign the Parental Consent Form, there is no further
action taken.
Appendix R
Task                                                        Date
11. Consent form/permission slip returned for               May 13th - May 16th, 2003
    participation in computer training
13. Begin 1st week of training with experimental group:    May 19th, 20th, 21st, 22nd,
    Operating Systems                                       and 23rd, 2003
14. Begin 2nd week of training with experimental group:    May 26th, 27th, 28th, 29th,
                                                            and 30th, 2003
18. Meeting with Dr. Poole on Final Wrap-Up #1              June 19th, 2003
19. Meeting with Dr. Poole and Dr. Leneway for              July 3rd, 2003
    Final Wrap-Up #2
Appendix S
Paired Samples Statistics (all students, N = 12)

                                 Mean      N    Std. Deviation   Std. Error Mean
Pair 1   POPS: Total: Pre        25.4167   12   7.26709          2.09783
         POPS: Total: Post       28.4167   12   9.89452          2.85630

Paired Samples Correlations

                                                  N    Correlation   Sig.
Pair 1   POPS: Total: Pre & POPS: Total: Post     12   .814          .001

Paired Samples Test

                                    Paired Differences
                                    Mean      Std. Deviation   Std. Error Mean   95% CI Lower   95% CI Upper   t        df   Sig. (2-tailed)
Pair 1   POPS: Total: Pre - Post    -3.0000   5.79969          1.67423           -6.6849        .6849          -1.792   11   .101
Paired Samples Statistics (control group, n = 6)

                                 Mean      N   Std. Deviation   Std. Error Mean
Pair 1   POPS: Total: Pre        25.5000   6   8.24015          3.36403
         POPS: Total: Post       25.8333   6   9.74508          3.97841

Paired Samples Correlations

                                                  N   Correlation   Sig.
Pair 1   POPS: Total: Pre & POPS: Total: Post     6   .793          .060

Paired Samples Test

                                    Paired Differences
                                    Mean     Std. Deviation   Std. Error Mean   95% CI Lower   95% CI Upper   t       df   Sig. (2-tailed)
Pair 1   POPS: Total: Pre - Post    -.3333   5.95539          2.43128           -6.5831        5.9165         -.137   5    .896
Paired Samples Statistics (experimental group, n = 6)

                                 Mean      N   Std. Deviation   Std. Error Mean
Pair 1   POPS: Total: Pre        25.3333   6   6.94742          2.83627
         POPS: Total: Post       31.0000   6   10.21763         4.17133

Paired Samples Correlations

                                                  N   Correlation   Sig.
Pair 1   POPS: Total: Pre & POPS: Total: Post     6   .924          .008

Paired Samples Test

                                    Paired Differences
                                    Mean      Std. Deviation   Std. Error Mean   95% CI Lower   95% CI Upper   t        df   Sig. (2-tailed)
Pair 1   POPS: Total: Pre - Post    -5.6667   4.63321          1.89150           -10.5289       -.8044         -2.996   5    .030
Means

Case Processing Summary

                                       Cases
                                       Included        Excluded        Total
                                       N     Percent   N     Percent   N     Percent
POPS: Total: Pre * GROUP * GENDER      12    100.0%    0     .0%       12    100.0%
POPS: Total: Post * GROUP * GENDER     12    100.0%    0     .0%       12    100.0%
Report

GROUP   GENDER                     POPS: Total: Pre   POPS: Total: Post
C       F       Mean               23.3333            27.0000
                N                  3                  3
                Std. Deviation     8.14453            8.54400
        M       Mean               27.6667            24.6667
                N                  3                  3
                Std. Deviation     9.45163            12.66228
        Total   Mean               25.5000            25.8333
                N                  6                  6
                Std. Deviation     8.24015            9.74508
E       F       Mean               29.6667            38.0000
                N                  3                  3
                Std. Deviation     7.57188            7.54983
        M       Mean               21.0000            24.0000
                N                  3                  3
                Std. Deviation     2.64575            7.54983
        Total   Mean               25.3333            31.0000
                N                  6                  6
                Std. Deviation     6.94742            10.21763
Total   F       Mean               26.5000            32.5000
                N                  6                  6
                Std. Deviation     7.84219            9.39681
        M       Mean               24.3333            24.3333
                N                  6                  6
                Std. Deviation     7.20185            9.33095
        Total   Mean               25.4167            28.4167
                N                  12                 12
                Std. Deviation     7.26709            9.89452
Paired Samples Statistics

                         Mean      N   Std. Deviation   Std. Error Mean
Pair 1   PRE_CONT        25.5000   6   8.24015          3.36403
         PRE_EXP         25.3333   6   6.94742          2.83627
Pair 2   POST_CON        25.8333   6   9.74508          3.97841
         POST_EXP        31.0000   6   10.21763         4.17133

Paired Samples Correlations

                                  N   Correlation   Sig.
Pair 1   PRE_CONT & PRE_EXP       6   -.583         .224
Pair 2   POST_CON & POST_EXP      6   -.538         .271

Paired Samples Test

                                  Paired Differences
                                  Mean      Std. Deviation   Std. Error Mean   95% CI Lower   95% CI Upper   t       df   Sig. (2-tailed)
Pair 1   PRE_CONT - PRE_EXP        .1667    13.52652         5.52218           -14.0285       14.3619         .030   5    .977
Pair 2   POST_CON - POST_EXP      -5.1667   17.50905         7.14804           -23.5413       13.2079        -.723   5    .502
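The t statistics, confidence intervals, and p values in the tables above can be re-derived from the printed summary statistics alone. The following short Python sketch (an illustration added here, not part of the SPSS output; it assumes SciPy is available) reproduces the all-student and experimental-group results:

    from math import sqrt
    from scipy import stats

    def paired_t_from_summary(mean_diff, sd_diff, n):
        """Paired-samples t test from the mean and SD of the paired differences."""
        se = sd_diff / sqrt(n)              # standard error of the mean difference
        t = mean_diff / se
        df = n - 1
        p = 2 * stats.t.sf(abs(t), df)      # two-tailed significance
        half = stats.t.ppf(0.975, df) * se  # half-width of the 95% confidence interval
        return t, df, p, (mean_diff - half, mean_diff + half)

    # All students (n = 12): tables report t = -1.792, df = 11, p = .101
    print(paired_t_from_summary(-3.0000, 5.79969, 12))
    # Experimental group (n = 6): tables report t = -2.996, df = 5, p = .030
    print(paired_t_from_summary(-5.6667, 4.63321, 6))

Both calls reproduce the tabled values, including the 95% confidence intervals (-6.6849, .6849) and (-10.5289, -.8044).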
Appendix T
OVERVIEW OF TECHNOLOGY
CONTENT STANDARDS
All students will:
Use and transfer technological knowledge and skills for life roles (family member, citizen, worker, consumer, lifelong learner) [standard strand: Using and Transferring];
More detailed information concerning the technology standards can be found at:
http://www.michigan.gov/documents/Technology 11594 7.htm
Standard 1.2 Variability and Change
Students describe the relationships among variables, predict what will happen to one variable as another
variable is changed, analyze natural variation and sources of variability, and compare patterns of change.
Variability and change are as fundamental to mathematics as they are to the physical world, and an
understanding of the concept of a variable is essential to mathematical thinking. Students must be able to
describe the relationships among variables, to predict what will happen to one variable as another variable
is changed, and to compare different patterns of change. The study of variability and change provides a
basis for making sense of the world and of mathematical ideas.
Knowing what data to collect and where and how to collect them is the starting point of quantitative
literacy. The mathematics curriculum should capitalize on students' natural curiosity about themselves and
their surroundings to motivate them to collect and explore interesting statistics and measurements derived
from both real and simulated situations.
Once the data are gathered, they must be organized into a useful form, including tables, graphs, charts and
pictorial representations. Since different representations highlight different patterns within the data,
students should develop skill in representing and reading data displayed in different formats, and they
should discern when one particular representation is more desirable than another.
Based on known data, students should be able to draw defensible inferences about unknown outcomes.
They should be able to make predictions and to identify the degree of confidence that they place in their
predictions.
Mathematical representations allow us to visualize and understand problems. These representations may
be numerical, literal, symbolic, graphical, pictorial or physical. Facility with multiple representations of
numerical and algebraic concepts and relationships is essential to mathematical competence. This includes
the development of "symbol sense" as well as "number sense" and the understanding that the notion of
solution involves a process as well as a product. Thus, the solution of a mathematical problem requires
both an understanding of the question for which an answer is sought and the development of a strategy to
obtain that answer. The context of the problem determines the nature and the degree of precision of the
required solution. The increasing use of quantitative methods in all disciplines has made algebra the
fundamental tool for mathematical applications. Algebraic thinking is learned most effectively when it is
studied in the context of applications, both mathematical and real-world, that reveal the power of algebra
to model real problems and to generalize to new situations. Students should use algebraic techniques to
analyze and describe relationships, to model problem situations, and to examine the structure of
mathematical relationships. The algebra curriculum should employ contemporary technology, including
spreadsheets and graphical analysis, to emphasize conceptual understanding of algebra and analytic
thinking as sophisticated means of representation and as powerful problem-solving tools.
Appendix U
[Figure 2.1a, the "dot square" problem: dots arranged along the four sides of a square.]
some way: If there were a total of 76 dots, how many would be on each
side of the square? Could a square be formed with a total of 75 dots?
Students could also work with extensions involving dots on the
perimeter of other regular polygons. By extending problems and asking
different questions, students become problem posers as well as problem
solvers.
[Figure: three students' ways of counting the dots on a square with 10 dots per side, annotated 4 x 8 + 4 = 36, 4 x 10 - 4 = 36, and 10 + 8 + 10 + 8 = 36.]
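A short worked note (added here; not part of the reprinted excerpt) shows why the figure's three counts agree. For a square with $n$ dots on each side, the three counting methods give

$$4(n-2) + 4 \;=\; 4n - 4 \;=\; 2n + 2(n-2),$$

and each expression equals 36 when $n = 10$. For the questions posed above, setting $4n - 4 = 76$ gives $n = 20$, so a square with 76 dots in all would have 20 dots on each side; since $4n - 4 = 75$ has no whole-number solution, 75 dots cannot form such a square.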
Appendix V
Thank you for your request. By all means, please feel free to model your computer curriculum after the lessons on Kids Domain. Please include acknowledgement of our website if you reprint and hand out any of our content.
http://www.kaboose.com
The Kaboose Network - Get On Board!
Funschool.com - Kidsdomain.com - Zeeks.com
Appendix W
I lost a file! I saved a file, but I can't remember what I called it.
I know the title at the top of the page was:
Lost Dog
Station #1
Team #2: First it wasn't plugged in and the USB wasn't plugged in. There was no paper.
Team #3: "What's wrong with your printer is the plug wasn't in and there was no paper."
Station #2
Team #1: "Plug in any plugs, put in CD and pushed yes. You put the CD in and clicked
Team #2: "One problem we had was the mouse was not plugged in. We first went to
installer then we pushed continue and it installed them. We restarted the computer."
Station #3
Team #1: "Monitor won't turn on because it had no power and it was not plugged in to
Team #2: "The monitor is not working. It is not working because the [monitor] is not
Team #3: "[The problem is the] monitor won't turn on because the power cable isn't
Station #4
Team #1: "The sound cable is not plugged in. The ram is missing. The trackball for the
mouse isn't in. Plug in [the] mouse, keyboard and monitor. The power cable is not
plugged in.
Team #2: "The ram is missing (1). The wire is not plugged in. The p5 ( the internal
power cord) wire is not plugged in. The [monitor] is not plugged into the power tower.
The mouse is not plugged in. The trackball is missing. The power button is not there.
Team #3: "Sound cable. More ram. Trackball. Plug in mouse and keyboard and
Station #6
Team #2: "Yes we found it. We went to search and pushed files and folders then we
Team #1: "You right click anywhere then you click properties. Then you go to desktop
Team #2: "Yes. We right clicked then we went to properties. Then we clicked on
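Station #6's first task, finding a saved file when only its title line ("Lost Dog") is remembered, is a content search; the teams used the Windows Search tool's "files and folders" option. Purely as a hypothetical sketch of the same idea (the students used no code, and the folder path below is invented for illustration), a Python version might look like:

    from pathlib import Path

    def find_lost_file(root, title):
        """Find text files under `root` whose contents mention the remembered title."""
        hits = []
        for path in Path(root).rglob("*.txt"):
            try:
                if title in path.read_text(errors="ignore"):
                    hits.append(path)
            except OSError:
                continue  # skip files that cannot be opened
        return hits

    # Search an (assumed) documents folder for the file titled "Lost Dog"
    print(find_lost_file("C:/My Documents", "Lost Dog"))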