
Western Michigan University

ScholarWorks at WMU

Master's Theses Graduate College

8-2003

Effects of Computer Troubleshooting on Elementary Students' Problem Solving Skills
Anne Todd Ottenbreit

Follow this and additional works at: https://scholarworks.wmich.edu/masters_theses

Part of the Elementary Education Commons, and the Instructional Media Design Commons

Recommended Citation
Ottenbreit, Anne Todd, "Effects of Computer Troubleshooting on Elementary Students' Problem Solving
Skills" (2003). Master's Theses. 3372.
https://scholarworks.wmich.edu/masters_theses/3372

This Masters Thesis-Open Access is brought to you for


free and open access by the Graduate College at
ScholarWorks at WMU. It has been accepted for inclusion
in Master's Theses by an authorized administrator of
ScholarWorks at WMU. For more information, please
contact [email protected].
EFFECTS OF COMPUTER TROUBLESHOOTING ON ELEMENTARY
STUDENTS' PROBLEM SOLVING SKILLS

by

Anne Todd Ottenbreit

A Thesis
Submitted to the
Faculty of The Graduate College
in partial fulfillment of the
requirements for the
Degree of Master of Arts
Department of Educational Studies

Western Michigan University


Kalamazoo, Michigan
August 2003
EFFECTS OF COMPUTER TROUBLESHOOTING ON ELEMENTARY
STUDENTS' PROBLEM SOLVING SKILLS

Anne Todd Ottenbreit, M.A.

Western Michigan University, 2003

The lack of problem solving skills exhibited by students has generated concerns

at national and state levels of education (Coleman, King, and Ruth, 2001). If the

educational technology curriculum involved computer troubleshooting, students could

possibly increase their problem solving abilities. Because computer troubleshooting

follows procedures similar to problem solving, there is the possibility of an educational

transfer, and troubleshooting could easily be included in the educational technology curriculum. The

purpose of the research study was to discover if the computer troubleshooting

curriculum designed by the researcher affected the elementary students' problem

solving abilities. Technology education at the elementary level includes keyboarding

and 'practice and drill' software. The proposed curriculum would be a new method to

meet national technology and math standards of education. Positive outcomes of the

research will validate its use in a technology education program.


Copyright by
Anne Todd Ottenbreit
2003
ACKNOWLEDGMENTS

The process of my master's degree was far more difficult than I had

anticipated. I was very fortunate to have three excellent committee members to


collaborate with throughout that process. First, I would like to thank Dr. Poole. The

experience, wisdom, support, understanding and friendship were extremely helpful

and very appreciated all the way through the process. I have benefited educationally

from having Dr. Poole as my advisor and chair of my committee. I am also grateful to

Dr. Bosco, who was especially helpful in challenging my thoughts and ideas in order

to produce a better product. I am also grateful to Dr. Leneway who was especially
helpful and flexible throughout the entire process.

I would also like to thank both of my parents, who offered love and support

throughout my entire Master's process. I would especially like to thank my mother

for her endless proofreading and assistance with my master's research project. My
father encouraged me to pursue my topic of interest and assisted in the design process

with great passion. My grandparents, sister, brother and the rest of my entire family

were very supportive and helpful throughout my master's degree.

I would also like to thank Dr. Newlin-Haus for all of her assistance with my

research design, statistical help and instruction on research methods. The HSIRB
sub-committee was extremely instrumental in refining my research design. Mary-Jane

Mielke was extremely supportive in my educational technology development

throughout the year, and inspired me to become a better educator.

I was very fortunate to conduct my research project at Haigh Elementary


School. I would also like to thank Kids Domain.Com and CompTIA for assistance with

the curriculum. Western Michigan University was extremely helpful and provided three

computers for the study.


I would lastly like to thank my best friend Luke for making my life happier,
simpler and less stressful through the entire process.

Anne Todd Ottenbreit

TABLE OF CONTENTS

ACKNOWLEDGMENTS ...................................................................................... ii

LIST OF TABLES .................................................................................................. xiii

LIST OF FIGURES ................................................................................................. xvii

CHAPTER

I. INTRODUCTION TO THE PROBLEM................................................ 1

Introduction ............................................................................................. 1

General Statement of the Problem .......................................................... 2

Research Questions and Hypothesis ....................................................... 2

Research Hypothesis ...................................................................... 2

Major Research Question ............................................................... 2

Minor Research Questions and Hypotheses ................................ 2

Question and Hypothesis 1 ................................................ 2

Question and Hypothesis 2 ................................................ 3

Question and Hypothesis 3 ................................................ 3

Question and Hypothesis 4 ................................................ 4

Question and Hypothesis 5 ................................................ 4

Question and Hypothesis 6 ................................................ 5

Question and Hypothesis 7 ................................................ 5

Purpose of the Study ............................................................................... 6


Background ..............................................................................................7

Summary ..................................................................................................7

II. REVIEW OF LITERATURE ...........................................................................9

Introduction ..............................................................................................9

Problem Solving Importance ................................................................... 9

Mathematical Problem Solving .............................................................10

Michigan Curriculum Mathematical Standards ............................11

Typical Fifth Grade Problem Solving ...........................................12

Various Math Problem Solving Methods .....................................12

Problem Solving and Hands-On Learning .............................................13

Problem Solving and Authentic Learning .............................................14

Transfer of Problem Solving Skills .......................................................14

Introduction ...................................................................................14

Scientific Inquiry ..........................................................................15

Mathematics..................................................................................15

Reading Recovery .........................................................................15

Computer Programming and Computer-Based Simulations ........15

Technology Education ...........................................................................16



Groups and Learning in Technology ............................................18

Technology and Hands-On Learning .....................................................18

Technology and Authentic Learning .....................................................18

Problem Solving and Computer Troubleshooting .................................19

III. METHODOLOGY .........................................................................................21

Introduction............................................................................................22

Research Setting ....................................................................................22

Introduction...................................................................................22

Haigh Elementary School Demographics and Area .....................22

School Information .......................................................................23

Curriculum ....................................................................................24

Technology Curriculum ....................................................24

Math Curriculum ..............................................................25

Research Design ....................................................................................28

Computer Troubleshooting Curriculum ................................................29

A+ Curriculum ..............................................................................30

Kids Domain Computer Lessons ..................................................32

Instructional Method .....................................................................32

Subjects..................................................................................................33

Criteria for Students in the Control Group ...................................34

Criteria for Students in the Experimental Group ..........................35


Variables................................................................................................ 35

Dependent Variables..................................................................... 35

Treatment .........................................................................35

Subject Selection Factors..................................................35

Attendance ............................................................ 35

Problem Solving Skills ......................................... 36

Gender................................................................... 36

Independent Variables .................................................................. 38

Testing Procedures............................................................ 38

ETS Testing Services........................................................40

IOWA Math Tests.............................................................40

POPS Problem Solving Test.............................................42

Hands-On Problem Solving Tests ....................................45

Survey ...............................................................................47

Group Interview ................................................................48

Troubleshooting Activity..................................................49

Data Analysis.........................................................................................50

Hypothesis 1 .................................................................................50

Hypothesis 2 .................................................................................51


Hypothesis 3 .................................................................................51

Hypothesis 4 .................................................................................52

Hypothesis 5 .................................................................................52

Hypothesis 6 .................................................................................52

Hypothesis 7 .................................................................................53

Summary ................................................................................................53

IV. FINDINGS ........................................................................................................55

Introduction............................................................................................55

Explanation of Student's Profile ............................................................55

Profile of Each Student's Assessment Data...........................................58

Female ...........................................................................................58

Student 1 Control ..............................................................58

Student 2 Experimental.....................................................60

Student 3 Control ..............................................................62

Student 4 Experimental.....................................................64

Student 5 Control ..............................................................66

Student 6 Experimental.....................................................68

Male .............................................................................................. 70

Student 7 Control .............................................................. 70

Student 8 Experimental.....................................................72

Student 9 Control.............................................................. 74


Student 10 Experimental...................................................76

Student 11 Control............................................................78

Student 12 Experimental...................................................80

Description of Findings Pertinent to Hypothesis...................................81

Hypothesis 1 ...................................................................................81

Troubleshooting Activity..................................................82

Station #1 ....................................................................83

Station #2 ....................................................................84

Station #3 ....................................................................85

Station #4 ....................................................................85

Station #5 ....................................................................87

Station #6 ....................................................................87

Station #7 ....................................................................88

Group Interview ..........................................................89

Hypothesis 2 ....................................................................................91

POPS- Profiles of Problem Solving Test..........................91

Hands-on Problem Solving Test.......................................93

Hypothesis 3 ....................................................................................98

POPS- Profiles of Problem Solving Test..........................98

Survey/Group Interview....................................................99

Hypothesis 4 ..................................................................................101


IOWA Test...................................................................... 101

Group Interview .............................................................. 102

Hypothesis 5 ..................................................................................102

POPS- Profiles of Problem Solving Test........................103

Hands-on Problem Solving Test .....................................109

Hypothesis 6 .................................................................................. 112

POPS- Profiles of Problem Solving Test........................113

Hands-on Problem Solving Test.....................................117

Hypothesis 7 ..................................................................................120

POPS- Profiles of Problem Solving Test........................121

Hands-on Problem Solving Test.....................................125

Group Interview ..............................................................129

V. CONCLUSION AND RECOMMENDATIONS..........................................134

Introduction ...........................................................................................134

Summary of the Study .........................................................................135

Summary of the Research Problem.............................................135

Summary of the Methods............................................................135

Summary of the Findings............................................................136

Hypothesis 1 ...................................................................136

Hypothesis 2 ................................................................... 137

Hypothesis 3 ...................................................................139


Hypothesis 4 ...................................................................139

Hypothesis 5 ...................................................................140

Hypothesis 6 ...................................................................141

Hypothesis 7 ...................................................................143

Conclusions ...........................................................................................144

Conclusion of Hypothesis 1 ........................................................144

Conclusion of Hypothesis 2 ........................................................146

Conclusion of Hypothesis 3 ........................................................147

Conclusion of Hypothesis 4 ........................................................148

Conclusion of Hypothesis 5 ........................................................149

Conclusion of Hypothesis 6 ........................................................150

Conclusion of Hypothesis 7 ........................................................151

Recommendations for Further Research .............................................153

Final Conclusions ................................................................................157

BIBLIOGRAPHY ................................................................................................159

APPENDICES

A. Human Subjects Institutional Review Board .......................................163

B. Student Assent Form............................................................................165

C. Parental Permission Slip (Letter A) .....................................................167

D. Parental Consent Form (Letter B) ........................................................170

E. Additional Information Requested By Parents ....................................172


F. Cover Sheets for Tests......................................................................... 175

G. Survey ................................................................................................. 177

H. Interview.............................................................................................. 180

I. Code Sheet........................................................................................... 182

J. Observation Rubric of Hands-On Problem Solving Test .................... 184

K. Group Interview and Survey Response................................................187

L. Survey and Interview Rubric ............................................................... 192

M. IOWA Test Review ............................................................................. 194

N. POPS Test Review............................................................................... 197

O. Hands-On Problem Solving Test......................................................... 199

P. Curriculum for Computer Troubleshooting Training Sessions ........... 202

Q. Procedure to Obtain Consent............................................................... 204

R. Master's Thesis Timeline .................................................................... 206

S. Paired Samples T-Test Results............................................................ 209

T. Michigan Curriculum Frameworks ...................................................... 214

U. Typical 5th Grade Problem Solving by NCTM .................................... 218

V. Email Documentation from Kids Domain.com ................................... 221

W. Spreadsheet of Answers for Worksheets and Worksheet Samples......223

LIST OF TABLES

3.1. Pre-Testing and Post-Testing Organization ............................................53

4.1a. Example Table of the Columns Explained in Table A ...........................56

4.1b. Example Table of the Columns Explained in Table B ...........................56

4.1c. Example Table of the Columns Explained in Table C ...........................57

4.1d. Example Table of the Columns Explained in Table D ...........................58

4.2a. Student Identification and IOWA Results ..............................................59

4.2b. Easy Hands-On Problem Solving Results ...............................................59

4.2c. Difficult Hands-On Problem Solving Results.........................................59

4.2d. POPS- Profiles of Problem Solving ........................................................60

4.3a. Student Identification and IOWA Results ..............................................61

4.3b. Easy Hands-On Problem Solving Results...............................................61

4.3c. Difficult Hands-On Problem Solving Results.........................................61

4.3d. POPS- Profiles of Problem Solving ........................................................62

4.4a. Student Identification and IOWA Results ..............................................63

4.4b. Easy Hands-On Problem Solving Results...............................................63

4.4c. Difficult Hands-On Problem Solving Results.........................................63

4.4d. POPS- Profiles of Problem Solving ........................................................64

4.5a. Student Identification and IOWA Results ..............................................65

4.5b. Easy Hands-On Problem Solving Results ...............................................65

4.5c. Difficult Hands-On Problem Solving Results.........................................65

4.5d. POPS- Profiles of Problem Solving ........................................................66


4.6a. Student Identification and IOWA Results .............................................67

4.6b. Easy Hands-On Problem Solving Results ...............................................67

4.6c. Difficult Hands-On Problem Solving Results.........................................67

4.6d. POPS- Profiles of Problem Solving ........................................................68

4.7a. Student Identification and IOWA Results ..............................................69

4.7b. Easy Hands-On Problem Solving Results ...............................................69

4.7c. Difficult Hands-On Problem Solving Results.........................................69

4.7d. POPS- Profiles of Problem Solving ........................................................70

4.8a. Student Identification and IOWA Results ..............................................71

4.8b. Easy Hands-On Problem Solving Results...............................................71

4.8c. Difficult Hands-On Problem Solving Results.........................................71

4.8d. POPS- Profiles of Problem Solving ........................................................72

4.9a. Student Identification and IOWA Results ..............................................73

4.9b. Easy Hands-On Problem Solving Results ...............................................73

4.9c. Difficult Hands-On Problem Solving Results.........................................73

4.9d. POPS- Profiles of Problem Solving ........................................................74

4.10a. Student Identification and IOWA Results ..............................................75

4.10b. Easy Hands-On Problem Solving Results ...............................................75

4.10c. Difficult Hands-On Problem Solving Results.........................................75

4.10d. POPS- Profiles of Problem Solving ........................................................76

4.11a. Student Identification and IOWA Results ..............................................77

4.11b. Easy Hands-On Problem Solving Results ...............................................77

4.11c. Difficult Hands-On Problem Solving Results.........................................77



4.11d. POPS- Profiles of Problem Solving ........................................................78

4.12a. Student Identification and IOWA Results ..............................................79

4.12b. Easy Hands-On Problem Solving Results ...............................................79

4.12c. Difficult Hands-On Problem Solving Results.........................................79

4.12d. POPS- Profiles of Problem Solving ........................................................79

4.13a. Student Identification and IOWA Results ..............................................80

4.13b. Easy Hands-On Problem Solving Results...............................................81

4.13c. Difficult Hands-On Problem Solving Results.........................................81

4.13d. POPS- Profiles of Problem Solving ........................................................81

4.14. Computer Troubleshooting Activity Results ..........................................82

4.15. Excerpt From Group Interview - Transcribed Conversation..................90

4.16. Control Group Methods Used Section ....................................................91

4.17. Experimental Group Methods Used Section...........................................92

4.18. Time Results from the Easy Problem


in the Hands-on Pre- and Post-Test ........................................................95

4.19. Control Students' Pre-Test and


Post-Test Ability to Solve the
Difficult Problem in the Hands-On
Problem Solving Test..............................................................................96

4.20. Experimental Students' Pre-Test


and Post-Test Ability to Solve the
Difficult Problem in the Hands-On
Problem Solving Test..............................................................................96

4.21. Each Student's Pre-Test Score on the


POPS test Graded on Beginning, Developing
or Advanced Levels of Achievement......................................................99


4.22. Student Responses to the Most


Difficult Problem Solving Process........................................................100

4.23. Mean of POPS Pre-Test vs. Post-Test .


Total Score Comparing Females vs. Males ..........................................103

4.24. Comparison of the Methods Used and Extracting


Information Categories of the POPS Test Between
Mean Gender Score...............................................................................104

4.25. Gender Comparisons Divided by Group of


Total POPS Score on the Pre- and Post-Tests ......................................105

4.26. Time Completion Compared by Problem Solving


Ability for the Easy Problem in the Hands-On
Problem Solving Pre-Test versus Post-Test..........................................118

4.27. Percentage of Students within their Teacher-Rated


Problem Solving Ability Groups who solved the
difficult problem in the hands-on pre-test
versus the post-test ................................................................................119

4.28. POPS Correctness of Answer, Accuracy and


Quality of Explanation Categories Pre-Test
and Post-Test Scores Compared Between Ability Groups ...................124

4.29. Results from Easy Problem in Hands-on Problem


Solving Test Comparing between Groups ............................................ 126

4.30. Average Number of Attempts in Control Group


versus Experimental Group and Percentage of Each
Group Able to Correctly Solve the Difficult Problem ..........................128

4.31. Excerpt from Group Interview - Transcribed Conversation.................131

LIST OF FIGURES

2.1. The Problem Solving Processes Used as Standard


for the Purposes of this Research Study ...................................................12

3.1. The 48128 Zip Code Dearborn Ethnic Representation ............................23

3.2. Students Ethnic Background at Haigh Elementary School ......................24

3.3. The Dearborn School District's Current Technology Curriculum ..........26

3.4. Timeline of the Training Sessions ...........................................................29

3.5. Primary Objectives from the CompTIA A+ Curriculum.........................31

3.6. POPS Test Results Compared Between


POPS Administrative Test and the Computer
Troubleshooting Study Students...............................................................37

3.7. ETS Services Disclaimer .........................................................................41

3.8. IOWA Test of Basic Skills Review .........................................................42

3.9. POPS - Profiles of Problem Solving Review .........................................46

4.1. Control Group Methods Used Section Graphed.......................................92

4.2. Experimental Group Methods Used Section Graphed .............................93

4.3. Comparing Differences of Groups with


Number of Attempts in the Hands-On Pre-Test
versus Post-Test........................................................................................94

4.4. Time Results from the Easy Problem in the


Hands-on Pre- and Post-Test ....................................................................95

4.5. Comparing the Number of Students in Each Group


not Able to Solve Correctly or Solve Incorrectly
the Difficult Problem in the Hands-On Problem Solving Test.................97

4.6. Comparing the Number of Students in Each Group


Able to Correctly Solve the Difficult Problem in
the Hands-On Problem Solving Test........................................................97


4.7. IOWA Scores Compared Between Mean Group


Scores on the Pre-Test versus the Post-Test ........................................... 101

4.8. Mean of POPS Pre-Test vs. Post-Test Total


Score Comparing Females vs. Males ..................................................... 104

4.9. Comparison of the Methods Used and Extracting


Information Categories of the POPS
Test Between Mean Gender Score .........................................................105

4.10. Percentage Improvement Between the


Pre-Assessment and Post-Assessment
of the Total POPS Score Separated
by Gender and Group .............................................................................106

4.11. Difference Between POPS Pre-Assessment


and Post-Assessment Score on the
Correctness of Answer Category
Separated by Gender...............................................................................107

4.12. Difference Between POPS Pre-Assessment


and Post-Assessment Score on the Accuracy
Category Separated by Gender ...............................................................107

4.13. Difference Between POPS Pre-Assessment


and Post-Assessment Score on the Quality of
Explanation Category Separated by Gender...........................................108

4.14. Experimental Males versus Control Males Time


to Complete the Easy Problem in the
Hands-On Pre-Test and Post-Test ..........................................................109

4.15. Experimental Females versus Control Females


Time to Complete the Easy Problem in the
Hands-On Pre-Test and Post-Test ..........................................................110

4.16. Difference in Number of Attempts Between the


Easy Problem in the Hands-On Pre-Test and
Post-Tests Separated by Groups and Gender ......................................... 111


4.17. Difference in Number of Attempts Between the


Difficult Problem in the Hands-On Pre-Test and
Post-Tests Separated by Groups and Gender .........................................111

4.18. POPS Total Score Pre-Test and Post-Test Divided


by Teacher Rated Problem Solving Ability and Groups ........................114

4.19. POPS Total Score Pre-Test and Post-Test High


Teacher Rated Problem Solving Ability
Separated by Groups...............................................................................114

4.20. POPS Total Score Pre-Test and Post-Test


Medium Teacher Rated Problem Solving
Ability Separated by Groups ..................................................................115

4.21. POPS Total Score Pre-Test and Post-Test


Low Teacher Rated Problem Solving Ability
Separated by Groups...............................................................................116

4.22. Time Completion Compared by Problem Solving


Ability for the Easy Problem in the Hands-On
Problem Solving Pre-Test versus Post-Test ...........................................117

4.23. Number of Attempts on the Easy Problem in the


Hands-on Pre-Test versus Post...............................................................119

4.24. Difference in Total POPS Score Between the Pre-Test


and Post-Test of Average Experimental Group versus
Average Control Group ..........................................................................121

4.25. Comparing the Difference in POPS Extracting


Information Section Pre-Test and Post-Test Between Groups ...............123

4.26. Comparing Completion of the Easy Hands-on


Problem Time Improvements Between
Problem Solving Ability Levels .............................................................126

4.27. Comparing the Percentage of Students in the


Control Group versus the Experimental Group
Able to Solve the Difficult Problem in the
Pre-Test and Post-Test............................................................................127

CHAPTER I

INTRODUCTION TO THE PROBLEM

Introduction

The lack of higher level thinking skills used by students has become an area of

concern at national and state levels of education (Coleman, King, and Ruth, 2001).

Computer troubleshooting contains similar characteristics to problem solving. A

technology curriculum encompassing computer troubleshooting has the possibility to

enhance the educational technology curriculum, while increasing problem solving

ability in elementary students (MacPherson, 1998). The purpose of the research study

was to establish whether the computer troubleshooting curriculum designed by the

researcher had the ability to affect elementary students' problem solving ability.

Technology education at the elementary level has primarily consisted of keyboarding

and 'practice and drill' software. Computer troubleshooting and technology have

lacked integration at the elementary level (Poris, 2000). Evidence acquired from this

research will lend credence to the possible incorporation of this type of training

program in order to enhance the learning experience and technology education

program.

General Statement of the Problem

Students are currently lacking adequate learning opportunities in problem

solving (Coleman, et al, 2001; Jonassen, 2000). Providing students content

knowledge is important in the present, but providing students with problem-solving

skills is essential for the future. The importance of problem solving is illustrated in

the quote, "Give a man a fish; you have fed him for today. Teach a man to fish; and

you have fed him for a lifetime"-Author unknown. By teaching students how to

solve problems as opposed to supplying them with content knowledge, students will

be able to successfully solve life problems, and therefore be successful in life (Casey

& Tucker, 1994).

Research Questions and Hypothesis

Research Hypothesis

The use of a computer troubleshooting curriculum as a model for problem

solving will improve elementary students' problem solving abilities.

Major Research Question

Will using a computer troubleshooting curriculum improve elementary students'

problem solving abilities?

Minor Research Questions and Hypotheses

Question and Hypothesis 1

Can elementary students who participate in a computer troubleshooting

curriculum develop the ability to solve common computer problems?


Elementary students who participate in a computer troubleshooting curriculum

will develop the ability to solve common computer problems by participating in

computer troubleshooting trainings.

Students have the ability to retain large amounts of knowledge at the

elementary level. Information retrieved from the computer troubleshooting activity

and team worksheets were evaluated to establish whether students were able to solve

common computer problems. The group interview also provided additional

information from transcribed conversation excerpts.

Question and Hypothesis 2

Can elementary students who participate in the computer troubleshooting

curriculum improve their problem solving methods?

Elementary students who participate in the computer troubleshooting

curriculum will improve problem solving methods.

This hypothesis was evaluated through the comparison of the POPS Methods

Used section pre-test and post-test results. The hands-on problem solving test was

also analyzed to evaluate whether problem solving methods improved.

Question and Hypothesis 3

What aspect ofproblem solving is the most difficult for elementary students?

The most difficult procedure in problem solving for elementary students will

be to understand what the question is looking for.


The most difficult procedure in problem solving was initially analyzed

through a review of literature and articles. Data was collected through the Profiles of

Problem Solving test and the surveys/interviews in order to analyze the most difficult

procedure. The standardized problem solving test, POPS, divided the evaluation into

five separate categories; Correctness of Answer, Methods Used, Accuracy, Extracting

Information and Quality of Explanation. Pre-test scores on each of these categories

were compared to find the weakest area. Data was also collected from the

Survey/Group Interview, providing information on students' opinions of the most

difficult procedures in the problem solving process.

Question and Hypothesis 4

Can elementary students who participate in a computer troubleshooting

curriculum increase mathematical problem solving ability?

Elementary students who participate in a computer troubleshooting curriculum

will increase mathematical problem solving ability.

The IOWA Basic Skills Test and the Group Interview were used to assess

mathematical problem solving ability. Comparisons between the pre-test and post-

test IOWA scores of each group were analyzed. Responses from the group interview

pertaining to math were also collected for evaluation.

Question and Hypothesis 5

Does gender impact the problem solving ability of elementary students

involved in a computer troubleshooting curriculum?


Gender will not impact the problem solving ability of elementary students

involved in a computer troubleshooting curriculum.

Gender differences were compared through the POPS total scores, individual

categories within the POPS test and the hands-on problem solving test. Comparisons

between genders and between the experimental and control groups of genders were

analyzed using pre-test scores and post-test scores.

Question and Hypothesis 6

Does teacher-rated problem solving ability impact the problem solving ability

of elementary students involved in a computer troubleshooting curriculum?

Students who participate in a computer troubleshooting curriculum rated by

teachers as having high problem solving ability will demonstrate greater

improvements in problem solving ability.

Students in the experimental group and control group were matched based on

teacher-rated problem solving ability. When students were divided into teacher-rated

problem solving groups, the results presented a different angle on improvements.

Problem solving ability groups were compared through the POPS total scores,

individual categories within the POPS test and the hands-on problem solving test.

Question and Hypothesis 7

Can elementary students who participate m a computer troubleshooting

curriculum demonstrate greater improvements in problem solving ability than

students who did not participate in the program?


Elementary students who participate in a computer troubleshooting curriculum

will demonstrate greater improvements in problem solving ability than students who

did not participate in the program.

The results from the POPS test, the hands-on problem solving test and the

group interview were evaluated in order to establish whether students in the

experimental group achieved higher results in the post assessment than the students in

the control group. Data was also collected and analyzed from each section of the

POPS test.
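
Appendix S reports paired samples t-test results for these pre-test versus post-test comparisons. As a rough sketch of how such a paired comparison can be computed (the scores below are made-up placeholders, not the study's data, and SciPy is assumed to be available):

    from scipy import stats

    # Placeholder pre- and post-test POPS total scores for six students
    # (illustrative values only, not the study's actual data).
    pre = [12, 15, 9, 14, 11, 13]
    post = [14, 16, 11, 15, 14, 13]

    # A paired samples t-test pairs each student's post-test score with his
    # or her own pre-test score, so each student serves as his or her own control.
    t_stat, p_value = stats.ttest_rel(post, pre)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")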

Purpose of the Study

The main purpose of this project was to examine the problem-solving

requirements necessary in computer repair and troubleshooting, and its effect on

academic achievement and problem solving. Learning activities such as computer

LOGO programming have the ability to develop problem solving skills and improve

academic achievement (Kurland, 1986). This project used computer repair and

troubleshooting as a model for teaching problem solving strategies. The research

study proposed to increase problem solving abilities and academic achievement

through computer troubleshooting and repair training. With this research, elementary

schools may incorporate the training program into the educational technology

program in order to enhance the learning. This program has the potential to inform

the educational community of whether computer troubleshooting had an effect on

problem solving skills. The training session, if proved effective, could provide a new

addition to technology education programs at the elementary level.


Background

At this time of this study, no current research on the proposed topic was

found. Computer troubleshooting and repair had been attempted at the high school

level with great success, but no relationship to problem solving skills had been tested

or proposed. With the present technical focus of today, students need to have the

ability to solve everyday problems with technology. The researcher believes that

technology education could include computer problem solving as a means of meeting

two separate parts of the curriculum; the students would be improving their problem

solving skills, while preparing for the future and learning an authentic, meaningful

lesson. Therefore, the research would establish baseline data for instructional

practices, and further research in this area of education.

Summary

The lack of problem solving skills exhibited by students has generated concerns

at national and state levels of education (Coleman, King, and Ruth, 2001). If the

educational technology curriculum involved computer troubleshooting, students could

possibly increase their problem solving abilities. Because computer troubleshooting

follows procedures similar to problem solving, there is the possibility of an educational

transfer, and troubleshooting could easily be included in the educational technology curriculum. The

purpose of the research study was to discover if the computer troubleshooting

curriculum designed by the researcher affected the elementary students' problem

solving abilities. Technology education at the elementary level includes keyboarding

and 'practice and drill' software. The proposed curriculum would be a new method to

meet national technology and math standards of education. Positive outcomes of the

research will validate its use in a technology education program.


CHAPTER II

REVIEW OF LITERATURE

Introduction

"Your problem may be modest; but if it challenges your curiosity and

brings into play your inventive faculties, and if you solve it by your

own means, you may experience the tension and enjoy the triumph of

discovery. Such experiences at a susceptible age may create a taste for

mental work and leave their imprint on mind and character for a

lifetime" (Wilson, Fernadez & Hadaway, 1993, Ch. 4).

Problem Solving Importance

The purpose of education is to prepare students for success in life. Problem

solving skills allow students the opportunity to solve problems that arise in everyday

situations. Therefore, problem solving skills are important to include in the

curriculum because of their application towards success and life (Jonassen, 2000; Lee

1996). Students need to develop problem solving skills in order to be a successful

individual in society and educational institutions are responsible for this preparation

(Lee, 1996).

every individual. Coleman, King, and Ruth state "by not challenging students, nor

encouraging them to use higher order thinking skills, educators underestimate their

students' abilities and delay meaningful grade-level work, as well deprive them of a

significant environment for learning" (2001, p.9-10). Currently, students are not

receiving adequate learning opportunities to problem solve (Coleman, et al, 2001).

Mathematical Problem Solving

There are many possible methods of teaching problem solving, such as

learning to read and scientific discovery. Most schools, however, make little attempt

to provide students with the assistance they need to learn a broad range of problem

solving strategies. Instead, most schools tend to use formal training in problem

solving restricted to the area of mathematics (Poris, 2000).

Mathematics is synonymous with problem solving in education. An effortless

method most teachers use to integrate problem solving into the curriculum has been

through the creation of story problems. The National Council of Teachers of

Mathematics (1989) presents problem solving as one of the most vital skills for

students. The New Jersey Board of Education believes "problem posing and problem

solving involve examining situations that arise in mathematics and other disciplines and

in common experiences, [and] by developing their problem solving skills, students

will come to realize the potential usefulness of mathematics in their lives" (as

reported in Poris, 2000, p.1). Problem solving in mathematics should not be limited

to traditional word problems, but should be taught through methods of inquiry and

application in order to expose the students to the multiple facets of problem solving.
Mathematical problem solving follows the same requirements for basic problem

solving; defining the problem, gathering the relevant information, establishing a

strategy or plan, carrying out the plan, and reflecting on the process. As reported in

multiple studies, when students are able to perfect this process in mathematics, they

will be able to apply the same process to other problems with success (Poris, 2000).

Michigan Curriculum Mathematical Standards

The Michigan Curriculum Frameworks are curriculum standards developed by

parents, educators, business leaders and university professors. The frameworks listed

descriptions of learning objectives students should achieve in subject areas. The

mathematics section states the following:

"A mathematically powerful individual should be able to: reason

mathematically; communicate mathematically; problem solve using

mathematics; and, make connections within mathematics and between

mathematics and other fields" (Michigan Department of Education,

2003).

Problem solving is a key benchmark the Michigan Department of Education

expects students to meet. A list of the Michigan Curriculum Frameworks pertaining

to problem solving can be found in Appendix T.


Typical Fifth Grade Problem Solving Ability

The National Council of Teachers of Mathematics indicated that students in

grades 3 through 5 should have frequent experiences with

problems that interest, challenge, and engage them in thinking about important

mathematics. The NCTM declares problem solving as a process that should develop

from mathematics and provide a framework in which concepts and skills are learned

(NCTM, 2000). The NCTM website presents various problem solving activities that

are educationally appropriate for fifth grade students and document the processes the

students utilize to solve them (See Appendix U).

Various Problem Solving Methods Used

There are multiple problem solving strategies used by students when

attempting to solve a problem. The problem solving methods that will be utilized in

this research project consists of four basic steps: (a) understanding the problem; (b)

planning a solution; (c) solving the problem and (d) looking back.

Figure 2.1

The Problem Solving Processes Used as Standard for the Purposes of this Research
Study

UNDERSTAND THE PROBLEM: Have you seen a similar problem before?
If so, how is this problem similar? How is it different?
What facts do you have?
What do you know that is not stated in the problem?

PLAN A SOLUTION: How have you solved similar problems in the past?
What strategies do you know?
Try a strategy that seems as if it will work.
If it doesn't, it may lead you to one that will.

SOLVE THE PROBLEM: Use the strategy you selected and work the problem.

LOOK BACK: Reread the question.
Did you answer the question asked?
Is your answer in the correct units?
Does your answer seem reasonable?

Strategies may vary in name; however, most fall into one of the following

basic categories: (a) compute or simplify, (b) use a formula, (c) make a model or

diagram, (d) make a table, chart or list, (e) guess, check and revise, (f) consider a

simpler case, (g) eliminate and (h) look for patterns (Math Counts, 1999).
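
To make the four-step process concrete, the following sketch (a hypothetical illustration written for this summary, not part of the study's curriculum) applies the understand, plan, solve and look-back steps, using the guess, check and revise strategy, to a simple problem of the kind a fifth grader might meet:

    # A hypothetical sketch of the four-step process with a
    # "guess, check and revise" strategy applied to the problem:
    # "A farmer counts 20 heads and 56 legs among her chickens and
    #  cows. How many cows does she have?"

    def solve_by_guess_check_revise(heads=20, legs=56):
        # UNDERSTAND: the facts are total heads and total legs; the
        # unstated knowledge is that chickens have 2 legs and cows have 4.
        for cows in range(heads + 1):            # PLAN: try every possible guess
            chickens = heads - cows
            if 4 * cows + 2 * chickens == legs:  # SOLVE: check the guess
                return cows                      # the guess that works
        return None                              # no guess worked; revise the plan

    cows = solve_by_guess_check_revise()
    # LOOK BACK: reread the question; does the answer seem reasonable?
    assert cows is not None and 0 <= cows <= 20
    print(f"The farmer has {cows} cows.")        # prints: The farmer has 8 cows.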

Problem Solving and Hands-On Learning

Problem solving requires manipulation of the problem by approaching it from

various perspectives. Students should be encouraged to use multiple forms of

representation, such as symbolic and linguistic representations in order to solve

problems. Whether the problem is achieved with mental abilities or external physical

depiction, problem solving involves some form of manipulation (Jonassen, 2000;

Bodner and Domin, 2000). The ancient Chinese philosopher Lao Tzu illustrates the

influence of hands-on learning on the learning process through the statement, "What I

hear I forget, what I see I recall, what I do I know" (Waetjen, 1996).


Problem Solving and Authentic Learning

Ill-structured problems are the everyday problems individuals encounter on a

regular basis and are the most difficult to prepare students to solve. The solutions are

neither easy nor predictable, requiring the individual to use multiple processes in

order to resolve them. The best way to prepare students for ill-structured problems is

to equip them with problem solving skills through the practice of well-structured

problems. Students need to utilize, implement and apply problem solving strategies

and content knowledge in order to solve familiar everyday problems (Howard, McGee

& Shin, 2001). Real world situations allow students to develop a profound

understanding of subject area learning. Problem-based learning environments have

shown higher results in achievement than the traditional curriculum environments

(Howard, McGee & Shin, 2001). Through the provision of real-world learning

experiences, student-led investigations produce "favorable dispositions, a sense of

valuing and often a desire to learn more" (Howard, McGee & Shin, 2001, p. 52).

Transfer of Problem Solving Skills in Subjects to General Problem Solving

Introduction

Problem solving has been incorporated in multiple subject areas to increase

abilities in problem solving, and in specific subject areas. Problem solving has been

integrated into mathematics curriculums, science curriculums, as well as other

subject areas.
Scientific Inquiry

Scientific inquiry emulates the problem solving process. In scientific inquiry

studies, students were required to evaluate the problem, brainstorm possible solutions

and solve the problem. Through questioning, immediate feedback and investigation,

students are able to form their own knowledge on the topic by solving the problem

(Taconis, Fergusson-Hessler, & Broekamp, 2000).

Mathematics

Through a meta-analysis, Hembree (1992) found problem solving

performance correlated with verbal achievement and mathematic achievement on

multiple levels. When mathematical problem solving was tested for transfer effects

on other subject areas, computer programming and strategy games were found to be

significant (Hembree, 1992).

Reading Recovery

Elementary curriculums have introduced such programs as Reading Recovery

to provide students with the ability to solve their own reading difficulties through a

heuristic similar to problem solving. Studies using Reading Recovery have shown

success in reading with the problem solving heuristic (Wayne & Johnstone, 1997).

Computer Programming and Computer-Based Simulations

Computer programming activities, such as LOGO, provide physical objects in

an abstract form for students to manipulate in order to solve a problem (Hembree,

1992). Studies have also shown that computer problem-based multimedia software
increases understanding and the use of problem-based concepts (Sherwood, 2002).

Other computer related activities, such as web-based activities and computer

simulation activities, demonstrate an improvement in creative problem solving

achievements (Michael, 2001).
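
As a rough illustration of the kind of manipulation LOGO affords (a sketch using Python's standard turtle module rather than the LOGO software discussed in these studies), a student asked to draw a square must decompose the problem into a repeated forward-and-turn step:

    import turtle

    def draw_square(side=100):
        # Decompose "draw a square" into the same step repeated four times.
        t = turtle.Turtle()
        for _ in range(4):
            t.forward(side)   # move the abstract object forward
            t.right(90)       # turn through the square's 90-degree corner

    draw_square()
    turtle.done()             # keep the window open to inspect the result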

Technology Education

Technology education is rapidly becoming one of the most essential elements

in today's schools, mainly due to its ability to integrate into the educational

curriculum.

Technology Standards for Students

Standards are an important part of curriculum development. The computer

troubleshooting curriculum used in the research project was designed to meet the

ISTE Profiles for Technology Literate Students and the Michigan Curriculum

Framework Standards for Technology. The Michigan Curriculum Framework

Standards for Technology requires students to be able to apply appropriate

technologies to critical thinking, creative expression, and decision-making skills. The

frameworks also require students to employ a systematic approach to technological

solutions by using resources and processes to create, maintain, and improve products,

systems, and environments.

The ISTE, International Society for Technology in Education, organization

devised standards of technology education for elementary through high school

students. The organization divided the standards by grade levels. The performance

objectives for students include: (1) basic operations and concepts, (2) social, ethical,

and human issues, (3) technology productivity tools, (4) technology communications

tools, (5) technology research tools and (6) technology problem-solving and

decision-making tools. Many of the performance objectives at the third through fifth

grade level demanded problem solving ability, indicating the importance of problem

solving within the technology education curriculum.

Technology Influence on Learning

Due to the recent outburst in educational technology and lack of research, a

conclusion on the influences technology has on education is yet to be established.

However, the majority of the studies reviewed that argued against positive effects of

technology (Goldman, Cole & Syer, 1999; Chaika, 1999; Wenglinsky, 1998) lacked a

curriculum or specific purpose. Computers and technology have a variety of

educational applications and uses within the curriculum. Previously, computers and

technology were seen as personal management tools, drill-and-practice machines or

graphical image producers. However, computers are now being seen as powerful

tools that can enhance and assist the learning process (Poris, 2000).

Gender and Technology

The research concerning gender differences and technology varied in favoring females and males. Computer achievement, attitudes, and anxiety were among the areas

of gender differences. However, no conclusion has been widely accepted as to which

gender benefits more from technology (Burge, 2001; Hackbarth, 2002; Fey, 2001;

King, Bond & Blandford, 2002; Tsai, 2002).


Groups and Learning in Technology

"Evidence indicates that when used effectively, technology can encourage

collaborative learning, development of critical thinking skills, and problem solving"

(Coleman, King, & Ruth, 2001, p. 10). Technology also has the ability to facilitate questioning, feedback, reflection and revision for students when they learn

collaboratively. Allowing students to learn collaboratively with technology creates a

scaffolding environment where the students are teaching one another (Driscroll,

2002).

Technology and Hands-On Learning

Computers and technology engage students due to hands-on interaction. Most

students are excited to use the technology, and therefore, look forward to learning

with technical equipment (Waetjen, 1993). Students are able to demonstrate more

effort and process material at a more meaningful level when they are interested and

believe that they have the ability to solve the problem. Computer repair and

troubleshooting are perfect examples of hands-on learning experiences, allowing

students to interact with the information and receive immediate feedback (Jonassen,

2000).

Technology and Authentic Learning

Research has documented that the real-world model of student investigations leads

to productive environments and a motivation to engage in more learning experiences

(Wonacott, 2001; MacPherson, 1998). "Studies have concluded technological

problem-solving is a key tenet of higher order thinking and that technological


problem-solving is, by definition, rooted in real-life or authentic domains"

(MacPherson, 1998, p.5). Technology education has a real advantage in terms of

authentic learning experience because of its real-world nature. Troubleshooting has

become synonymous with problem solving due to the constant problems associated

with technology. Problem solving is a process that encompasses qualities common to

both technological troubleshooting and general problem solving (MacPherson, 1998).

Problem Solving and Computer Troubleshooting

Technology troubleshooting follows the same cognitive pattern as academic

problem solving. Both troubleshooting and problem solving require documenting the

problem, brainstorming solutions, and then implementing those solutions according to

the development of the individual's own strategies. In most technical problems, the

initial problem and desired result are easily established, but the solutions to achieve

the end result are often difficult and numerous (Lee, 1996). Students frequently

overlook important steps in problem solving. Technical problems often require these important steps, such as considering multiple solutions and completely analyzing the

problem (Lavonen, Meisalo & Lattu, 2001). Technology troubleshooting requires the

ability to diagnose problems and to test out the possible solutions. Teaching problem

solving skills through technology education will enable the student to achieve

technical literacy, become skilled troubleshooters and enhance problem-solving skills

through the authentic, hands-on learning opportunities provided by technology

troubleshooting. Jonassen believes " ... troubleshooting is among the most common
forms of everyday problem solving" and through a computer troubleshooting

curriculum, students could learn these two skills at once (2000, p.73).
CHAPTER III

METHODOLOGY

Introduction

The research project was designed with a control/experimental pre/post test

design. The whole study lasted two school weeks, including all testing periods. The

purpose of the project was to establish whether computer troubleshooting had an

effect on problem solving skills. Each group included three boys and three girls,

matched according to their problem solving ability levels as assessed by their

teachers. Both groups received pre-assessment including two standardized tests, a

hands-on problem solving test, and a survey collecting information from the students

on problem solving skills, attitudes and math abilities. Following the two days of pre-testing, the experimental group attended computer troubleshooting training sessions,

which were held for forty-five minutes in the morning before school over the course

of five days. The control group received no training. Following the training, the

students from the control and experimental groups were post-tested. The post-assessment included two standardized tests, the hands-on problem solving test and a

group interview modeled from the survey. The pre-tests and post-tests were

compared through statistical analysis, graphs and tables, as well as by focusing on each

individual student's growth in a case study approach.

Research Setting

Introduction

Howard Poole, the principal investigator, is a faculty member in the

Educational Studies Department at Western Michigan University. While the research was analyzed at Western Michigan University, the actual study took place at Haigh Elementary School, 601 North Silvery Lane, Dearborn, MI 48128.

Collaborating investigators included Anne Ottenbreit (Western Michigan University, Graduate Assistant EDT 347, Master's Student), who used the research in conjunction with her master's thesis, and Sharon Ottenbreit (Dearborn Public

Schools), who was present during instructional periods in accordance with

compliance requirements of the Dearborn Public Schools. The other expert mentioned later

on, Joel Ottenbreit (Ann Arbor Public Schools), has knowledge of the A+ curriculum

and a firm understanding of elementary students' problem solving abilities and

processes.

Haigh Elementary School Demographics and Area

The Haigh Elementary community is an upper-middle-class community. The area consists mainly of white individuals; however, a large majority of the students within the white demographic are of Arabic descent.
Figure 3.1

The 48128 Zip Code Dearborn Ethnic Representation

[Pie chart of Dearborn demographics showing the proportions of African American, Asian, Hispanic, White, and American Indian residents.]

The median household income in Dearborn is $44,560, while the median

income for families in Dearborn is $53,060. According to the census survey taken in

1999, twelve percent of the families in Dearborn are below the poverty level (United

States Department of Commerce, 2003).

School Information

Haigh Elementary School is one of twenty elementary schools within the

Dearborn Public School District. The Dearborn School district is a large district

consisting of 17,129 students. Haigh Elementary School consists of 390 students in

kindergarten through fifth grade. Four percent of students who attend Haigh

Elementary receive free or reduced lunches (GreatSchools Inc, 2003).


Figure 3.2

Students' Ethnic Background at Haigh Elementary School

[Pie chart of student ethnicity showing the proportions of African American, Asian, Hispanic, and White students.]

Curriculum

All fifth grade students at Haigh currently use one standard mathematics

curriculum and one standard technology curriculum. The technology curriculum was

designed by Dearborn Public Schools and is implemented throughout the district.

The mathematics curriculum currently uses Mathematics Plus as the standard format

and textbook for Dearborn Public Schools.

Technology Curriculum

Students in the Dearborn School District primarily operated Macintosh computers. The training was conducted using both Windows-based, IBM-compatible computers and Macintosh computers. Macintosh computers operate in a similar manner to IBM-compatible computers, but there is a large difference concerning hardware: Macintosh computers typically do not allow manipulation of the system, as IBM-compatible computers do.


A survey taken by the Dearborn Public Schools in 1999 collected information

regarding the strengths of the current technology education curriculum. More than

three-fourths of elementary students (80%) surveyed had a computer at home. A large

majority (87%) of the students responded that they enjoyed using computers a lot.

According to the teachers, the usage of the computer labs greatly increased from the

original survey in 1992 (45%) to the most current survey in 2001 (80%). Students use

school computers for educational purposes such as drawing, painting, writing stories

and reports, educational programs, encyclopedia work, the Internet, games, and for

CD-ROM reference (K-12 Computer Curriculum Committee, 2001).

The students' prior computer troubleshooting skills were minimal due to a

lack of computer experience. The fifth grade technology curriculum is organized through a number of scope and sequence documents.

Math Curriculum

The current curriculum of problem solving involves mathematical story

problems based on the fifth grade math book Mathematics Plus, by Harcourt Brace and Company. The current math problem solving curriculum comprises

choosing strategies, methods of computation and operation, conducting simulations,

multi-step problems, relevant and irrelevant information and evaluating answers for

reasonableness. The curriculum expressed through Mathematics Plus, encompasses

several key elements


Figure 3.3

The Dearborn School District's Current Technology Curriculum

KNOWLEDGE
• Verify students' ability to start and quit applications
• Verify students' ability to log onto file server
• Verify students' ability to appropriately care for disks/CD's
• Reinforce students' ability to perform a warm start
• Reinforce students' ability to check cords and cables
• Develop students' ability to select hardware/software applications for tasks
• Develop students' ability to describe how technology meets human needs in the home,
school, community and workplace
• Develop students' ability to describe how people create, use and control technology
• Develop students' ability to identify technology related careers
• Develop students' ability to describe advances in technology and their impact on society
• Develop students' ability to identify the computer hardware components
• Explore students' ability to use multimedia software
• Verify students' ability to understand what the CPU, monitor, keyboard, mouse and data
storage drive are
• Reinforce students' ability to use the printer
• Reinforce students' awareness of the network versus the stand alone computer
• Develop students' ability to understand/use special keys (ESC, CTRL, etc... )
• Develop students' ability to use touch typing method
• Develop students' ability to know and use icons
• Master students' ability to understand and use menus, function keys and buttons
PROBLEM SOLVING
• Develop students' ability to identify problems; find ways in which computers can solve
problems
APPLICATION
Word Processing
• Develop students' ability to manipulate font (size, style, etc... )
• Reinforce students' ability to print
• Reinforce students' ability to use spell check, and thesaurus
Internet
• Explore students' ability to use a browser
• Explore students' ability to enter a site location
• Explore students' ability to retrieve electronic information
• Explore students' ability to cite internet references
• Explore students' ability to demonstrate responsible use of the internet by adhering to the
district's Internet Usage Policy
• Explore students' ability to use a search engine
GENERAL SKILLS
• Develop keyboarding skills
• Verify students' ability to point/click
• Master/Verify students' ability to click and drag
• Develop students' ability to insert/delete
• Develop students' ability to save
• Develop students' ability to copy, cut and paste
SOCIAL AND ETHICAL ISSUES
• Develop students' ability to describe the impact of technology on daily lives
• Develop students' ability to identify ways various technology is used in the home,
school, community and workplace
• Develop students' ability to respect privacy and ownership of individual or organization
information or product
• Develop students' ability to articulate that individuals are responsible for their
technological actions and decisions
• Introduce students' ability to adhere to copyright, patent, and freedom of information
laws related to using technology
• Develop students' ability to describe how technology impacts information, information
access, analysis, organization and utilization

First, students learn how to use a heuristic, which is a guide for thinking. The textbook promotes the following process: (a) understand the

problem, (b) plan a solution, (c) solve the problem, and (d) look back. The

curriculum secondly directs students to use different types of problem solving


strategies. The strategies introduced by Mathematics Plus were (a) make and use

tables, charts, and graphs; (b) make a list; (c) guess and check; (d) find a pattern; (e) draw and use pictures; (f) make and use models; (g) write a number sentence/equation; (h) work backward; and (i) solve a simpler problem. The

curriculum lastly requires the students to communicate the process of problem

solving. The textbook concentrates on problem solving in order to allow students to

apply the heuristic guide they have learned. The Mathematics Plus textbook includes

a problem of the day in every lesson, asking students to apply different skills for

various types of problems.

Research Design

The research project measured the difference of students' ability to problem

solve and academic achievement prior to and following the computer troubleshooting

curriculum. Measures consisted of students' scores and performances on the Profiles

of Problem-Solving Standardized Test (POPS), IOWA Basic Skills Math Test, the

hands-on problem solving test and a group interview following the training sessions.

The students' abilities to solve computer problems were also analyzed. This study

was conducted using six students for the experimental group and six students for the

control group. The research project used a pre-test/post-test quasi-experimental

research design with a control and experimental group. The experimental group

received the treatment, and both groups were pre-tested prior to the training and post-tested after the training was complete. The entire project protocol took place at Haigh

Elementary School over the course of two weeks. The researcher chose to perform
the study over the course of two weeks in order to reduce the external validity issues associated with the effects of information students gained from their regular schooling. Subject recruitment started after the HSIRB approval, and baseline data collection was initiated on May 19th, 2003. A timeline of the training sessions is presented in Figure 3.4.

Figure 3.4

Timeline of the Training Sessions

Week 1:
Day 1: Pre-Testing
Day 2: Pre-Testing/Getting to Know You and the Computer
Day 3: Lesson 1: Outer Hardware, Intro to Hardware on the Inside
Day 4: Lesson 2: Hardware on the Inside
Day 5: Lesson 3: Storage, Files and Folders, The Windows Desktop
Week 2:
Day 1: Lesson 4: Knowing Your System, Programs, Operating Systems, Computer Care and
Safety
Day 2: Lesson 5: Troubleshooting Real Problems
Day 3: Post-Testing
Day 4: Post-Testing

Computer Troubleshooting Curriculum

The primary purpose of the computer training sessions was to provide students

with the experience to apply problem solving methods in an authentic, hands-on

manner that would be easily transferable to other situations.

The experimental group received the training for forty-five minutes, over the

course of two weeks, on May 21st, 22nd, 23rd, 27th and 28th. The curriculum was

designed and instructed by the researcher. The researcher used a combination of


CompTIA's A+ objectives and Kids Domain computer lessons to create a

troubleshooting curriculum conducive to elementary learning. Lesson plans providing

detailed information on the training sessions can be found in the Appendix P.

A+ Curriculum

The CompTIA A+ certification is the industry standard for validating vendor-neutral skills expected of an entry-level computer technician. Technicians with A+ certification have a firm knowledge of and competency in computer hardware and operating systems, including installation, configuration, diagnosis, preventive

maintenance and basic networking. A+ certification provides the perfect outline for

objectives that must be accomplished in order to affirm successful ability to

troubleshoot computers (The Computing Technology Industry Association, Inc,

2002).

The training sessions were designed to mimic the CompTIA manual, but

written in a manner conducive to elementary students' ability to learn. The researcher

modified the program to meet the needs of elementary students. The training sessions

taught students the hardware components of a computer, how to build a computer,

how to troubleshoot basic problems associated with hardware and software, and other

functions associated with the computer. These learning objectives assisted in the

understanding of how a computer functions. A list of primary objectives is available in Figure 3.5.
Figure 3.5

Primary Objectives from the CompTIA A+ Curriculum

• Recognize and be able to state the name and purpose of each hardware element as
listed below
Motherboard
Power Supply
Processor /CPU
Memory
Storage devices
Monitor
Modem
Peripheral
BIOS
CMOS
LCD (portable systems)
Ports
PDA (Personal Digital Assistant)

• Insert and remove all hardware as specified below


Examples of modules: Motherboard; Storage device; Power supply; Processor/CPU; Memory; Input devices; Hard drive; Keyboard; Video Card; Mouse; Network Interface Card (NIC)

Portable system components: AC adapter; Digital camera; PC card; Pointing devices
• Install and configure hard drives, video cards, printers, processors and memory.
• Diagnose symptoms and problems related to computer hardware.
• Troubleshoot basic hardware problems through typical procedures.
• Use preventative maintenance in order to ensure the safety and upkeep of the
technology, themselves and others.
• Use safety procedures in order to ensure the safety and upkeep of the technology,
themselves and others.
• Troubleshoot, care for and service printers.
• Indicate which operating system a computer is currently running and the main
functions of an operating system.
• Install, configure and upgrade the Windows XP operating system.
• Install and launch applications.
• Diagnose and troubleshoot basic problems associated with the operating
system and applications.

Kids Domain Computer Lessons

Kids Domain.Com offers a large variety of child-friendly activities and lesson plans. The explanation, organization and presentation of the material were targeted

toward elementary students. The curriculum for the computer troubleshooting

training sessions was adapted from the Kids Domain website. The researcher

contacted the company for permission to use the material, and was granted permission

by the company through email. The email documentation can be found in Appendix

V.

Instructional Method

The students received instruction through hands-on learning experiences: physically installing and troubleshooting main parts of a computer, increasing


knowledge through interaction with the components and problems. The students also

received troubleshooting simulations, providing real experience with problem-solving

situations. These authentic learning situations supplied students with the opportunity

to apply their new skills and solutions, while receiving immediate feedback. The

students were given handouts and other supplementary learning tools to enhance

learning within the curriculum. The students were instructed through individual, team

and whole group settings, providing various formats/strategies of instruction in order

to solidify learning concepts. Demonstrations and discussions pertaining to specific

problems and different solutions were presented to the whole group. Teams of two

students were implemented for all hardware and software installations in order to

facilitate questioning and answering between pairs. This configuration was able to

fully utilize the equipment to produce the maximum learning experience for the

students. Worksheets and individualized instruction were provided in order to further

solidify the learning concepts.

Subjects

The subject selection was limited to students who submitted a signed permission slip and parent consent form. Students without these completed documents were not eligible to participate. From the students who volunteered to participate in the training, subjects were selected using specific criteria. The first

disaggregate was gender, due to the vast differences in computer and technical ability

found in previous research in relation to gender (Fey, 2001; Frantom, Green &

Hoffman, 2002; Suomala & Alajaaski, 2002). The researcher first selected three girls
and three boys for the experimental group and three girls and three boys for the

control group. The second disaggregate was based on the students' school attendance

record over the past year. Students with a good attendance record were the most

desired since absences and tardiness could have affected the end results of the project.

The third disaggregate was the level of problem solving abilities. The researcher had

the fifth grade teachers rank the students on their problem solving ability; low,

medium, and high. The researcher did not attempt to collect any particular scores;

rather, the focus of the level of problem solving ability was used to match students

between the experimental and control groups. To ensure the lack of favoritism or

perception of favoritism in selecting students for the experimental versus the control

group, the matched subjects were randomly assigned to groups. Though recruiting a

true random sample was extremely unlikely, this procedure achieved the closest

random sample approximation possible, given the nature of voluntary participation of

students.
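To make the matched-pair assignment concrete, the procedure can be sketched in a few lines of Python. This sketch is illustrative only and was not part of the study materials; the pair labels are hypothetical placeholders, and Python's standard random module stands in for whatever randomization method was actually used.

import random

# Hypothetical matched pairs: each pair shares gender and a teacher-rated
# problem solving level (high, medium, or low).
matched_pairs = [
    ("Girl A", "Girl B"),  # high ability girls
    ("Girl C", "Girl D"),  # medium ability girls
    ("Girl E", "Girl F"),  # low ability girls
    ("Boy A", "Boy B"),    # high ability boys
    ("Boy C", "Boy D"),    # medium ability boys
    ("Boy E", "Boy F"),    # low ability boys
]

experimental, control = [], []
for pair in matched_pairs:
    # Shuffle each pair so that neither group is favored; one member goes
    # to the experimental group and the other to the control group.
    first, second = random.sample(pair, 2)
    experimental.append(first)
    control.append(second)

print("Experimental group:", experimental)
print("Control group:", control)

Because each pair is split between the two groups, the groups stay balanced on gender and teacher-rated ability while the assignment of individuals remains random.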

Criteria for Students in the Control Group

The student subjects were not selected, as all of the students in the fifth grade

were invited to participate. Subjects were volunteers and permission-based

participation was required. Students involved in the control group were selected

based on gender, attendance, and similar problem solving skills compared to students

in the experimental group in order to decrease the possible threats to the validity of

the data.

Criteria for Students in the Experimental Group

The student subjects were not selected, as all of the students in the fifth grade

were invited to participate. Subjects were volunteers and permission-based

participation was required. Students involved in the experimental group were

selected based on gender, attendance, and similar problem solving skills to students in

the control group in order to decrease the possible threats to the validity of the data.

Variables

Independent Variables

Treatment

The computer troubleshooting activity was videotaped to ensure the instructor

exhibited no favoritism or other forms of bias during the sessions. The videotape

was viewed and evaluated at a later time by an A+ expert, Joel Ottenbreit. The expert

indicated no researcher biases existed.

Subject Selection Factors

The subjects were selected based on several independent variables.

Attendance

Attendance was a large factor in the selection of subjects. The training sessions contained so much information that it was essential for students to attend all training sessions and be punctual. The teachers were asked to rate the students on their attendance and number of tardies over the past semester. Students rated with a high number of tardies and absences were excluded from the study in order to eliminate as many extraneous variables as possible.

Problem Solving Skills

The researcher had the fifth grade teachers rate the students' problem solving

abilities as (a) high, (b) medium, or (c) low. Once subjects with multiple tardies and

absences had been dismissed from the project and the research subjects had been

separated by problem solving ability, students were matched based on gender.

Gender

The researcher selected students based on gender due to the large difference

between boys and girls in learning and computer activities. The researcher was able

to select three girls and three boys for each group in order to balance the groups for comparison purposes. The information was collected from the permission slips

returned by the students who wanted to participate in the study. The researcher

randomly selected three pairs of matched problem solving ability girls and three pairs

of matched problem solving ability boys and placed one student from each pair in the

experimental group and the other student in the control group.


Figure 3.6

POPS Test Results Compared Between the POPS Administrative Test and the Computer Troubleshooting Study Students

[Bar chart comparing the computer troubleshooting study students' mean pre-test scores with the POPS Administrative Test mean scores across the five POPS categories: Correctness of Answer, Methods Used, Accuracy, Extracting Information, and Quality of Explanation.]

When the computer troubleshooting study students were compared to the

POPS Administrative Test, performed by the testing company involving 371 subjects,

the subjects involved in this research project scored lower in the Correctness of

Answer, Methods Used and Extracting Information sections. However, they scored

higher, on average, in the Accuracy and Quality of Explanation sections, than their

peers involved in the POPS Administrative Test. However, most differences were not

substantial, and therefore, students chosen were representative of their peers based on

the POPS Administrative Test preformed by the authors.


Dependent Variables and Data Collection Process

There were multiple dependent variables measured in the course of this

project. For each method of assessment, the students received a code test cover sheet

indicating the student's name, code name and method of assessment (See Appendix I). These measures were for organizational purposes only. The tests and other forms

of assessment were identifiable only by code numbers. The master code list was only

available to the researchers.

Testing Procedures

Before every test or training, students were read the student assent form,

indicating they would receive no extra credit, and even if they agreed to participate

they could change their minds at any time throughout the testing or training (See

Appendix B). They were also reminded that they were volunteers and were free to

stop participating whenever they chose without any penalties for quitting.

Once the method of assessment was completed, the students were instructed to

notify the researcher. The researcher would then place the answer sheet and test booklet into a manila folder. The students who finished were given the next

assessment until all methods of assessment were completed.

The students completed the twenty-six item IOWA Basic Skills Math Problem Solving and Data Interpretation section of the IOWA test. The mathematical

problem solving standardized test took approximately twenty to twenty-five minutes

to complete and was conducted in Room 4 of Haigh Elementary School under the
supervision of a Dearborn Public School teacher. The teacher, as well as the researcher, was available to help read the items and answer questions. The researcher

read the assent form to the students before testing. To ensure confidentiality, a cover

sheet was included listing the student's name and ID number. The student's converted

ID number was the only form of identification on all assessments. The converted ID

number was randomly assigned by the researcher. The code sheet containing the

student's name, student identification number and the test score will be kept in a

locked file cabinet in a university office that only researchers will have access to. The

researcher explained each test's instructions, asked the students to complete the

standardized tests and raise their hands when finished. As each individual completed

the first test, the researcher distributed the second standardized test, POPS, with the

code sheet and explained the instructions individually. Once the tests were

completed, the assessment was placed in the corresponding folder for organizational

purposes. The second problem solving test, POPS, took approximately thirty minutes

to complete and was administered in the same manner on the same day. Due to time

constraints, the students were able to take two days for pre-testing and two days for

post-testing. The tests were collected in the same manner as the first standardized

test. Throughout the testing period, students were randomly asked to take the hands-on problem solving test, which was videotaped in the back of the classroom. Once

the students had completed all three tests, the students were given a self-assessment

survey. However, due to the time constraints of two days for pre-assessment and two

days for post-assessment, not all students were able to finish the survey.
ETS Services

The researcher used two of the five evaluation materials from the ETS Test

Collection. The Educational Testing Service Test Collection Database allows access

to tests according to the terms of use reproduced in Figure 3.7.

IOWA Basic Math Skills Test

The IOWA Test of Basic Skills Math Problem Solving and Data Interpretation

measured the subjects' mathematical problem solving ability. Students in the

experimental group and control group both were pre-tested before school in Room 4.

The students were given instructions from the IOWA Test of Basic Skills Teacher's

Manual. The test took approximately thirty minutes for each student to complete,

containing twenty-six questions. The students recorded their answers on a multiple

choice standardized testing answer sheet. The answer sheets were collected and

scored by the researcher using the IOWA Test of Basic Skills answer key. The data

was compared between pre-test and post-test scores to establish differences. In

addition, the data will be part of the formative evaluation process to assess

effectiveness of program activities and to guide development of future programs.

The researcher chose to use the IOWA Basic Skills Math Test to measure the

students' mathematical problem solving ability, due to the recommendation obtained

from the Mental Measurements Yearbook Review Online (Figure 3.8). The

researcher only used the Math Problem Solving and Data Interpretation section of the

test, in order to keep testing time to a minimum. Brief information concerning the
IOWA test is located below, but a full description, including reviews, is located in

Appendix M.

Figure 3.7

ETS Services Disclaimer

PLEASE READ THESE TERMS OF USE CAREFULLY BEFORE USING THE TEST
COLLECTION DATABASE. BY USING THE DATABASE, YOU AGREE TO THESE
TERMS OF USE. IF YOU DO NOT AGREE TO THESE TERMS OF USE, PLEASE DO
NOT USE THE DATABASE.

"The ETS Test Collection provides microfiche copies of certain unpublished test as a service to
educators and psychologists. It is hoped that these materials will provide users with creative ideas
for the development of their own instruments, or, in some instances, with measures of attributes for
which no published tests are available.

The materials included on the microfiche may be reproduced by the purchaser for his own use until
otherwise notified by ETS or the author. Permission to use these materials in any other manner must
be obtained directly from the author. This includes modifying or adapting the materials, and selling
or distributing them to others. Any copyright notice or credit lines must be reproduced exactly as
provided on the original.

Typically, the tests included in this service have not been subjected to the intensive investigation
usually associated with commercially published tests. As a consequence, inclusion of a test does not
imply any judgment by ETS of the quality or usefulness of the instrument. The purchaser must
assume full responsibility for controlling access to these materials, the manner in which they are
used, and the interpretation of data derived from their application.

It is recommended that access to these microfiche be limited to students conducting research, staff
members of professionally recognized educational and psychological institutions or organizations,
and individuals who are members of the American Educational Research Association, the American
Psychological Association, the National Council on Measurement in Education, or the Association
of Measurement and Evaluation in Guidance. The qualifications of others not in these categories
should receive careful consideration.
Finally, the purchaser is urged to provide information about his use of these materials directly to the
authors. Many cooperating authors are interested in collecting data on their instruments, which will
make them more useful to others. Therefore, it is to the advantage of everyone concerned - authors,
present users, and users in the future - that purchasers recognize their professional responsibility to
initiate such communication. The address of the author of each instrument as of the date on which
the series is released is listed on this notice that appears first on each download test."
(http://testcollection.ets.org/cgilswebmnu.exe?ini = TESTCOLL&act = 3&/ang=&uid=pub/ic&idck=
&eid = &tid = 8955229401-0j)

POPS Problem Solving Test

The second standardized test used in order to assess students' ability to

process problem solving situations was the Profiles of Problem-Solving (POPS)

Standardized Test. Students began this test after completing the IOWA Basic Math

Skills Test and were given individualized instructions for the test from the researcher.

The 6-question test took approximately 20-25 minutes for each student to complete

and was given in the same room as the IOWA Basic Math Skills. The students

recorded their answers on an answer sheet, explaining their answers with drawing,

words and number sentences. The answer sheets were collected and scored by the

researcher using the POPS Answer Booklet.

Figure 3.8

IOWA Test of Basic Skills Review

Test Name: Iowa Tests of Basic Skills, Forms K, L, and M


Test Author: Hoover, H. D.; Hieronymous, A. N.; Frisbie, D. A.; Dunbar, S. B.
Publication Date: 1955-1996
Scores: Vocabulary, Listening, Language, Language Total, Mathematics, Core Total, Word Analysis
Information, Composite, Language Advanced Skills, Mathematics Advanced Skills, Survey
Battery Total, Reading Comprehension, Spelling, Capitalization, Punctuation, Usage and
Expression, Mathematics Concepts and Estimation, Mathematics Problem Solving and Data
Interpretation, Mathematics Total, Maps and Diagrams, Reference Materials, Sources of
Information Total, Composite
Reviewer: Brookhart, Susan M.; Cross, Lawrence H.
Review Indicator: 2 reviews available
Publisher: The Riverside Publishing Company 8420 Bryn Mawr Ave Chicago IL 60631
Acronym: ITBS
Mental Measurements Yearbook: 13 Mental Measurements Yearbook
Accession Number: 13012057

A. Purpose
"To provide a comprehensive assessment of student progress in the basic skills."
B. Population
Grades K.1-1.5, K.8-1.9, 1.7-2.6, 2.5-3.5, 3, 4, 5, 6, 7, 8-9; Levels 5, 6, 7, 8, 9, 10, 11, 12, 13, 14.
C. Scores
Vocabulary, Listening, Language, Language Total, Mathematics, Core Total, Word Analysis
(optional), Mathematics Advanced Skills, Mathematics Total, Reading Advanced Skills,
Reading Total, Reading, Listening Language, Mathematics Concepts, Mathematics
Problems, Mathematics Computation [optional], Social Studies, Science, Sources of
Information, Composite, Language Advanced Skills, Mathematics Advanced Skills, Survey
Battery Total, Reading Comprehension, Spelling, Capitalization, Punctuation, Usage and
Expression, Mathematics Concepts and Estimation, Mathematics Problem Solving and Data
Interpretation, Mathematics Total, Maps and Diagrams, Reference Materials, Sources of
Information Total, Composite.
D. Time
(130-310) minutes for Complete Battery; (100) minutes for Survey Battery
E. Comments
Part of Riverside's Integrated Assessment System; Braille and large-print editions available

The researcher chose the test due to the recommendation obtained from the

Mental Measurements Yearbook Review Online (Figure 3.9). Once the testing

agency was contacted in order to purchase the test, the agency informed the researcher that the name of the test had been changed from the Surveys of Problem Solving (SPRS) to the

Profiles of Problem Solving (POPS).

The POPS test divided each student evaluation into five separate categories.

The first category, Correctness of Answer, was a measure of whether the answer was

correct. The Methods Used category measured the approach used, focusing more on
the plan rather than the calculations. The more systematic the plan the student developed, or if the student used a pattern, the higher the score. The Accuracy

category measured the ability to calculate, focusing more on the mathematical aspect

of problem solving. The Extracting Information category measured the extent to

which the student understood the problem, relevant facts and relationships between

variables. Lastly, the Quality of Explanation category measured the student's ability

to communicate the problem solving process. Each category was present in multiple

questions and graded separately for each student. Every category was divided into

three possible levels of achievement: beginning, developing and advanced. Each student was graded on the pre-test and post-test using these three levels.

Brief information concerning the POPS test is located below (See Figure 3.9)

but a full description, including reviews, is located in Appendix N.

Hands-On Problem Solving Test

Throughout each testing period, students were randomly asked to take the hands-on problem solving test, which was videotaped in the back of the classroom. The researcher allowed students ten minutes to complete two activities. The videotape provided evidence of the problem solving methods the students used. The researcher reviewed the hands-on problem solving

test at a later time, evaluating through the use of the hands-on problem solving rubric

(See Appendix J). The hands-on problem solving test was based on problem solving activities using tangrams.
Figure 3.9

POPS - Profiles of Problem Solving Review

Test Name: Profiles of Problem Solving


Test Author: Stacey, Kaye; Groves, Susie; Bourke, Sid; Doig, Brian
Publication Date: 1993
Scores: 5: Correctness of Answer, Method Used, Accuracy, Extracting Information, Quality of
Explanation
Reviewer: McLellan, Mary J.; Medina-Diaz, Maria
Review Indicator: 2 reviews available
Publisher: Australian Council for Educational Research 19 Prospect Hill Road Private Bag 55
Camberwell Victoria 3124 Australia
Acronym: POPS
Mental Measurements Yearbook: 13 Mental Measurements Yearbook
Accession Number: 13081532

F. Purpose

"An assessment of mathematical problem solving designed for children in upper primary school".
G. Population

Grades 4-6.
H. Price

1993 price data: $75 per manual and photocopiable masters.


I. Administration

Group
J. Scores

5: Correctness of Answer, Method Used, Accuracy, Extracting Information, Quality of


Explanation.
K. Manual

Manual, 1993, 64 pages


L. Time

[32]40 minutes
M. Comments

The researcher established two activities for the hands-on test: an easy and a difficult tangram problem. The students had a maximum of five minutes to complete each problem to the best of their ability. The post-test each student was given utilized the
same tangram set as the pre-test. The book and tangrams used can be found in Appendix O. Dr. Poole, Sharon Ottenbreit and the three fifth grade teachers assessed

the tangram problems, in order to assure the test was at an appropriate level. The test

assessed the different methods students used solving hands-on problems requiring

manipulation, the amount of time taken to solve the problem and the number of

attempts made by the student. The researcher compared data on methods students

used to solve problems prior to, as opposed to after the training. The researcher also

compared data between the control and experimental groups to see if there was any

difference between groups.

Survey

The student self-assessment surveys gathered the students' perception of their

own problem solving abilities and methods they consciously apply to problems. The

survey provided information on how students solve problems prior to the training.

The surveys also provided a comparison for the group interviews, which followed the

training and served as a post-test. However, due to time constraints, not all of the students could complete the surveys. The data was part of the process to better design the

computer troubleshooting activities and to guide development of future educational

technology programs. The survey was modified from the "Student Thinking About Problem Solving Scale" used in a previous thesis project (Armour, 1986).

The survey was a last priority for pre-assessment and was given to the students

if they were able to complete all the tests within the first two days of the project.

Therefore, not all students were able to complete the surveys. The survey consisted of
ten questions. The first seven questions used the Likert scale to evaluate how

students felt about problem solving and their problem solving ability. The last three

questions were short response answers concerning methods they use to solve

problems. Surveys included the cover sheet consisting of the student's name and

assigned ID number; only the randomly assigned ID number appeared on the actual survey.

Only the researchers had access to the name/ID list. When finished with the survey,

the students returned the survey to the researcher. The sample survey, as well as the

survey it was adapted from is located in the Appendix G.

Group Interview

The researcher conducted the interview as a group interview, due to time

constraints. The interview was originally planned as a form of post-assessment to the survey, but since not all students completed the survey, answers were combined

from the survey and interview to create a large database of information. Students

from the control and experimental group were both present in the same room, at the

same time. Each student was supplied with an interview sheet, pencil and clipboard

so the interview could be conducted in a circle, on a floor rug. The researcher read

each question aloud, and after every child was finished, the students shared their

responses with the group. The group interview was videotaped in order to collect all

of the anecdotal data contributed through oral conversations. The researcher

transcribed the videotape at a later time. As in the survey, the first seven questions

were based on a Likert rating scale. The last three questions were short response

questions related to problem solving methods. An additional question was added at


the end of the interview, asking the students' opinions of whether the computer

troubleshooting training sessions could make a difference on problem solving

abilities. The list of interview questions is attached in the Appendix H.

Troubleshooting Activity

The final computer troubleshooting activity was videotaped for additional

data. In the troubleshooting activity, the students worked in teams of two on the

stations. There were seven stations total for the students to complete. At each

station, there was a different problem the students needed to identify, solve/fix and

check to see if they accomplished the solution. The students contacted the researcher

to verify the completion of a station and the station was then prepared for the next set

of students. Each station was equipped with a worksheet for students to record their

answers. These worksheets were collected for data. Examples of the worksheets, as

well as the spreadsheet of answers to each question can be found in Appendix W.

The videotape and completed worksheets documented the students' ability to

solve common computer problems. In addition to this data collected, additional

observations from the instructor helped analyze the level of troubleshooting skills.

Data Analysis

The data from the standardized test scores and hands-on problem solving time scores were analyzed along with other factoring variables. The results of test scores form a two-by-two design that was

analyzed with a Paired-Sample T-Test. The SPSS program was used to compute

correlation statistics and appropriate t-test analyses. The computed correlation statistics, means and standard deviations, percentage frequencies, and correlation coefficients were used to draw inferences from the collected data. However, since the samples were

so small the results were inconclusive. Comparisons between group scores and

student responses were analyzed for disparities and congruencies between their

perceptions and scores concerning problem solving. Variables measured through the

observation, survey and interview data were also analyzed in conjunction with data

from the rest of the database. The ordinal data collected from the hands-on problem

solving test scores was analyzed. Data was compared between the control group

improvements and experimental group improvements. The surveys and interviews

were used to gather information on student problem solving methods and processes.

Through the addition of extensive analysis, the researcher expected to be able to make

summative evaluation statements regarding the computer training program and

estimate its effectiveness on problem solving.
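Although the analysis was carried out in SPSS, the same paired-sample t-test can be reproduced with open-source tools. The sketch below is offered only as an illustration and uses made-up scores, not the study's data; it assumes Python with the scipy package installed.

from scipy import stats

# Hypothetical paired scores for one group of six students: position i
# holds student i's pre-test and post-test scores on the same measure.
pre = [14, 12, 16, 10, 13, 15]
post = [18, 13, 17, 12, 15, 16]

# Paired-sample t-test: tests whether the mean post-minus-pre difference
# differs from zero.
result = stats.ttest_rel(post, pre)
print(f"t = {result.statistic:.3f}, p = {result.pvalue:.3f}")

As noted above, with only six paired observations per group, a non-significant p-value is inconclusive rather than evidence of no effect.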

Hypothesis 1

The first hypothesis was that elementary students would acquire the ability to

troubleshoot common computer problems successfully. In order to evaluate this

hypothesis, three separate collections of data were analyzed. The first data collected

and analyzed for hypothesis one was the Troubleshooting Activity. The researcher

divided the students up into teams of two and documented the teams as they
attempted to solve the different computer station problems. The researcher

documented whether students were able to solve the common computer problems at

each station.

Hypothesis 2

The second hypothesis was that elementary students involved in the computer troubleshooting curriculum would improve their problem solving methods as compared to students who did not participate. In order to evaluate this hypothesis, two

separate collections of data were analyzed. The first data collected and analyzed was

the Methods Used section of the POPS - Profiles of Problem Solving test. The researcher

compared pre-test and post-test scores to establish the improvement of each student,

as well as average improvements for each group. The researcher also analyzed the

hands-on problem solving test, including number of attempts, time completion of the

easy problem, and the number of students in each group able to solve the difficult

problem.
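As a minimal sketch of the gain-score comparison this hypothesis calls for, the snippet below computes each group's mean pre-to-post improvement; the (pre, post) values are hypothetical placeholders rather than scores from the study.

# Hypothetical POPS "Methods Used" scores as (pre, post) pairs, one per student.
experimental = [(4, 6), (3, 5), (5, 5), (2, 4), (4, 5), (3, 6)]
control = [(4, 4), (3, 4), (5, 5), (2, 3), (4, 4), (3, 3)]

def mean_improvement(pairs):
    """Average post-minus-pre gain across a group."""
    gains = [post - pre for pre, post in pairs]
    return sum(gains) / len(gains)

print("Experimental mean gain:", mean_improvement(experimental))
print("Control mean gain:", mean_improvement(control))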

Hypothesis 3

The researcher assessed the most difficult procedure in problem solving for

elementary students by using findings from the Profiles of Problem Solving test.

Each category of the problem solving process was separately analyzed by observing

pre-test scores of the students. The second source of findings used to discover the

most difficult problem solving procedure was the survey/group interview. The

researcher grouped the responses into categories based on key phrases and evaluated

the most identified problem.


Hypothesis 4

Information was collected from the IOWA Basic Skills math problem solving and data interpretation test and the group interview, in order to analyze whether mathematical ability would be affected by the training sessions. The IOWA test

results were analyzed for the average scores of each group. The group interview

provided additional anecdotal data.

Hypothesis 5

Gender differences were evaluated through the Profiles of Problem Solving

test and the hands-on problem solving test. The POPS pre-test and post-test total

scores were compared between genders, as well as each category of the POPS test.

Additional results were collected from the hands-on problem solving test. The

researcher compared the completion time for the easy problem from the pre-test to the

post-test. The findings for the number of attempts to solve each problem were also

studied from the hands-on problem solving test to analyze the effects of gender on the

results of the study.

Hypothesis 6

Students classified by teachers as having high, medium and low problem

solving abilities were compared to evaluate if the variable was a significant factor.

Average improvements in total POPS score for each group were analyzed. The

hands-on problem solving test was analyzed for improvement in time completion of

the easy problem and the increase in the average number of attempts between the pre-test and the post-test. The number of students able to solve the difficult problem was
also analyzed.

Hypothesis 7

Information from the Profiles of Problem Solving test, the hands-on problem

solving test and the survey/interview were all used to investigate the final hypothesis.

The mean total POPS score was averaged for each group, comparing the pre-test total

score to the post-test total score. The hands-on problem solving test results were

evaluated for each group, analyzing information on improvement in time completion

of the easy problem, the average number of attempts in both problems from the pre-test to the post-test, and the percentage of students able to solve the difficult problem.

The last collection of data analyzed to address the hypothesis was the final group

interview. Question eleven provided an additional source of information.

Summary

Table 3.1

Pre-Testing and Post-Testing Organization

Test | Pre/Post | Date | Method of Evaluation

PRE
• IOWA Math Test | Pre-Test | 5/19-5/20 | 26-item test; questions correct out of 26
• POPS - Profiles of Problem Solving Standardized Test | Pre-Test | 5/19-5/20 | 6-item test; answers based on POPS Answer Booklet
• Hands-On Problem Solving Test: Tangrams (Easy & Difficult Problem) | Pre-Test | 5/19-5/20 | Attempts; time; solution correct/incorrect
• Survey | Pre-Test to interview | 5/19-5/20 | Likert scale; anecdotal data

POST
• IOWA Math Test | Post-Test | 5/29-5/30 | 26-item test; questions correct out of 26
• SPRS Problem Solving Standardized Test | Post-Test | 5/29-5/30 | 6-item test; answers based on POPS Answer Booklet
• Hands-On Problem Solving Test: Tangrams (Easy & Difficult Problem) | Post-Test | 5/29-5/30 | Attempts; time; solution correct/incorrect
• Interview | Post-Test to survey | 5/29-5/30 | Likert scale; anecdotal data
• Troubleshooting Activity | Post-Test (Experimental Only) | 5/28 | Correct solutions; able to apply problem solving methods
CHAPTER IV

FINDINGS

Introduction

Six students participated in five hours of computer troubleshooting training,

over the course of two weeks in order to increase their knowledge of computer

troubleshooting, thereby improving their ability to solve problems. The prediction for

the research project was that over the course of training, the subjects would improve

in their ability to solve computer problems. It was further predicted that as the

subjects began to improve in their ability to solve computer problems, they would

also begin to improve in general problem solving ability. Finally, it was predicted

through the comparison of pre and post assessments, students would show evidence

of improved achievement in general problem solving and mathematical problem

solving achievement.

Explanation of Student's Profile

Each student's results are described in a brief profile below. The specific data

and information pertinent to each hypothesis are described later in the chapter. The

individual's profile begins with a description of each student's demographic

information and a short summary of the student's results. Each student's summary

includes charts of their results on each form of assessment. The first chart describes

basic information about the student and their results on the IOWA Basic Skills math

problem solving and data interpretation subtest.


Table 4.1a

Example Table of the Columns Explained in Table A.

Student Identification and IOWA Results

• Student Code Number: the student's randomly assigned code number (1-12)
• Group: Control (C) or Experimental (E)
• Gender: Male (M) or Female (F)
• Teacher-Rated Problem Solving Ability: high, medium, or low
• IOWA Pre-Test: the number correct out of 26 on the pre-test
• IOWA Post-Test: the number correct out of 26 on the post-test
• IOWA Difference: the number of points difference between the pre-test and the post-test

The second table shows the student's results on the easy problem of the Hands-On Problem Solving Test.

Table 4.1b

Example Table of the Columns Explained in Table B.

Easy Hands-On Problem Solving Results

• Pre/Post Attempts: how many times the student attempted the problem again during the pre-test or post-test
• Pre/Post Time: how long it took the student to complete the problem during the pre-test or post-test
• Pre/Post Solution: whether the student solved the problem correctly, solved it incorrectly, or did not solve it during the pre-test or post-test


The third table shows the student's results on the difficult problem of the Hands-On Problem Solving Test.

Table 4.1c

Example Table of the Columns Explained in Table C.

Difficult Hands-On Problem Solving Results

• Pre/Post Attempts: how many times the student attempted the problem again during the pre-test or post-test
• Pre/Post Time: how long it took the student to complete the problem during the pre-test or post-test
• Pre/Post Solution: whether the student solved the problem correctly, solved it incorrectly, or did not solve it during the pre-test or post-test

The final table shows the POPS, Profiles of Problem Solving Test. The test is

broken up into five separate sections: Correctness of Answer (COA), Methods Used

(MU), Accuracy (A), Extracting Information (EI) and Quality of Explanation (QE).

Each of these sections evaluated a different component of problem solving ability,

and each had its own range of possible points. The total score is also listed.


Table 4.1d

Example Table of the Columns Explained in Table D.

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

For each category (COA, MU, A, EI, QE), "Pre" and "Post" list the number of correct points on the pre-test and the post-test, respectively. "Total: Pre" and "Total: Post" list the combined number of points on all of the pre-tests and all of the post-tests.

Profile of Each Student's Assessment Data

Female

Student 1 Control

Subject #1 was a female student in the control group, who was rated by her

teacher as a high ability problem solver. The student had good attendance throughout
the school year. The student was matched up with student number 6 in the

experimental group. The student was shy, quiet and reserved during all periods of

assessment. The student stayed constant in the pre and post assessments of the easy

hands-on problem solving test, solving both correctly. She also maintained the same

number of attempts and the amount of time to complete the difficult hands-on

problem solving test; however, she solved the problem correctly during the post-test.

She was the only student to dramatically improve on the IOWA test. The student's

results on the POPS post-test increased in the Correctness of Answer, Accuracy and

Extracting Information sections, improving her total POPS score by four points.

Below is documentation of her scores throughout the project.

Table 4.2a

Student Identification and IOWA Results

Student   Group   Gender   Teacher-Rated Problem Solving Ability   IOWA Pre   IOWA Post   IOWA Difference
1         C       F        High                                    14         21          7

Table 4.2b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

1 min 1 min Solved Correctly Solved Correctly

Table 4.2c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

3 3 5 5 Not Solved Solved Correctly


Table 4.2d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

1         2          4        4         3       4        4        6         2        2         14          18

Student 2 Experimental

Subject #2 was a female student in the experimental group, who was rated by

her teacher as a high ability problem solver. The student had good attendance

throughout the school year. The student was matched up with student number 5 in

the control group. The student was shy, quiet and polite during all periods of

assessment. The student participated actively in all computer and troubleshooting

activities. The student showed knowledge and understanding of the topic as the

training sessions proceeded. The student's results showed slight differences between

the pre and post easy hands-on problem solving test, solving both correctly. She

increased the number of attempts on the difficult problem during the post-test;

however, she was unable to solve the problem during either test. Her two-point

improvement on the IOWA test was consistent with the other subjects. The student's
results on the POPS post-test increased in the Correctness of Answer, Methods

Used, Extracting Information and Quality of Explanation sections, improving her

total score by ten points. This student's improvement in the POPS test was the most

dramatic of all of the subjects. Below is documentation of her scores throughout the

project.

Table 4.3a

Student Identification and IOWA Results

Student   Group   Gender   Teacher-Rated Problem Solving Ability   IOWA Pre   IOWA Post   IOWA Difference
2         E       F        High                                    21         23          2

Table 4.3b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

3 2 2.25 2.5 Solved Correctly Solved Correctly

Table 4.3c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

4 5 5 5 Not Solved Not Solved


Table 4.3d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

8 11 8 12 8 8 6 8 5 6 35 45

Anecdotal Data:

When asked: Do you think that learning how to troubleshoot a computer helped you

with your problem solving skills? Why or why not?

Her Response:

"Yes. Because you had to figure out what the problem is and you have to think of the

solution of the problem."

Student 3 Control

Subject #3 was a female student in the control group, who was rated by her

teacher as a medium ability problem solver. The student had good attendance

throughout the school year. The student was matched up with student number 4 in

the experimental group. The student was shy, quiet and reserved during all periods of

assessment. The student quickly solved the easy hands-on problem solving post-test,

improving her completion time by 30 seconds. During both parts of the post-test for
the hands-on problem solving test, the student had fewer attempts on the problem.

She also was able to solve the difficult hands-on problem solving tests during the

post-test. Her IOWA score improved two points, which was a typical result. The

student's results on the POPS post-test increased in the Correctness of Answer,

Methods Used and Extracting Information sections, improving her total POPS score

by six points. Below is documentation of her scores throughout the project.

Table 4.4a

Student Identification and IOWA Results

Student Group Gender Teacher-Rated Problem Solving IOWA Pre IOWA Post IOWA Difference

Ability

3 C F Medium 22 24 2

Table 4.4b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

5 1.5 Solved Correctly Solved Correctly

Table 4.4c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

8 2 5 5 Not Solved Solved Correctly


Table 4.4d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

5 9 7 8 6 6 5 6 6 6 29 35

Student 4 Experimental

Subject #4 was a female student in the experimental group, who was rated by

her teacher as a medium-ability problem solver. The student had good attendance

throughout the school year. The student was matched up with student number 3 in

the control group. The student was outgoing, methodical and confident during all

periods of assessment. Although there is no documentation, the researcher believed

this student to be the most knowledgeable about computers prior to the

training sessions. The student was occasionally too eager to participate in the hands­

on segment of the training and constantly interrupted the instructor to ask questions.

The student stayed constant in both parts of the pre- and post- hands-on problem

solving test, solving both correctly. She also maintained a similar number of attempts

and time on both parts of the pre- and post- hands-on problem solving tests;

however, she took more time to solve the difficult part during the post-test. She did
not improve on the IOWA test. The student's results on the POPS test increased in

the Methods Used, Accuracy, Extracting Information and Quality of Explanation

sections, while decreasing her scores on the Correctness of Answer section. She

improved her total POPS score by nine points. Below is documentation of her scores

throughout the project.

Table 4.5a

Student Identification and IOWA Results

Student Group Gender Teacher-Rated Problem-Solving IOWA Pre IOWA Post IOWA Difference

Ability

4 E F Medium 20 20 0

Table 4.5b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

2 1.25 Solved Correctly Solved Correctly

Table 4.5c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

2 2 2 4 Solved Correctly Solved Correctly


Table 4.5d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

6 5 4 7 5 6 3 6 3 6 21 30

Anecdotal Data:

When asked: Do you think that learning how to troubleshoot a computer helped you

with your problem solving skills? Why or why not?

Her response:

"Yes. Because you learned how to fix things easier when we did the computer. I

think yes because we actually learned like, cause you didn't help us that much. You

just kind of took apart the computer and we had to think of all the parts that were

missing and stuff. We did it in like art. The printer wasn't working; it was the same

problem here. So we pressed the button and it worked."

Student 5 Control

Subject #5 was a female student in the control group, who was rated by her

teacher as a high-ability problem solver. The student had good attendance throughout

the school year. The student was matched up with student number 2 in the experimental
group. The student was shy, quiet and reserved during all periods of assessment.

The student stayed constant in both sections of the pre- and post- hands-on problem

solving test. She did use a multiple attempt approach in the easy section of the post­

hands-on problem solving test, decreasing her time by one minute and fifteen

seconds. Her IOWA score improved one point, which was a typical result. The

student's results on the POPS test increased in the Correctness of Answer and

Accuracy sections, while decreasing in the Extracting Information, improving her

total POPS score by one point. Below is documentation of her scores throughout the

project.

Table 4.6a

Student Identification and IOWA Results

Student   Group   Gender   Problem Solving Rating by Teachers   IOWA Pre   IOWA Post   IOWA Difference
5         C       F        High                                 21         22          1

Table 4.6b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

3 2.5 1.25 Solved Correctly Solved Correctly

Table 4.6c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

2 2 5 5 Not Solved Not Solved


Table 4.6d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

5 6 6 6 5 8 6 6 5 2 27 28

Student 6 Experimental

Subject #6 was a female student in the experimental group, who was rated by

her teacher as a high-ability problem solver. The student had good attendance

throughout the school year. The student was matched up with student number 1 in

the control group. The student was shy, quiet and reserved during all periods of

assessment. During the training sessions, the student would not volunteer at first.

The instructor would ask student #6 for an answer, and usually the student would

know the answer. The student was meek, but willing to take part in the hands-on

aspect of the training, being easily pushed aside by other students. The student was

one of the most improved students in the pre- and post- hands-on problem solving

test. On both tests, she was able to improve her time and solved the difficult section

during the post-test. She improved two points on the IOWA test, which was a typical

result of the subjects. The student's results on the POPS test increased in the

Correctness of Answer, Methods Used and Accuracy sections, while decreasing in


Quality of Explanation section. The student improved her POPS total score by six

points. Below is documentation of her scores throughout the project.

Table 4.7a

Student Identification and IOWA Results

Student   Group   Gender   Problem Solving Rating by Teachers   IOWA Pre   IOWA Post   IOWA Difference
6         E       F        High                                 20         22          2

Table 4.7b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

2 3 1.75 1.25 Solved Correctly Solved Correctly

Table 4.7c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

3 2 5 4.5 Not Solved Solved Correctly


Table 4.7d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

7 11 6 8 7 8 8 8 5 4 33 39

Anecdotal Data:

When asked: Do you think that learning how to troubleshoot a computer helped you

with your problem solving skills? Why or why not?

Her response:

"I think it helped by learning strategies like in math."

Male

Student 7 Control

Subject #7 was a male student in the control group, who was rated by his

teacher as a low-ability problem solver. The student had good attendance throughout

the school year. The student was matched up with student number 12 in the

experimental group. The student was quiet, reserved and rushed through all periods

of assessment. The student's results were similar in the pre- and post- hands-on

problem solving test. He felt pressured by the other student completing before him

and gave up on the difficult section of the test. He had more attempts during the pre­

tests of both parts than in the post-tests. He notified the instructor that he had

finished even though he had not completed the difficult problem. He improved on the

IOWA test by three points, which was typical of the subjects. The student's results on

the POPS test decreased in the Correctness of Answer, Methods Used, Accuracy and

Quality of Explanation sections, decreasing his total POPS score by six points. The

researcher believes the reason for the decrease is due to rushed efforts. Below is

documentation of his scores throughout the project.

Table 4.8a

Student Identification and IOWA Results

Student Group Gender Problem Solving Rating by IOWA Pre IOWA Post IOWA Difference

Teachers

7 C M Low 14 17 3

Table 4.8b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

4 2 1.5 1.25 Solved Correctly Solved Correctly

Table 4.8c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

6 2 4.75 1.75 Solved Incorrectly Solved Incorrectly


Table 4.8d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

2 3 4 3 5 2 5 2 17 11

Student 8 Experimental

Subject #8 was a male student in the experimental group, who was rated by

his teacher as a high-ability problem solver. The student had good attendance

throughout the school year. The student was matched up with student number 9 in

the control group. The student was interested, quiet and well-behaved during

all periods of training and assessment. The student asked intelligent questions and

volunteered multiple answers during group discussions. The student improved in

both sections of the post- hands-on problem solving test. He was able to solve the

problem in the difficult section of the post-test, which he was not able to successfully

complete in the pre-test. His completion time on both the easy section and the

difficult section also improved in the post-test. His score on the IOWA test did not

change, as was also true for two other subjects. The student's results on the POPS test

increased in the Correctness of Answer, Methods Used, Accuracy and Extracting


Information sections, improving his total POPS score by nine points. Below is

documentation of his scores throughout the project.

Table 4.9a

Student Identification and IOWA Results

Student Group Gender Problem Solving Rating by IOWA Pre IOWA Post IOWA Difference

Teachers

8 E M High 23 23 0

Table 4.9b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

4 4.75 4 Solved Correctly Solved Correctly

Table 4.9c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

5 4.75 Not Solved Solved Correctly


Table 4.9d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

3 6 5 8 6 8 5 6 4 4 23 32

Anecdotal Data:

When asked: Do you think that learning how to troubleshoot a computer helped you

with your problem solving skills? Why or why not?

His response:

"Yes I think troubleshooting will make a difference because finding out what's wrong

with a computer is a lot like finding out what the solution is in a question. I had to

figure out what was wrong with the computer. And it was a lot like trying to figure

out the problem."

Student 9 Control

Subject #9 was a male student in the control group, who was rated by his

teacher as a high-ability problem solver. The student had good attendance throughout

the school year. The student was matched up with student number 8 in the

experimental group. The student was energetic, rambunctious and rushed during all

periods of assessment. The student had a good attitude and sought out attention in
many different forms. The student solved the easy section faster in post- hands-on

problem solving test than in the pre-test. The student solved the difficult section

problem correctly in the pre-test, but was unable to solve the problem in the post-test.

He received the same score on the IOWA pre-test as he did on the IOWA post-test.

The student's results on the POPS test decreased in every section, decreasing the total

POPS score by 8 points. Below is documentation of his scores throughout the

project.

Table 4.10a

Student Identification and IOWA Results

Student   Group   Gender   Problem Solving Rating by Teachers   IOWA Pre   IOWA Post   IOWA Difference
9         C       M        High                                 20         20          0

Table 4.10b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

2 .5 Solved Correctly Solved Correctly

Table 4.10c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

5 4 4.75 5 Solved Correctly Not Solved


Table 4.10d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

9         8          8        7         8       6        6        5         4        1         35          27

Student 10 Experimental

Subject #10 was a male student in the experimental group, who was rated by

his teacher as a high-ability problem solver. The student had good attendance

throughout the school year. The student was matched up with student number 11 in

the control group. The student was unmotivated, quiet and reserved during all

periods of assessment and training. The student stayed consistent in many aspects of

both sections of the pre- and post- hands-on problem solving test. He maintained

similar attempts, times and solved the easy section correctly in the pre and post-tests.

However, he solved the difficult section during the pre-test, but was unable to

complete the problem during the post-test. Noise and distractions were present during

his hands-on problem solving post-test. He improved his IOWA score by 3 points,

which is a typical improvement. The student's results on the POPS test increased in

the Methods Used and the Quality of Explanation sections, while decreasing in the
Correctness of Answer sections. The student improved his total POPS score by

one point. Below is documentation of his scores throughout the project.

Table 4.11a

Student Identification and IOWA Results

Student Group Gender Problem Solving Rating by IOWA Pre IOWA Post IOWA Difference

Teachers

10 E M High 16 19 3

Table 4.11b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

4 4 4.25 4 Solved Correctly Solved Correctly

Table 4.11c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

5 5 5 Solved Correctly Not Solved


Table 4.11d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

5 4 4 5 4 4 7 7 2 3 22 23

Student 11 Control

Subject #11 was a male student in the control group, who was rated by his

teacher as a high-ability problem solver. The student had good attendance throughout

the school year. The student was matched up with student number 10 in the

experimental group. The student was outgoing and well-behaved during all periods

of assessment. He was the first to finish every test. The student took less time to

complete the easy section of the hands-on problem solving test. He was unable to

solve the difficult problem in the pre- and post- hands-on problem solving test. He

improved his IOWA score by one point, which is typical of all the subjects. The

student's results on the POPS test increased in the Correctness of Answer and

Extracting Information sections, improving his total POPS score by five points.

Below is documentation of his scores throughout the project.


Table 4.12a

Student Identification and IOWA Results

Student   Group   Gender   Problem Solving Rating by Teachers   IOWA Pre   IOWA Post   IOWA Difference
11        C       M        High                                 22         23          1

Table 4.12b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

2 1.25 .5 Solved Correctly Solved Correctly

Table 4.12c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

5 3 5 5 Not Solved Not Solved

Table 4.12d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

7 10 7 7 8 8 6 8 3 3 31 36
Student 12 Experimental

Subject #12 was a male student in the experimental group, who was rated by

his teacher as a low-ability problem solver. The student had good attendance

throughout the school year. The student was matched up with student number 7 in

the control group. The student was quiet and reserved during all periods of

assessment and training. The student was motivated to begin the training and was

excited by the hands-on aspect of the training. The student was classified as Learning

Disabled. The student improved his time in the easy section of the hands-on problem

solving test. He was also able to solve the difficult problem correctly during the

post-test, as opposed to the pre-test. He improved his score on the IOWA test by

three points, which is typical of the students within this study. The student's results

on the POPS test increased in the Methods Used and Accuracy sections, but

decreased in the Correctness of Answer and Extracting Information sections. His

overall POPS total score decreased by one point. Below is documentation of his

scores throughout the project.

Table 4.13a

Student Identification and IOWA Results

Student Group Gender Problem Solving Rating by IOWA Pre IOWA Post IOWA Difference

Teachers

12 E M Low 16 19 3

Table 4.13b

Easy Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution
.5 Solved Correctly Solved Correctly

Table 4.13c

Difficult Hands-On Problem Solving Results

Pre: Attempts Post: Attempts Pre: Time Post: Time Pre: Solution Post: Solution

2 2 5 5 Not Solved Solved Correctly

Table 4.13d

POPS- Profiles of Problem Solving

COA: Correctness of Answer

MU: Methods Used

A: Accuracy

EI: Extracting Information

QE: Quality of Explanation

COA: Pre  COA: Post  MU: Pre  MU: Post  A: Pre  A: Post  EI: Pre  EI: Post  QE: Pre  QE: Post  Total: Pre  Total: Post

5         2          3        5         3       4        6        5         1        1         18          17

Description of Findings Pertinent to Each Hypothesis

Hypothesis 1

Elementary students can develop knowledge through computer troubleshooting in

order to solve common computer problems.


Troubleshooting Activity

The final troubleshooting activity was set up in seven stations. At each

station, there was a separate computer problem the students were required to solve.

The students were separated into teams of two and were given a sheet for each

station. The student would then assess the problem, fix the problem and describe the

solution on the worksheet. The students solved all problems they encountered (See

Table 4.14).

Table 4.14

Computer Troubleshooting Activity Results

Teams: Team 1 = Students #4 (F, Med) and #8 (M, High); Team 2 = Students #2 (F, High) and #12 (M, Low); Team 3 = Students #6 (F, High) and #10 (M, High).

Station #1, "Printer on the Mac"
  Main Source: Printer
  Specifics: No Paper, Power Off, Power Cable to Printer, Cable from Computer to Printer, Power Cable to Power Strip
  Team 1: Solved All; Team 2: Solved All; Team 3: Solved All

Station #2, "MAC Number Two"
  Main Source: Program Installation
  Specifics: Mouse/Keyboard Unplugged from Computer, Computer Program Installation
  Team 1: Solved All; Team 2: Solved All; Team 3: Unable to stop at station due to time constraints

Station #3, "Open Box with a Black Screen"
  Main Source: Monitor
  Specifics: Power Cable for Monitor, Power On, Power Cable for Tower
  Team 1: Solved All; Team 2: Solved All; Team 3: Solved All

Station #4, "What's Wrong with this Box?"
  Main Source: Minor pieces removed/unplugged
  Specifics: Mouse Trackball removed, RAM removed, Monitor unplugged from Tower, Sound unplugged from Card
  Team 1: Solved All; Team 2: Solved All; Team 3: Solved All

Station #5, "Example Box"
  This was an example box; no problem was presented here.
  Team 1: Visited; Team 2: Did not visit; Team 3: Did not visit

Station #6, "Laptop Trauma"
  Main Source: Missing File
  Specifics: Change the desktop picture
  Team 1: Solved All; Team 2: Solved All; Team 3: Solved All

Station #7, "Trouble with Laptop Printing"
  Main Source: Printer Driver/Installation
  Specifics: Laptop was missing printer software/driver
  All teams: Solved All as one group, due to time constraints

Station #1

At station #1, titled "Printer on the Mac", the printer was the main source of

the problem. The specific problems with the printer were: the printer was lacking

paper, the power was turned off, the power cable was not connected to the printer, the

cable from computer to the printer was not connected and the power cord was not

plugged into the power strip. Teams wrote the following responses:
Team #1: "Plug in printer and put in paper."

Team #2: "First it wasn't plugged in and the USB wasn't plugged in. There was no

paper. The printer was not plugged in to the [monitor]."

Team #3: "What's wrong with your printer is the plug wasn't in and there was no

paper. So you need to put some in."

All three teams solved all the problems successfully without guidance from

the instructor. All teams printed a document after correcting the problems, and

presented it to the instructor.

Station #2

At station #2, titled "MAC Number Two", the students were required to install

a program. However, the mouse and keyboard were both unplugged. The students

needed to first correct this problem, and then move on to install the program. Each

group received a different program to install because the removal of each program

would have taken too much time. Teams wrote the following responses:

Team #1: "Plug in any plugs, put in CD and pushed yes, You put the CD in and

clicked on yes to install it."

Team #2: "One problem we had was the mouse was not plugged in. We first went to

installer then we pushed continue and it installed them. We restarted the computer."

Team #3: Did not complete due to time constraints.


Only two teams completed this station due to time constraints. Both teams

successfully installed a program after connecting the peripheral devices. The

instructor checked each installation.

Station #3

At station #3, titled "Open Box with a Black Screen", the students were

required to fix minor problems with the monitor. The monitor power cable was

unplugged, the power button on the monitor was turned off, and the monitor cable

was not plugged into the system tower. Teams wrote the following responses:

Team #1: "Monitor won't turn on because it had no power and it was not plugged in

to the power tower."

Team #2: "The monitor is not working. It is not working because the [monitor] is

not plugged into the power tower."

Team #3: "[The problem is the] monitor won't turn on because the power cable isn't

plugged into the monitor."

All three teams solved all the problems correctly without guidance or

clarification from the instructor. All teams successfully turned on the monitor after

correcting the problems, and presented the lighted screen to the instructor.

Station #4

At station #4, titled "What's Wrong with this Box?", the students were

required to look at a powerless system tower with the cover taken off. They were

asked to look over the entire system to find which components were missing or
unattached. The mouse trackball was removed, the mouse was unplugged from the

system tower, one of the RAM memory pieces was removed, the monitor was

unplugged from system tower, the power button was removed and the sound cord was

unplugged from the sound card. Teams wrote the following responses:

Team #1: "The sound cable is not plugged in. The ram is missing. The trackball for

the mouse isn't in. Plug in [the] mouse, keyboard and monitor. The power cable is

not plugged in."

Team #2: "The ram is missing (1). The wire is not plugged in. The p5 ( the internal

power cord) wire is not plugged in. The [monitor] is not plugged into the power

tower. The mouse is not plugged in. The trackball is missing. The power button is

not there. The power cord."

Team #3: "Sound cable. More ram. Trackball. Plug in mouse and keyboard and

monitor and power cable plug in."

Students were also asked how the system would be affected if the computer

were turned on. Team #2 did not respond. Other teams wrote the following

responses:

Team # 1 : "You could not do anything or hear anything."

Team #3: "You could [not do] anything or [hear] anything."

All three teams solved all the problems without guidance or clarification from

the instructor. All teams successfully reassembled the computer by asking the

instructor for each part missing. The students physically installed the missing RAM
and plugged the sound cord into the sound card, as well as attached all missing

peripheral devices.

Station #5

At station #5, titled "EXAMPLE BOX", students were just provided with an

example computer set-up, in case they wished to use it for an example. Team #1 was

the only team to visit this station to look over the example.

Station #6

At station #6, titled "Laptop Trauma", students were required to perform two

tasks. First, students needed to find a missing file titled "Lost Dog", by using the

search function within the Windows Operating System. The file was taken off the

Recent Documents menu to ensure the students were using the search function.

Teams wrote the following responses:

Team #1: "It is under Microsoft Word."

Team #2: "Yes we found it. We went to search and pushed files and folders then we

typed in lost dog."

Team #3: "It was in Microsoft Word."

The second part of station #6 required students to change the background

picture on the desktop. Students needed to use the properties menu by right clicking

on the desktop and selecting properties. Within the Display Properties, students

selected a different background. The instructor verified successful completion by


noticing the changed background when notified of their conclusion. Teams wrote

the following responses:

Team # 1: "You right click anywhere then you click properties. Then you go to

desktop and change the background."

Team #2: "Yes. We right clicked then we went to properties. Then we clicked on

desktop and changed it."

Team #3: "Yes. I went under desktop and found it."

All three teams found the missing file and changed the desktop background

without guidance from the instructor. All teams used the search function within the

Windows Operating System to find the missing file. All teams changed the

background by using the Display Properties menu.
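As an aside for readers who want the Station #6 tasks in concrete form, the sketch below performs the same two fixes programmatically rather than through the Windows GUI. It is illustrative only and was not part of the study; the search root and file name fragment are assumptions, while the wallpaper call uses the documented Win32 SystemParametersInfoW function.

    import ctypes
    import os

    def find_file(root, fragment):
        # Walk the directory tree: the programmatic equivalent of the
        # Start-menu Search the students used to locate "Lost Dog".
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                if fragment.lower() in name.lower():
                    return os.path.join(dirpath, name)
        return None

    SPI_SETDESKWALLPAPER = 20  # documented Win32 constant

    def set_wallpaper(image_path):
        # Equivalent of right-clicking the desktop, choosing Properties,
        # and selecting a different background.
        ctypes.windll.user32.SystemParametersInfoW(
            SPI_SETDESKWALLPAPER, 0, image_path, 3)  # 3 = update ini + notify

    if __name__ == "__main__":
        print(find_file(r"C:\Users", "lost dog"))  # hypothetical search root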

Station #7

At station #7, titled "Trouble with Laptop Printing", students were asked if

they could print from a laptop by simply plugging the printer into the laptop. There

was no printer software installed on the laptop, so it would not have been

able to print from the printer. Teams wrote the following responses:

Team #1: "No, because you have to install it."

Team #2: "No because after you hooked it up to the laptop you need to install it."

Team #3: Students visited this station and successfully solved the problem, but did

not respond.
All three teams solved all the problems correctly without guidance or

clarification from the instructor. The students all solved this problem together due to

time constraints. Two teams answered the question on their worksheet.

Out of all the problems students attempted during the final troubleshooting

activity, students solved every problem successfully without aid from the instructor.

Throughout the troubleshooting activity, students followed a conventional problem

solving process. The teams identified the common computer problems, and wrote the

problems out on their team worksheet. They then proceeded to devise a plan or

strategy to fix the problem. Teams correctly fixed the problems by

reattaching devices or installing components. They were able to look back and verify

their answers by accomplishing the task and receiving feedback from the fixed

machine. The computer provided automatic feedback as to whether the

problem was solved. Students solved all common computer problems presented by

the troubleshooting activity.

Group Interview

During the group interview, students solved a computer problem for a teacher

within one day of the troubleshooting activity. The students were asked if they

believed they could solve common computer problems. One student in the

experimental group commented "we [solved a common computer problem] in like art.

The printer wasn't working; it was the same problem here. So we pressed the

[power] button and [the printer] worked." The students in the experimental group

told the control group about the different computer troubleshooting stations they were

able to fix (See Table 4.15).


Table 4.15

Excerpt From Group Interview - Transcribed Conversation

RESEARCHER Why don't you guys try to tell them a little bit about what we
did?
#4 (experimental) The first couple days we were just studying like what parts of
the computers were the computers. We took apart the
computers and um ... and put them back together. Then like the
last day, she took apart a computer and we had to put it back
together with all the parts.
#6 (experimental) We had to go to like stations and we had to figure out what it
was and fix it.
RESEARCHER You had to fix it. You had to figure out what it was first and
then you had to fix it. So you had to identify the problem.
#2 (experimental) We had to install and uninstall a program.
RESEARCHER Did the people who went through the computer troubleshooting,
did you guys have fun doing that?
All Yeah.
RESEARCHER Do you think that if your teacher had a problem with the
computer that you could fix it?
All Yeah.
RESEARCHER So now you can help your teacher out in lab?
#4 (experimental) We did it in like art. The printer wasn't working; it was the
same problem here. So we pressed the button and it worked.
RESEARCHER Alright there you go. That's fantastic guys! Thanks a lot. I
really appreciate it.

Based on responses, the students felt comfortable in solving common

computer problems. All students in the experimental group volunteered answers

during group discussions, and were anxious to solve the problems, getting their hands
on the hardware and the computers. Students encountered all problems with

enthusiasm, questions and logical progression of problem solving.

Hypothesis 2

Elementary students who participate in the computer troubleshooting curriculum will

improve problem solving methods compared to elementary students in the control

group.

POPS -Profiles of Problem Solving Test

Within the POPS test, there were five categories to assess the different

elements of problem solving. According to the POPS teacher's manual, the method

used category contained activities related to problem solving strategies such as:

working systematically, listing possibilities, finding and using patterns and

generalizing. The control group table (Table 4.16) shows the difference between pre

and post scores (See Figure 4.1). The experimental group table (Table 4.17) shows

the improvement between pre and post scores (See Figure 4.2).

Table 4.16

Control Group Methods Used Section

Student   Methods Used Pre-Test Score   Methods Used Post-Test Score   Score Difference
#1        4                             4                              0
#3        7                             8                              1
#5        6                             6                              0
#7        4                             3                              -1
#9        8                             7                              -1
#11       7                             7                              0
AVERAGE   6                             5.83                           -0.17
Figure 4.1

Control Group Methods Used Section Graphed

[Bar chart: Methods Used pre-test and post-test scores for each control group student (#1, #3, #5, #7, #9, #11).]

Table 4.17

Experimental Group Methods Used Section

Student   Methods Used Pre-Test Score   Methods Used Post-Test Score   Score Difference
#2        8                             12                             4
#4        4                             7                              3
#6        6                             8                              2
#8        5                             8                              3
#10       4                             5                              1
#12       3                             5                              2
AVERAGE   5                             7.5                            2.5
Figure 4.2

Experimental Group Methods Used Section Graphed

[Bar chart: Methods Used pre-test and post-test scores for each experimental group student (#2, #4, #6, #8, #10, #12).]

Only one student, student #3, in the control group improved in the methods

used category. All other control group students either remained constant or decreased

their score. The experimental students all improved their score in the methods used

category by one or more points. On average, students in the experimental group

improved 2.5 points in the methods used category between the pre- and post-tests.

Students in the control group, on average, scored 0.17 points lower on the post-test

than on the pre-test.
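The group averages reported above can be recomputed directly from the score differences in Tables 4.16 and 4.17. A minimal Python sketch, added here for illustration only:

    control_diffs = [0, 1, 0, -1, -1, 0]       # students #1, #3, #5, #7, #9, #11
    experimental_diffs = [4, 3, 2, 3, 1, 2]    # students #2, #4, #6, #8, #10, #12

    def mean(values):
        return sum(values) / len(values)

    print(round(mean(control_diffs), 2))       # -0.17
    print(round(mean(experimental_diffs), 2))  # 2.5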

Hands-On Problem Solving Test

To evaluate the methods students used, the hands-on problem solving activity

was videotaped and assessed at a later time. The number of attempts for each

problem during the pre-tests and the post-tests were analyzed, as well as the time

needed to successfully complete the problem. Documentation of how the students

attempted to solve the problem was also recorded from the videotape.
Figure 4.3

Comparing Differences of Groups with Number of Attempts in the Hands-On Pre­

Test versus Post-Test

[Line graphs: average number of attempts on the easy problem and on the difficult problem, hands-on pre-test versus post-test, control group average versus experimental group average.]

The control group on average (See Figure 4.3) showed a 49% decrease in

number of attempts from the easy problem pre-test to the easy problem post-test. The

control group on average also decreased in number of attempts by 55% from the

difficult problem pre-test to the difficult problem post-test. The experimental group
on average increased the number of attempts by 13% on the easy problem pre-test

to the easy problem post-test. The experimental group on average also increased in

number of attempts on the difficult problem from the pre-test to the post-test by 23%.

Therefore, the experimental group, on average, increased the number of attempts,

while the control group, on average, decreased in the number of attempts.
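The thesis does not state its percent-change formula; the figures above are consistent with the standard one, (post - pre) / pre. A minimal sketch with placeholder averages, since the raw attempt counts are not reported:

    def percent_change(pre_avg, post_avg):
        # Negative values mean fewer attempts on the post-test.
        return (post_avg - pre_avg) / pre_avg * 100

    # Hypothetical example: a drop from 3.0 to 1.5 average attempts is -50%.
    print(percent_change(3.0, 1.5))  # -50.0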

All students solved the easy problem in less than the specified five-minute

time period; therefore time comparisons can be made between groups (See Table

4.18).

Figure 4.4

Time Results from the Easy Problem in the Hands-on Pre- and Post-Test

[Line graph: average time (minutes) to complete the easy problem on the pre-test and post-test, control group versus experimental group.]

Table 4.18

Time Results from the Easy Problem in the Hands-on Pre- and Post-Test

Group Pre-Test Average Time Post-Test Average Time Difference in Time


Between Pre-Test and
Post-Test
Control 1.38 1.0 0.38
Experimental 2.54 2.21 0.33
The difference between the control group's average improvement in time

and the experimental group's average improvement in time was 0.042 minutes,

amounting to 2.8 seconds. Due to the small numbers, the difference is not significant.
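For illustration, the difference-in-improvement can be reworked from the rounded group means in Table 4.18; with those rounded values the gap comes out near 3 seconds, so the reported 0.042 minutes (2.8 seconds) presumably reflects unrounded times.

    control_gain = 1.38 - 1.0        # minutes faster on the post-test
    experimental_gain = 2.54 - 2.21
    gap_minutes = control_gain - experimental_gain
    print(round(gap_minutes, 2))     # 0.05 minutes
    print(round(gap_minutes * 60))   # about 3 seconds with rounded means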

The difficult problem was much more complicated and many students were

unable to solve the problem. The following tables show whether each student solved

the difficult problem during the pre-test, the post-test, both, or neither (See

Tables 4.19 and 4.20).

Table 4.19

Control Students' Pre-Test and Post-Test Ability to Solve the Difficult Problem in the
Hands-On Problem Solving Test

Student Group Pre-Test Solution Post-Test Solution


#1 Control Not Solved Solved Correctly
#3 Control Not Solved Solved Correctly
#5 Control Not Solved Not Solved
#7 Control Solved Incorrectly Solved Incorrectly
#9 Control Solved Correctly Not Solved
#11 Control Not Solved Not Solved
Total Control Control 4 Not Solved 3 Not Solved
1 Solved Incorrectly 1 Solved Incorrectly
1 Solved Correctly 2 Solved Correctly

Table 4.20

Experimental Students' Pre-Test and Post-Test Ability to Solve the Difficult Problem
in the Hands-On Problem Solving Test

Student Group Pre-Test Solution Post-Test Solution


#2 Experimental Not Solved Not Solved
#4 Experimental Solved Correctly Solved Correctly
#6 Experimental Not Solved Solved Correctly
#8 Experimental Not Solved Solved Correctly
#10 Experimental Solved Correctly Not Solved
#12 Experimental Not Solved Solved Correctly
Total Experimental Experimental 4 Not Solved 2 Not Solved
2 Solved Correctly 4 Solved Correctly

Figure 4.5

Comparing the Number of Students in Each Group Who Did Not Solve, or Solved
Incorrectly, the Difficult Problem in the Hands-On Problem Solving Test

[Bar chart: number of students in each group who did not solve, or solved incorrectly, the difficult problem on the pre-test and post-test.]

Figure 4.6

Comparing the Number of Students in Each Group Able to Correctly Solve the
Difficult Problem in the Hands-On Problem Solving Test

[Bar chart: number of students in each group who solved the difficult problem correctly on the pre-test and post-test.]

Four students in the experimental group solved the difficult problem correctly

during the post-test, whereas only two students in the control group solved the
difficult problem during the post-test. The results are not significant because two

students who solved the problem correctly during the pre-test were unable to solve

the problem during the post-test. Each group contained one student who solved the

problem correctly during the pre-test, but not during the post-test. The control group

also contained one student categorized as solving the problem incorrectly.

Hypothesis 3

The most difficult procedure in problem solving for elementary students is to

understand what the question is looking for.

POPS - Profiles of Problem Solving Test

The most difficult part of problem solving was analyzed through a review of

literature and articles. Data was also collected through the Profiles of Problem

Solving test and the surveys/interviews. The standardized problem solving test,

POPS, divided the evaluation into five separate categories: Correctness of Answer,

Methods Used, Accuracy, Extracting Information and Quality of Explanation. Each

category was present in multiple questions and graded separately for each student.

Every category was divided into three possible levels of achievement: beginning,

developing and advanced. Each student was graded on the pre-test and post-test,

using the three possible levels (See Table 4.21).


Table 4.21

Each Student's Pre-Test Score on the POPS test Graded on Beginning, Developing or
Advanced Levels of Achievement

B=Beginning, D=Developing, A=Advanced

Student Correctness Method Used Accuracy Extracting Quality of


of Answer Information Explanation
#1 B D B B D
#2 D D A D A
#3 D D D D A
#4 D D D B D
#5 D D A D D
#6 D D A A A
#7 B D B B D
#8 B D A D D
#9 D D D D D
#10 D D D D D
#11 D D A D D
#12 D B B D B
Total: B: 3 B:1 B:3 B: 3 B:1
D:9 D:11 D:4 D:8 D:8
A:0 A:0 A:5 A: 1 A: 3
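The level totals at the bottom of Table 4.21 can be checked mechanically. The sketch below, illustrative only, tallies the Correctness of Answer column with Python's collections.Counter:

    from collections import Counter

    # Correctness of Answer levels for students #1-#12, read from Table 4.21.
    correctness = ["B", "D", "D", "D", "D", "D", "B", "B", "D", "D", "D", "D"]
    print(Counter(correctness))  # Counter({'D': 9, 'B': 3}) matches B:3, D:9, A:0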

The most difficult categories for the students were, in order of difficulty:

correctness of answer, extracting information and methods used. In these categories,

there were more students who were in the beginning or developing stages.

Survey/Group Interview

During the survey/group interview, students were asked, "What is the hardest

part about solving a problem?" Written responses and verbally expressed opinions
were both recorded and organized (Table 4.22) showing the different difficulties

articulated by students. Students produced answers such as: identifying important

information, understanding what the question is looking for, what to do with the

information, looking back, not enough information and the type of strategy or plan to

use.

Table 4.22

Student Responses to the Most Difficult Problem Solving Process

Answer                               Frequency of students' answers
Identifying important information    3
Understanding the question           8
Looking back                         2
Lack of information                  2
Method or strategy                   3

The researcher organized the students' responses into separate categories to

simplify the presentation of the data. Behind each title were key responses or key

words used to define the category. Behind identifying important information, key

phrases such as, "I don't know what to do with the given information", "what

info[rmation] is needed to solve the problem", and "finding all the information"

defined the category. Behind understanding the question, key phrases such as, "don't

understand it", "don't know what the question is asking you to do", and "what the

problem is looking for" defined the category. Behind looking back, key phrases such

as, "knowing if your done" and "finding out the answer" defined the category.

Behind lack of information, key phrases such as "not enough information" defined

the category. Behind method or strategy, key phrases such as, "making a plan", "how

to" and "what type of strategy defined the category.


Based on the surveys and group interviews, most students found

understanding the question the most difficult part of problem solving.

Hypothesis 4

Learning to troubleshoot computer problems will increase mathematical problem

solving ability.

IOWA Test

Students in the experimental group and the control group both were pre-tested

and post-tested using the IOWA Basic Skills math problem solving and data

interpretation 26-item section of the IOWA test. The students were given the

identical test for the pre-assessment and post-assessment.

Figure 4.7

IOWA Scores Compared Between Mean Group Scores on the Pre-Test versus the
Post-Test

[Bar graph: mean IOWA scores on the pre-test and post-test. Average Control: 19.83 pre, 22.17 post; Average Experimental: 20.3 pre, 22 post.]

The mean scores of both groups produced similar results. While students in

the control group improved 2.3 points, the experimental group improved 1.7 points

from the pre-test to the post-test. The results were comparable and therefore

showed no evidence of improvement in math problem solving skills resulting from

the computer troubleshooting training sessions.
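These group means can be reproduced from the individual IOWA differences reported in the student tables (Tables 4.2a through 4.13a); a quick illustrative check in Python:

    control_gains = [7, 2, 1, 3, 0, 1]       # students #1, #3, #5, #7, #9, #11
    experimental_gains = [2, 0, 2, 0, 3, 3]  # students #2, #4, #6, #8, #10, #12

    print(round(sum(control_gains) / 6, 1))       # 2.3
    print(round(sum(experimental_gains) / 6, 1))  # 1.7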

Group Interview

During the group interview, students made interesting comparisons between

computer troubleshooting and problem solving, some related directly to mathematics.

Students made references to the requirement to "figure out what the problem is" and

"think[ing] of the solution of the problem." In mathematics, story problems require

students to figure out what the problem is and brainstorm what the solution could be

(Paris, 2000). Another student stated that, "finding out [what's] wrong with a

computer is a lot like finding out what the solution is in a question." Another student

wrote, "I think [computer troubleshooting] helped [my problem solving skills] by

learning strategies like in math," directly showing the similarity between computer

troubleshooting and math problem solving for one student.

Hypothesis 5

There will be no effect on problem solving ability when gender is taken into

consideration.

The existence of the gender gap in computer usage has been shown through

multiple studies; females lack positive educational experiences with computers

(Burge, 2001). Students were separated by gender in multiple ways, and data was

collected, separated and analyzed by gender. The POPS pre-tests and post-tests total
scores were compared, as well as all category pre-tests and post-tests. The hands-

on tests were also analyzed based on gender.

POPS - Profiles of Problem Solving Test

Data analyzed from the POPS total score pre-test and post-test showed

females improving from the pre-test with an average score of 26.5 points to the post­

test with an average score of 32.5 points. The males did not improve and retained a

constant average score of 24.3 points through the pre- and post-test.

Table 4.23

Mean of POPS Pre-Test vs. Post-Test Total Score Comparing Females vs. Males

Gender Pre-Test: Total Score Post-Test: Total Score


F 26.5 32.5
M 24.3 24.3
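Table 4.23 can likewise be recomputed from each student's total POPS scores in Tables 4.2d through 4.13d; the illustrative sketch below reproduces all four means.

    female_pre  = [14, 35, 29, 21, 27, 33]  # students #1-#6
    female_post = [18, 45, 35, 30, 28, 39]
    male_pre    = [17, 23, 35, 22, 31, 18]  # students #7-#12
    male_post   = [11, 32, 27, 23, 36, 17]

    mean = lambda xs: round(sum(xs) / len(xs), 1)
    print(mean(female_pre), mean(female_post))  # 26.5 32.5
    print(mean(male_pre), mean(male_post))      # 24.3 24.3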

Data analyzed from the POPS Methods Used section and the Extracting

Information section showed an average female improvement from the pre-test to the

post-test. On average, males also improved; however, the results were not as

significant. The females increased their average score by 22% on the methods used

section, and the males increased their average methods used score by 11%. The

females increased their average score on the extracting information section by 20 %,

and the males increased their average extracting information score by 3%.
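The thesis does not show how these percentages were computed. They are consistent with expressing the gain relative to the post-test mean, an inferred reconstruction rather than a stated formula, as the sketch below shows using the means from Table 4.24:

    def pct_gain(pre, post):
        # Assumed reconstruction: gain divided by the post-test mean.
        return round((post - pre) / post * 100)

    print(pct_gain(5.83, 7.50))  # 22 -> female, Methods Used
    print(pct_gain(5.17, 5.83))  # 11 -> male, Methods Used
    print(pct_gain(5.33, 6.67))  # 20 -> female, Extracting Information
    print(pct_gain(5.17, 5.33))  # 3  -> male, Extracting Information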
Figure 4.8

Mean of POPS Pre-Test vs. Post-Test Total Score Comparing Females vs. Males

[Line graph: mean total POPS score on the pre-test and post-test, male average versus female average.]

Table 4.24

Comparison of the Methods Used and Extracting Information Categories of the POPS
Test Between Mean Gender Score

            Methods Used                        Extracting Information
Gender      Pre-Test  Post-Test  Improvement   Pre-Test  Post-Test  Improvement
Female      5.83      7.50       1.67          5.33      6.67       1.34
Male        5.17      5.83       0.66          5.17      5.33       0.16

Gender differences were also exhibited within groups. The female control

group improved 14% in their total POPS score, while the male control group

decreased their total POPS score by 11%. The female experimental group improved

their total POPS score by 22%, while the male experimental group improved their

total POPS score by 12%. The difference is represented in Figure 4.10.


Figure 4.9

Comparison of the Methods Used and Extracting Information Categories of the POPS
Test Between Mean Gender Score

[Line graphs: mean Methods Used and Extracting Information scores on the POPS pre-test and post-test, male average versus female average.]

Table 4.25

Gender Comparisons Divided by Group of Total POPS Score on the Pre- and Post­
Tests

Gender   Group          Total POPS Score Pre-Test   Total POPS Score Post-Test   Difference Btw. Pre- and Post-Test
Female   Control        23.3                        27.0                         3.7
Female   Experimental   29.7                        38.0                         8.3
Male     Control        27.7                        24.7                         -3.0
Male     Experimental   21.0                        24.0                         3.0
Figure 4.10

Percentage Improvement Between the Pre-Assessment and Post-Assessment of the


Total POPS Score Separated by Gender and Group

[Bar chart: percentage improvement in total POPS score between the pre-test and post-test. Female Control: 0.14; Female Experimental: 0.22; Average Female: 0.18; Male Control: -0.11; Male Experimental: 0.12; Average Male: 0.005.]

On average, the females increased their total POPS scores by 18% between

the pre-test and the post-test, whereas the males were only able to increase their total

POPS score by 0.5% on average. When gender was analyzed by groups, females in

the experimental group improved on a larger scale than the females in the control

group. Likewise, when the males were separated by groups, the males in the

experimental group improved, whereas the males in the control group decreased in their

total POPS score.

The average score for the females was higher than the males in every category

within the POPS test. Females outscored the males in number of points improved in

all categories as well.


Figure 4.11

Difference Between POPS Pre-Assessment and Post-Assessment Score on the


Correctness of Answer Category Separated by Gender

Difference Between Gender on the Correctness of Answer Section of POPS Pre-Test versus Post-Test

[Line graph: Males Average and Females Average Correctness of Answer scores at POPS: COA Pre and POPS: COA Post.]

Figure 4.12
Difference Between POPS Pre-Assessment and Post-Assessment Score on the
Accuracy Category Separated by Gender

Difference Between Gender on the Accuracy Section of POPS Pre-Test versus Post-Test

[Line graph: Males Average and Females Average Accuracy scores at POPS: A: Pre and POPS: A: Post.]
Figure 4.13

Difference Between POPS Pre-Assessment and Post-Assessment Score on the


Quality of Explanation Category Separated by Gender

Difference Between Gender on the Quality of Explanation Section of POPS Pre-Test versus Post-Test

[Line graph: Males Average and Females Average Quality of Explanation scores (scale 2 to 4.5) at POPS: QE: Pre and POPS: QE: Post.]

Throughout the POPS test categories, females improved more than the males

(See Figures 4.11, 4.12 and 4.13). Within the experimental group, the female

subjects improved more than the male subjects: the female experimental subjects

improved their total POPS score from the pre-test to the post-test by 22%, while the

male experimental subjects improved by only 12%. Both genders in the experimental

group outscored their peers in the control group, showing the training made an impact

regardless of gender. Females in both the control group and the experimental group

improved their total POPS score average, although the experimental females

improved 8% more than the control females.

Males in the control group decreased their total POPS score, while the males in

the experimental group improved their total POPS score. The males in the

experimental group improved 20% more than their male counterparts in the

control group.

Hands-on Problem Solving Test

The hands-on problem solving test was designed to examine the effect of the

computer troubleshooting training on students' hands-on problem solving performance.

Figure 4.14

Experimental Males versus Control Males Time to Complete the Easy Problem in the
Hands-On Pre-Test and Post-Test

Experimental Males vs. Control Males Time to Complete the Easy Problem in the Hands-On Pre-Test vs. Post-Test

[Line graph: time in minutes for Control Males and Experimental Males to complete the easy problem at pre-test and post-test.]

Males in the experimental group improved their time by a greater margin than

their counterparts in the control group. Although both groups improved in time, the

control males improved an average of approximately six seconds between the pre-test

and post-test, whereas the experimental males improved an average of thirty seconds.


Figure 4.15

Experimental Females versus Control Females Time to Complete the Easy Problem
in the Hands-On Pre-Test and Post-Test

Experimental Females vs. Control Females Time to Complete the Easy Hands-on Pre-Test vs. Post-Test

[Line graph: time in minutes (scale 0 to 2) for Control Females and Experimental Females to complete the easy problem at pre-test and post-test.]

However, the females in the experimental group did not improve as much as

the females in the control group. The control females improved almost thirty seconds,

whereas the experimental females improved only twelve seconds. With such small

numbers, this fluctuation makes the time comparison inconclusive and suggests that

the females in the experimental group received no additional growth in hands-on

problem solving from the computer troubleshooting training.

Time was not the only measure of improvement in the hands-on problem

solving test. The number of attempts was an additional measure of assessment used.
Figure 4.16

Difference in Number of Attempts Between the Easy Problem in the Hands-On Pre-Test and Post-Tests Separated by Groups and Gender

[Line graph: Female Control, Female Experimental, Male Control and Male Experimental attempts at Easy Problem Pre-Test and Easy Problem Post-Test.]

Figure 4.17

Difference in Number of Attempts Between the Difficult Problem in the Hands-On Pre-Test and Post-Tests Separated by Groups and Gender

[Line graph: Female Control, Female Experimental, Male Control and Male Experimental attempts at Difficult Problem Pre-Test and Difficult Problem Post-Test.]


All groups decreased in their attempts between the pre-test and the post-

test, except the male experimental group. In both the easy problem and the difficult

problem, the male students in the experimental group were the only students to

increase their attempts in either problem. All other groups decreased their attempts

or remained constant in their attempts during both problems.

Overall, the data resulting from the hands-on problem solving test show the

experimental males improving more than their male peers in the control group and

more than the females in both groups. The male experimental group also used more

attempts to solve the difficult problem than the males in the control group. The

greatest improvement in time completion on the easy problem was shown by the

female control group, improving by more than 30 seconds on average. The female

groups also saw a large decrease in number of attempts across both problems. The

hands-on problem solving test did not provide substantial evidence either way.

Of all the forms of assessment, the POPS test produced the most consistent and

interpretable results. The male results were much less stable than those of their female

counterparts. When the POPS tests were analyzed by gender, two main findings

resulted: females scored higher on the pre-tests and post-tests than the males;

however, males in the experimental group improved by a greater margin than their

female counterparts.

Hypothesis 6

Students rated by their teachers as having high problem solving ability will

improve their problem solving ability more within the experimental group than within

the control group.
Students in the experimental group and control group were matched by

how their teachers rated their problem solving ability. When students were divided

into teacher-rated problem solving groups, the results presented a different angle on

improvements. Overall, students in the low problem solving group showed little

increase in scores, and in numerous cases, decreased their score in the post­

assessment.

Students in the different teacher-rated problem solving ability levels were

analyzed using data from the POPS test and the Hands-on problem solving test to

observe whether there was any noticeable difference between groups.

POPS - Profiles of Problem Solving

The POPS test was the first form of assessment used to see whether there was

any difference between the teacher-rated problem solving ability groups. The total

POPS pre-test and post-test scores were analyzed, as well as each individual

category of the POPS test.

The results, divided by the teacher-rated problem solving ability levels,

revealed several patterns. The researcher found the low problem solving ability group

often decreased their scores. The two students rated as low problem solvers both

rushed through all post-assessments, which may account for the decrease. Figure 4.18

below shows the average increase of the high and medium experimental group, which

improved by a greater margin than the average of the high and medium control group.

The experimental group also started at a lower average score than the control group.


Figure 4.18

POPS Total Score Pre-Test and Post-Test Divided by Teacher Rated Problem Solving
Ability and Groups

POPS Total Score Pre-Test vs. Post-Test Divided by Teacher Rated Problem Solving Ability and Groups

[Line graph: average total POPS scores (scale roughly 13 to 31) at pre-test and post-test for the teacher-rated ability levels within the control and experimental groups.]

Figure 4.19

POPS Total Score Pre-Test and Post-Test High Teacher Rated Problem Solving
Ability Separated by Groups

POPS Total Score Pre-Test vs. Post-Test High Teacher Rated Problem Solving Ability Separated by Groups

[Line graph: High Problem Solving Ability Control Group and High Problem Solving Ability Experimental Group total POPS scores (scale 25 to 35) at pre-test and post-test.]
Analyzed group by group, the experimental group showed greater gains than

the control group. The high problem solving ability control group increased only

minimally, while the experimental high problem solving ability group increased at a

much more dramatic rate.

Figure 4.20

POPS Total Score Pre-Test and Post-Test Medium Teacher Rated Problem Solving
Ability Separated by Groups

POPS Total Score Pre-Test vs. Post-Test Medium Teacher Rated Problem Solving Ability Separated by Groups

[Line graph: Medium Problem Solving Ability Control Group and Medium Problem Solving Ability Experimental Group total POPS scores (scale 20 to 36) at pre-test and post-test.]

The difference between the control and experimental medium problem solving

ability subjects was not as extreme as the high problem solving ability groups.

However, the experimental medium problem solving subject improved by nine points,

or 30%, while the control subject improved by six points, or only 18%.
Figure 4.21

POPS Total Score Pre-Test and Post-Test Low Teacher Rated Problem Solving
Ability Separated by Groups

POPS Total Score Pre-Test vs. Post-Test Low Teacher Rated Problem Solving Ability Separated by Groups

[Line graph: Experimental Low Problem Solving Student and Control Low Problem Solving Student total POPS scores at POPS: Total: Pre and POPS: Total: Post.]

Within the low problem solving ability group, the experimental subject also

fared better than the control subject. The low problem solving experimental subject

decreased from a total pre-test score of 18 to a total post-test score of 17, while the

control student's score decreased from 17 to a total post-test score of 11. Therefore,

the experimental student did not show as large a drop as the control student in the

low teacher-rated problem solving ability group.

Students were separated into different problem solving ability groups for

comparison purposes, to create equality among the sample groups. When analyzed,

the data showed that the high and medium experimental group improved more than

the high and medium control group. The low experimental subject also produced

greater improvements on scores than the low control subject; however, since the data

were collected from only two students, the results are unreliable.

Hands-on Problem Solving Test

When separated by problem solving ability level, the data results in the hands­

on problem solving test remained similar to the overall results of the study. Students

in all groups improved their time to complete the easy problem at nearly the exact

same rate (See Figure 4.22).

Figure 4.22

Time Completion Compared by Problem Solving Ability for the Easy Problem in the
Hands-On Problem Solving Pre-Test versus Post-Test

Time Completion Compared by Problem Solving Ability for the Easy Problem in the Hands-On Problem Solving Test

[Line graph: average time to complete the easy problem (minutes, scale 0.5 to 3.5) at pre-test and post-test for the High Experimental Group, High Control Group, Medium Group and Low Group.]
Table 4.26

Time Completion Compared by Problem Solving Ability for the Easy Problem in the
Hands-On Problem Solving Pre-Test versus Post-Test

Group                       Average Time to   Average Time to   Average Improvement
                            Solve Pre-Test    Solve Post-Test   Between Tests
High Experimental Group     1.31              0.94              0.37
High Control Group          3.25              2.94              0.31
Medium Group                1.38              1.00              0.38
Low Group                   1.25              0.88              0.38
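
Assuming the times in Table 4.26 are in minutes, consistent with the axis labels of the preceding hands-on test figures, these gains convert to roughly 19 to 23 seconds (for example, 0.37 min × 60 s/min ≈ 22 s), which underlies the "approximately 20 seconds" figure discussed below.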

All groups improved by approximately 20 seconds on average; time

improvement was therefore roughly constant across groups. However, the number of

attempts students used to solve the problems varied greatly. Students in the high

experimental group were the only subjects to increase their average number of

attempts from the pre-test to the post-test, while still matching the time improvement

rate of the other groups.

All other groups decreased their number of attempts while maintaining a

similar time improvement of approximately 20 seconds. The high experimental group

maintained the same time improvement while increasing their attempts, averaging

0.75 more attempts in the post-test than in the pre-test. The high control group,

medium group and low group all decreased their number of attempts by an average

of 0.5 attempts or more.
Figure 4.23

Number of Attempts on the Easy Problem in the Hands-on Pre-Test versus Post

Number of Attempts on the Easy Problem Hands-On Test Comparing Pre-Test vs. Post-Test Scores Across Teacher Rated Problem Solving Ability Groups

[Line graph: average number of attempts (scale roughly 1.5 to 3.5) at Easy Problem Pre-Test and Easy Problem Post-Test for the teacher-rated problem solving ability groups, including the Medium Group and Low Group.]

Table 4.27

Percentage of Teacher-Rated Problem Solving Ability Grouped Students Able to


Solve the Difficult Problem

Teacher-Rated Problem Solving     Difficult Problem   Difficult Problem
Ability Group                     Pre-Test            Post-Test
High Experimental (4 students)    25%                 50%
High Control (4 students)         25%                 25%
Medium (2 students)               50%                 100%
Low (2 students)                  0%                  50% (100% experimental)
The students in all groups had trouble with the difficult problem in the

hands-on problem solving test. The percentage of students within their teacher-rated

problem solving ability groups who solved the difficult problem in the hands-on pre­

test versus the post-test is shown above (See Table 4.27). No group showed a large

improvement, because the groups contained such small numbers. However, students

in the medium and low problem solving ability groups had more success solving the

difficult problem. While only 38% of all the high

problem solving ability students solved the difficult problem in less than five minutes

during the post-test, 75% of the students in the low and medium problem solving

ability groups solved the difficult problem. The low and medium problem solving

ability groups could have benefited from the hands-on manipulation of solving the

problem.

When separated by problem solving ability level, the data results remained

similar to the overall results of the study. Overall, the students in the experimental

groups outperformed the students in the control groups at all levels of problem

solving ability. The experimental students rated high and medium showed more

improvement than the experimental student in the low group.

Hypothesis 7

Computer troubleshooting will have an effect on elementary students' general

problem solving skills.

The results from the numerous assessments show the experimental group

improving more than the control group. Information was taken from the POPS test,

the hands-on problem solving test and the group interview in order to evaluate
whether students in the experimental group achieved higher results in the post

assessment than the students in the control group.

POPS - Profiles of Problem Solving Test

The POPS test, composed of separate categories, presented information on the

improvement of each group. The categories most relevant to problem solving were

the methods used section and the extracting information section. Although all

categories are useful in problem solving, and the total score was analyzed, particular

attention was paid to these two categories.

Figure 4.24

Difference in Total POPS Score Between the Pre-Test and Post-Test of Average
Experimental Group versus Average Control Group

Difference in Total POPS Score Between the Pre-Test and Post-Test of Average Experimental Group versus Average Control Group

[Line graph: Average Experimental Group and Average Control Group total POPS scores at POPS Pre-Test Total Score and POPS Post-Test Total Score.]

As seen in the information provided above (See Figure 4.24), the total POPS

score, which reflects the overall improvement of the students' ability to solve

problems, improved more on average in the experimental group than in the control

group. Both groups began at roughly the same level: the control group started with an

average pre-test score of 25.5, while the experimental group started with an average

pre-test score of 25.33. However, the improvements did not remain constant between

the groups. The average experimental post-test score reached 31 points, an

improvement of 5.67 points per student, while the average control group post-test

score reached 25.83 points, an improvement of 0.33 points per student. Therefore, the

experimental group improved an average of approximately 5.3 points per student

more than the control group between the POPS pre-test and post-test total scores.
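
For illustration (a sketch added here for exposition; the group means are those reported in this section), the group-level comparison reduces to the following arithmetic:

    # Group mean total POPS scores reported above.
    control_pre, control_post = 25.5, 25.83
    experimental_pre, experimental_post = 25.33, 31.0

    control_gain = control_post - control_pre                  # about 0.33 points per student
    experimental_gain = experimental_post - experimental_pre   # about 5.67 points per student
    advantage = experimental_gain - control_gain               # about 5.3 points per student

    print(f"{control_gain:.2f}, {experimental_gain:.2f}, {advantage:.2f}")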

The individual categories were also analyzed to show whether the

experimental students improved at a greater rate than the control students. The most

important categories to the project were the methods used and extracting information.

The accuracy, correctness of answer and quality of explanation were less important to

the researcher and were not the focus of the project. The first category analyzed was

the methods used category. This category was analyzed earlier in

hypothesis two, showing the large difference between the experimental group's

average improvement and the control group's average improvement between pre-test

and post-test (See Figure 4.1 and 4.2). The students in the experimental group

improved an average of 2.5 points by increasing their average pre-test score of 5.0

points to an average post-test score of 7.5 points. The students in the control group

actually dropped their average score 0.17 points, descending from an average pre-test

score of 6.0 points to an average post-test score of 5.83 points. Therefore, the
experimental group showed an average improvement of approximately 33%,

while the control group decreased an average of approximately 3%.
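
(These figures are again consistent with expressing each gain relative to the post-test mean: 2.5 / 7.5 ≈ 33% and 0.17 / 5.83 ≈ 3%.)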

The extracting information category was another focal point of the project,

because the subjects identified finding important information as one of the most

difficult steps in solving a problem. The extracting information results were not as

pronounced as the methods used results; however, slight differences between groups

were still present.

Figure 4.25

Comparing the Difference in POPS Extracting Information Section Pre-Test and Post­
Test Between Groups

Comparing the Difference in POPS Extracting Information Section Pre-Test and Post-Test Between Groups

[Line graph: Average Experimental Group and Average Control Group extracting information scores at Extracting Information Pre-Test and Extracting Information Post-Test.]

Students in the experimental group achieved an average pre-test score of 5.83

and an average post-test score of 6.67, improving an average of 0.84 points. Students

in the control group achieved an average pre-test score of 4.67 and an average

post-test score of 5.33, improving an average of 0.66 points. There was thus little

difference between the groups, but even on this small scale of scores, the averages

within the experimental group improved more than those of the control group.

The other categories were also compared between groups and the results are

shown below (See Table 4.28).

Table 4.28

POPS Correctness of Answer, Accuracy and Quality of Explanation Categories Pre-Test and Post-Test Scores Compared Between Groups

Test Category            Group                Pre-Test   Post-Test   Difference in    Percentage Difference
                                              Score      Score       Pre- vs. Post-   in Pre- vs. Post-Test
                                                                     Test Score       Score
Correctness of Answer    Control Group        4.83       6.33         1.50             24%
                         Experimental Group   5.67       6.50         0.83             13%
Accuracy                 Control Group        5.83       5.67        -0.16             -3%
                         Experimental Group   5.50       6.33         0.83             13%
Quality of Explanation   Control Group        4.17       2.67        -1.50            -35%
                         Experimental Group   3.33       4.00         0.67             17%

All three remaining categories were not as important to the project as the

methods used and extracting information. The correctness of answer and accuracy

categories were more mathematically centered than the other categories, and the
quality of explanation was based on the student's ability to explain their answer,

and without specific training, it is difficult for students to improve in this category. In

the accuracy and quality of explanation categories, experimental students improved

their scores on the post-test, while students in the control group actually showed a

decrease in scores on the post-test. While both groups improved their scores in the

correctness of answer category, students in the control group improved their scores

by 24%, while the experimental group improved by only 13%.

Overall, students in the experimental group showed greater improvement on

the total POPS problem solving test score, methods used score, extracting

information score, accuracy score and quality of explanation score than the students

in the control group. The results obtained from the POPS test show the experimental

group was able to produce more improvements in general problem solving than the

control group.

Hands-on Problem Solving Test

The hands-on problem solving test did not reveal many differences, showing similar

improvements for both groups. The easy problem was solved by all students in both

groups during the pre-test and post-test.

Both groups were almost identical in their improvement in time. As indicated

in hypothesis two, students achieved similar results. Overall, the hands-on problem

solving test provided very similar results in all aspects (See Table 4.29).
Table 4.29

Results from Easy Problem in Hands-on Problem Solving Test Comparing between
Groups

Group                        Number of Attempts   Number of Attempts   Time to Complete   Time to Complete
                             Pre-Test             Post-Test            Pre-Test           Post-Test
Control Group Average        2.67                 1.3                  1.38               1
Experimental Group Average   2.17                 2.5                  2.54               2.21
Figure 4.26

Comparing Completion Time Improvements for the Easy Hands-on Problem Between the Control and Experimental Groups

[Line graph: Average Control Group and Average Experimental Group times to complete the easy problem at pre-test and post-test.]

Although a small sample group was used, differences arose between the

experimental group and control group pertaining to solving the difficult problem in

the hands-on problem solving test. During the pre-test, only 33.3% of the

experimental group solved the difficult problem, while 66.7% solved the difficult

problem in the post-test. The control group produced less dramatic results of 16.7%

of the group solving the difficult problem in the pre-test and 33.3% of the students

solving the problem in the post-test. Therefore, more students in the experimental
group solved the problem, showing a greater improvement than the control group.

Figure 4.27

Comparing the Percentage of Students in the Control Group versus the Experimental
Group Able to Solve the Difficult Problem in the Pre-Test and Post-Test

Percentage of Students Who Were Able to Solve the Difficult Problem in the Pre-Test and Post-Test, Comparing the Control Group versus the Experimental Group Average

[Line graph: percentage of Control Group and Experimental Group students able to solve the difficult problem (scale 0% to 100%) at H02: Pre: Solution and H02: Post: Solution.]

The number of attempts was the last section of the hands-on problem solving

test to analyze, and again the experimental group increased their number of attempts on

both the easy problem and the difficult problem from the pre-test to the post-test (See

Table 4.30).

The experimental group increased from an average of 2.17 attempts in the pre­

test to an average of 2.5 attempts in the post-test on the easy problem in the hands-on

problem solving test. The control group decreased from an average of 2.67 attempts

in the pre-test to an average of 1.33 attempts in the post-test on the easy problem in

the hands-on problem solving test.


Table 4.30

Average Number of Attempts in Control Group versus Experimental Group and


Percentage of Each Group Able to Correctly Solve the Difficult Problem

Group          Average Number of   Average Number of    Percentage of Group    Percentage of Group
               Attempts Pre-Test   Attempts Post-Test   to Correctly Solve     to Correctly Solve
                                                        during Pre-Test        during Post-Test
Control        4.83                2.67                 16.7%                  33.3%
Experimental   2.17                2.83                 33.3%                  66.7%

The experimental group increased from an average of 2.17 attempts in the pre­

test to an average of 2.83 attempts in the post-test on the difficult problem in the

hands-on problem solving test while the control group decreased from an average of

4.83 attempts in the pre-test to an average of 2.67 attempts in the post-test on the

difficult problem in the hands-on problem solving test.

Overall, the experimental group increased their attempts during the post-test

while decreasing their completion time at a rate comparable to the control group.

The control group used fewer attempts while likewise decreasing their completion

time at a rate comparable to the experimental group. The most dramatic results

were shown in the percentage of students who completed the difficult problem during

the post-test, as compared to the pre-test. A higher percentage of students in the

experimental group solved the difficult problem during the post-test than the students

in the control group.


Group Interview

The group interview was conducted after the training session, on the last day

of post-testing. All students from the experimental group and control group

participated in the group interview. The researcher first had the students write

responses to the questions on a sheet of paper, and then verbally discussed each

question as a group. One of the questions presented to the students during the group

interview was "Do you think learning how to troubleshoot a computer helped you

with your problem solving skills? Why or why not?" Students were first asked to

respond to the question with a written reaction, and then they were asked to verbalize

any additional answers as a group. All the students in the experimental group chose

to respond in some detailed fashion and wrote the following responses:

Student #2 in the experimental group wrote: "Yes. Because you had to figure out

what the problem is and you have to think of the solution of the problem."

Student #4 in the experimental group wrote: "Yes Because you learned how to fix

things [easier] when we did the computer."

Student #6 in the experimental group wrote: "I think it helped by learning strategies

like in math."

Student #8 in the experimental group wrote: "Yes I think troubleshooting will make a

difference because finding out [what's] wrong with a computer is a lot like finding

out what the solution is in a question."



Student #10 in the experimental group wrote: "yes,"

Student #12 in the experimental group wrote: "No not really because I was just

learning [to] put [together] and take apart [a computer]."

Five out of six students in the experimental group felt the computer

troubleshooting training made a difference in their problem solving ability. Students

made references to "figure out" problems, "fix things" and "finding out" information,

which are key elements and steps in the problem solving process. They also

mentioned "finding out what the solution is" and one student even compared the

troubleshooting activity to "learning strategies like in math." The one student who

believed the computer troubleshooting sessions had no effect on problem solving

skills was rated by teachers as having low problem solving ability, and may still have

been operating at the concrete level of understanding rather than the abstract level.

Students were also encouraged to expand on their writing by verbally

discussing the question. The excerpt from the transcribed conversation (Table 4.31)

showed how students verbally responded to one of the questions in the written

interview "Do you think learning how to troubleshoot a computer helped you with

your problem solving skills? Why or why not?" The researcher began the question

discussion by prompting the students with the question and asked their opinion.
Students believed that the computer troubleshooting sessions had an effect

on their problem solving skills. They drew references to the similarities between

problem solving and computer troubleshooting activities. Students made references

to having to "figure out" problems and "finding out what the solution is," comparing

the solving of computer problems to the solving of general problems. Overall, the

students identified similarities and concluded that the computer troubleshooting

training had advantages for general problem solving.

Table 4.31

Excerpt from Group Interview - Transcribed Conversation

Individual Speaking   Direct Quotes from Individual

#7 (control) No.

RESEARCHER Why do you think that?

#7 (control) Because you were just taking apart a computer it wouldn't

really help problem solving.

#4 (experimental) I think yes because we actually learned like, cause you didn't

help us that much. You just kind of took apart the computer

and we had to think of all the parts that were missing and stuff.

#8 (experimental) I had to figure out what was wrong with the computer. And it

was a lot like trying to figure out the problem.

#2 (experimental) Yes because we had to figure out what the problem was.

#12 (experimental) No not really because it was just taking apart the computer.

#6 (experimental)   I think that it would help with like strategies and stuff because

like we had to use different strategies.

RESEARCHER OK so different strategies that you had to use. Why don't you

guys try to tell them a little bit about what we did?

#4 (experimental) The first couple days we were just studying like what parts of

the computers were the computers. We took apart the

computers and um... and put them back together. Then like the

last day, she took apart a computer and we had to put it back

together with all the parts.

#6 (experimental)   We had to go to like stations and we had to figure out what it

was and fix it.

RESEARCHER You had to fix it. You had to figure out what it was first and

then you had to fix it. So you had to identify the problem.

#2 (experimental) We had to install and uninstall a program.

RESEARCHER Did the people who went through the computer troubleshooting,

did you guys have fun doing that?

All Yeah.

RESEARCHER Do you think that if your teacher had a problem with the

computer that you could fix it?

All Yeah.

RESEARCHER So now you can help your teacher out in lab?

#4 (experimental) We did it in like art. The printer wasn't working; it was the

same problem here. So we pressed the button and it worked.

RESEARCHER          Alright there you go. That's fantastic guys! Thanks a lot. I

really appreciate it.
CHAPTER V

CONCLUSIONS AND DISCUSSION

Introduction

Computer troubleshooting has the possibility of enhancing problem solving

learning experiences within the elementary curriculum. Computer troubleshooting

training can also prepare students to assist in computer labs. The similar processes in

computer troubleshooting and problem solving involve identifying the problem,

devising a solution and fixing the problem successfully. The researcher believes

there is a strong relationship between developing computer troubleshooting skills and

general problem solving skills.

The computer troubleshooting process also provides students with immediate

feedback on the successful resolution of technical problems. The researcher believes

the most difficult part of the problem solving process for elementary students is

"understanding the question" or the problem. Computer troubleshooting assists in this

development because the problem is straightforward, allowing students to easily

identify the problem. Based on the evidence found in this study, elementary students

are capable of learning how to troubleshoot common computer problems. Pending

further research, the researcher concludes that learning how to troubleshoot a

computer has the potential to improve problem solving ability in elementary students.

Summary of the Study

Summary of the Research Problem

Students currently receive insufficient problem solving learning opportunities

(Coleman et al., 2001; Jonassen, 2000). Problem solving skills are essential for a

student's future. Providing students with the skills to solve problems as opposed to

merely supplying them with content knowledge enables the students to transfer the

content knowledge to various situations requiring problem solving (Casey & Tucker,

1994). The research project explored the problem solving requirements necessary in

computer repair and troubleshooting, and their effect on the academic achievement

and academic problem solving of elementary students. Computer repair methods and

troubleshooting techniques were used as models for teaching problem solving

strategies. The study proposed to increase problem solving abilities and academic

achievement among elementary students through the computer troubleshooting

technology curriculum. With the findings from this research study, schools could

incorporate the computer repair and troubleshooting training program into the

curriculum. This could serve to enhance the problem solving learning experience and

the teaching of technology skills.

Summary of the Methods

The research project used a control/experimental pre/post-test design. The

whole study lasted two school weeks, including all testing periods. The

purpose of the project was to establish whether computer troubleshooting had an


effect on problem solving skills. Each group included three boys and three girls,

matched according to their problem solving ability levels as assessed by their

teachers. Both groups received pre-assessment including two standardized tests, a

hands-on problem solving test, and a survey collecting information from the students

on problem solving skills, attitudes and math abilities. Following the two days of pre­

testing, the experimental group attended computer troubleshooting training sessions,

which were held for forty-five minutes in the morning before school over the course

of five days. The control group received no training. Following the training, the

students from the control and experimental groups were post-tested. The post­

assessment included two standardized tests, the hands-on problem solving test and a

group interview modeled from the survey. The pre-tests and post-tests were

compared through statistical analysis, graphs and tables, as well as focusing on each

individual student's growth in a case study approach.
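
For readers who wish to see the design at a glance, the following sketch summarizes the structure described above (the field names are chosen for exposition only and are not drawn from the study materials):

    # Illustrative summary of the study design described in this section.
    design = {
        "duration": "two school weeks, including all testing periods",
        "groups": ["control", "experimental"],
        "group_composition": "three boys and three girls, matched on "
                             "teacher-rated problem solving ability",
        "pre_and_post_measures": [
            "two standardized tests (POPS; IOWA Basic Skills)",
            "hands-on problem solving test",
            "survey (pre) / group interview (post)",
        ],
        "treatment": "five 45-minute computer troubleshooting sessions "
                     "(experimental group only)",
    }
    for key, value in design.items():
        print(f"{key}: {value}")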

Summary of the Findings

Hypothesis 1

The researcher used the findings from the troubleshooting activity as a means

to assess whether the subjects could successfully troubleshoot common computer

problems. The computer troubleshooting activity was an accurate simulation of

common computer problems encountered in schools on a daily basis. The

troubleshooting activity consisted of six interactive stations, each presenting a

different common computer problem for the students to assess and solve. The
students worked in teams of two and solved all six problems, with the exception of

one team who, due to time constraints, was only able to solve five of the six

problems. Students recorded their answers on team worksheets, describing the

problem and the measures they used to fix the problem. Through the computer

troubleshooting activity, students demonstrated their computer troubleshooting ability

by reattaching power cords and peripheral devices, installing software programs, and

physically installing RAM and other hardware components. Overall, students

identified the problem, devised a solution and fixed the problem.

The findings from the group interview were used to assess whether the

subjects could successfully troubleshoot common computer problems. During the

group interview, students in the experimental group explained the characteristics and

requirements of the computer training sessions to the students in the control group.

Students discussed taking apart computers, fixing the problems and installing

programs. When asked if they would be able to assist their teacher with a computer

problem, the students responded in unison, "Yeah."

Findings from these two methods of assessment would suggest that students

can learn to solve computer problems.

Hypothesis 2

The researcher used findings from the Methods Used section of the POPS test

as a means to assess problem solving methods. Within the Methods Used section,

most control students either demonstrated consistency or decreased in their score


between the pre-test and the post-test. Only one student in the control group

improved their score by one point, and the average score for the group decreased

0.167 points between the pre-test and post-test. The students in the experimental

group all improved by one point or more, creating an average improvement of 2.5

points between the pre-test and the post-test.

The second set of findings used to analyze whether students in the

experimental group improved their problem solving methods was the hands-on

problem solving test. The average number of attempts for the easy problem and the

difficult problem was analyzed for each group. The experimental group used more

attempts on average in the post-test than in the pre-test, while the control group

achieved opposite results. The time necessary to complete the easy problem was also

compared using average pre-test and post-test times of both groups. The

improvement time for the control group and the experimental group were very

similar, varying by only 2.8 seconds. The last set of findings in the hands-on problem

solving test analyzed for the second hypothesis, were the number of students in each

group able to solve the difficult problem. Four students in the experimental group

solved the difficult problem correctly, while only two students in the control group

were able to solve the problem correctly.

Findings from the Methods Used section of the POPS would suggest that

students could improve their problem solving methods by learning to troubleshoot

and repair computers. Findings from the hands-on problem solving test were found to

be minimal and insignificant.


Hypothesis 3

The researcher assessed the most difficult procedure in problem solving for

elementary students by using findings from the Profiles of Problem Solving test.

Since the POPS test assessed students on different elements of problem solving, each

category of the problem solving process could be separately analyzed by observing

pre-test scores of the students. The most difficult procedures of problem solving, in

order of difficulty, were correctness of answer, extracting information and methods

used.

The second source of findings used to discover the most difficult problem

solving procedure was the survey/group interview. When asked to identify the most

difficult part of solving a problem out of five procedures, eight out of the twelve

students responded with "understanding the question," making it the most commonly

mentioned procedure. Other procedures mentioned were identifying important

information, looking back, lack of information and the method/strategy to use.

Findings from the POPS test would suggest that students have the most

difficulty with correctness of answer, extracting information and methods used in the

problem solving process. Findings from the group interview would suggest that

"understanding the question" is the most difficult process in solving a problem.

Hypothesis 4

Information was collected from the IOWA Basic Skills math problem solving

and data interpretation test and the group interview, in order to analyze whether
mathematical ability would be affected by the training sessions. The IOWA test

results indicated that the average scores of the control group and the experimental

group were identical and produced no significant results.

The group interview provided additional findings. Students made references

to figuring out the problem and finding a solution to the problem. One student also

made the comparison of learning strategies in computer troubleshooting to learning

strategies in math.

Little evidence was found to support the improvement of math skills within

either form of assessment.

Hypothesis 5

Gender differences were evaluated through the Profiles of Problem Solving

test and the hands-on problem solving test. The POPS pre-test and post-test total

scores were compared between genders, as well as each category of the POPS test.

The females improved from a mean total score of 26.5 to 32.5, while the males

retained a constant total mean score of 24.3 between the pre-test and the post-test.

When separated by gender and group, the females in the experimental group

improved by an average of 22% between the pre-test and the post-test, while the

females in the control group improved by an average of 14%. The males in the

experimental group improved by an average of 12%, while the males in the control

group decreased their score by an average of 11%. Throughout the POPS categories,
the females outperformed the males in overall scores and in the improvement in

scores between the pre-test and the post-test.

Additional results were collected from the hands-on problem solving test. The

researcher compared the completion time for the easy problem from the pre-test to the

post-test. The males in the experimental group improved their completion time of the

easy problem by a greater percentage than the males in the control group. However,

the females in the control group demonstrated greater improvement than the females

in the experimental group in the completion time of the easy problem.

The findings for the number of attempts to solve each problem were also

studied from the hands-on problem solving test to analyze the effects of gender on the

results of the study. The males in the experimental group were the only group to have

increased their average number of attempts in the post-test as compared to their pre­

test average attempts.

The findings from the Profiles of Problem Solving test would suggest that

there is little or no difference between gender-based results. Findings from the hands­

on problem solving test presented conflicting data: the female control group and the

male experimental group were able to improve in multiple areas. The findings were

inconclusive.

Hypothesis 6

Students classified by teachers as having high, medium and low problem

solving abilities were compared to evaluate if the variable was a significant factor.
The students in the high and medium experimental group improved their total POPS

score by an average of eight points, while the high and medium control group

improved their total POPS score by an average of three points. The low problem

solving group decreased their average total POPS score from the pre-test to the post­

test. When the ability groups were analyzed by treatment group, the experimental

group's improvement was more significant between the pre-test and post-test than the

control group.

The hands-on problem solving test exhibited continuity across the ability

problem solving groups. Each level of high, medium and low students demonstrated

a similar improvement in time completion of the easy problem. High ability level

students in the experimental group were the only students to increase their average

number of attempts between the pre-test and the post-test. The number of students

able to solve the difficult problem was also analyzed, but results were inconclusive

due to small sample numbers.

When comparisons of the high, medium and low problem solving ability

students were made, the findings from the POPS test would suggest that regardless of

problem solving ability levels, students can improve their problem solving methods

by learning to troubleshoot and repair computers. The findings also indicated that the

high and medium ability level students in the experimental group were able to

improve their scores more than the low ability student. Findings from the hands-on

problem solving test were found to be minimal and insignificant.


Hypothesis 7

Information from the Profiles of Problem Solving test, the hands-on problem

solving test and the survey/interview were all used to investigate the final hypothesis.

The mean total POPS score was averaged for each group, comparing the pre-test total

score to the post-test total score. The findings demonstrated a 25% increase for the

experimental group, improving from an average score of 25.3 points to an average

post-test score of 31 points. All individual sections of the POPS test favored the

experimental group by demonstrating improvements of 20%, except the correctness

of answer section, which only demonstrated a 13% increase.

The experimental and control group demonstrated the same improvement in

time completion of the easy problem within the hands-on problem solving test. The

experimental group increased their average number of attempts in both problems by

20% from the pre-test to the post-test, while the control group achieved opposite

results, decreasing their average number of attempts by 5%. Thirty-three percent of

the control group was able to solve the difficult problem, while 66% of the

experimental group was able to do so.
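
For context, each group contained six students, so these percentages correspond to two control students and four experimental students solving the difficult problem, matching the counts reported under Hypothesis 2.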

The last collection of data analyzed to address the hypothesis was the final

group interview. When students were asked if they felt computer troubleshooting

could make an impact on problem solving skills, five out of the six students in the

experimental group replied that computer troubleshooting did make a difference.

References to figuring out problems and finding solutions were some of the additional

comments given by students concerning the relationship between the two processes.
The findings from the Profiles of Problem Solving test would suggest that

students could improve their problem solving skills by learning to troubleshoot and

repair computers. Findings from the hands-on problem solving test were found to be

minimal and insignificant. The findings from the group interview would suggest that

students believed problem solving skills would improve from learning to

troubleshoot and repair computers.

Conclusions

Hypothesis 1

Elementary students who participate in a computer troubleshooting curriculum

will develop the ability to solve common computer problems by participating in

computer troubleshooting trainings.

The computer training sessions included overviews of computer hardware

components and repair. The training sessions included the installation of programs,

physical installation of components, and troubleshooting the operating system and

other common computer problems. Once the students received all the training

sessions, the instructor conducted a review discussion of different common computer

problems, identifying why each problem exists and how to fix it. The

computer troubleshooting activity was designed to emulate common computer

problems, presenting a different problem at each of the six stations. The teams

needed to study and solve the problem at each station, while recording their answers

on the team worksheet. All three teams were able to solve all six problems, with the
exception of one team who, due to time constraints, was only able to attempt five of

the six stations. The elementary students exhibited their troubleshooting ability by

successfully reattaching power cords and peripheral devices, installing software

programs and physically installing internal hardware components. At every station

attempted, students identified the problem, devised a solution and fixed the problem

successfully without guidance or assistance from the instructor.

The group interview was also used to assess whether the elementary students

could successfully troubleshoot common computer problems. During the group

interview, students in the experimental group made reference to assisting a teacher by

troubleshooting a common computer problem. "We [solved a common computer

problem] in like art. The printer wasn't working; it was the same problem [we had

encountered in the final computer troubleshooting activity]." The students in the

experimental group stated that they had gained the ability to take computers apart, fix

the problems and install programs.

Based on observations from the troubleshooting activity, students solved all

attempted common computer problems. The group interview confirmed that students

were comfortable with their new ability to solve computer problems; the researcher

believes elementary students can develop knowledge through computer

troubleshooting in order to solve common computer problems. When the students

were asked if they could assist their teacher with a computer problem in the future,

the students responded in unison, "Yeah."


Hypothesis 2

Elementary students who participate in the computer troubleshooting

curriculum will improve problem solving methods.

The Methods Used section of the POPS test and the hands-on problem solving

test were used to assess whether the experimental students improved their

problem solving methods. Most control students' scores either remained constant or

decreased in their score between the pre-test and the post-test results, while all the

experimental group students improved by one point or more, creating an average

improvement of 2.5 points or 33%, between the pre-test and the post-test results. The

instructor concludes that the experimental group demonstrated a significant

improvement in the test results after the training sessions. The computer trainings

sessions made a difference in the problem solving methods used by students in the

experimental group.

Overall, the hands-on problem solving test did not provide significant evidence

regarding improvement in problem solving methods. The number of attempts varied

widely between the groups. The experimental group demonstrated an increase in the

average number of attempts between the pre-tests and post-tests, while the control

group showed a significant decrease in the average number of attempts. On average,

the control group improved by completing the easy problem in less time as compared

to the experimental group, but the results were minimal and inconclusive. Four of the

students in the experimental group solved the difficult problem in the post-test, while

only two students in the control group solved the difficult problem in the post-test.
However, these results were also minimal, and due to the small number of students,

the data are considered insignificant. The researcher believes the findings from the

hands-on problem solving test were minimal and do not support the hypothesis.

Hypothesis 3

The most difficult procedure in problem solving for elementary students will

be to understand what the question is looking for.

In the survey/group interview, students were asked to indicate the most

difficult part of solving a problem. Eight out of the twelve students responded with

the category "understanding the question". The students classified under this

category used key phrases such as "I don't understand [the question]", "I don't know

what the question is asking you to do", and "what the problem is looking for". Based

on these responses, students had difficulty determining what they needed to use the

information for or exactly what the question was asking.

The POPS test evaluated each individual student on separate processes of

problem solving. Each component of the problem solving process was analyzed by

observing the pre-test scores of students. In the POPS test, the categories consisted

of correctness of answer, methods used, accuracy, extracting information and quality

of explanation. The two categories most similar to "understanding the question" were

methods used and extracting the information. Findings from the POPS test results

indicated that the most difficult procedures for the students, in order of difficulty, were

the correctness of answer, extracting information and methods used.


The information obtained for this hypothesis established that students have the

most difficulty with understanding and setting up the problem in order to solve it.

Hypothesis 4

Elementary students who participate in a computer troubleshooting curriculum

will increase mathematical problem solving ability.

The information collected from the IOWA Basic Skills math problem solving

and data interpretation section, as well as the group interview, demonstrated no

significant findings. However, the group interview provided additional information

showing a possible relationship between mathematical reasoning and computer

troubleshooting. Students made verbal and written references to figuring out the

problem and finding a solution to the problem. One student also made the

comparison that related strategies learned in computer troubleshooting to math by

writing, "I think [computer troubleshooting] helped [my problem solving skills] by

learning strategies like in math." Based on group interview responses, students

believe there is a correlation between mathematical problem solving and computer

troubleshooting.

Based on findings presented in this hypothesis, there is no conclusive

evidence that mathematical ability is affected by learning to troubleshoot and repair

computers.
Hypothesis 5

Gender will not impact the problem solving ability of elementary students

involved in a computer troubleshooting curriculum.

The existence of the gender gap in computer usage has been shown through

multiple studies indicating that females lack positive educational experiences with

computers (Burge, 2001). The major form of assessment used in analyzing possible

gender differences was the Profiles of Problem Solving test. Females in the

experimental group improved by 22%, while the females in the control group only

improved by 14%. Likewise, the males in the experimental group improved by 12%,

whereas males in the control group decreased by 11% in their total POPS score. It is

difficult to conclude whether males or females were impacted more through the

training sessions, although it is possible to conclude that the training session made an

impact on the experimental group regardless of gender.

The hands-on problem solving test provided additional findings, but the

results were difficult to evaluate. The males in the experimental group improved more than

the males in the control group regarding the completion time of the easy problem

from the pre-test to the post-test. However, the females in the control group

improved more than the females in the experimental group regarding the completion

time of the easy problem from the pre-test to the post-test. The number of attempts to

solve each problem was another source of data evaluated to analyze the effects of

gender on the results of the study. The males in the experimental group were the only

group to have increased their average number of attempts in the post-test as compared

to their pre-test average attempts. Due to the small sample and the nature of the test,

no firm conclusions could be drawn from the hands-on problem solving test.

Hypothesis 6

Students who participate in a computer troubleshooting curriculum rated by

teachers as having high problem solving ability will demonstrate greater

improvements in problem solving ability.

The students rated as the high, medium and low problem solving ability levels

by the teachers in the experimental group improved their total POPS score from the

pre-test to the post-test more than the control group with similar ability levels.

Therefore, the students in the experimental group were able to outperform every

student with equivalent ability levels in the control group, demonstrating the positive

educational impact of the training session. Based on the POPS data, the students

rated with high and medium ability levels were able to achieve better results than

their counterparts in the control group.

Both students rated as low ability problem solvers achieved lower total scores

on their POPS post-test than on the pre-test. However, the scores of the student in

the experimental group identified as a low ability problem solver decreased less than

the control student classified as a low ability problem solver. Interestingly enough,

both students in the low problem solving group stated in the group interview that they

believed the computer troubleshooting training sessions would have no effect on

problem solving skills. Both students felt computer troubleshooting would have no
impact, stating that "because I was just learning [to] put [together] and take apart [a

computer]". The researcher hypothesizes that the students rated as low level problem

solvers by their teachers may only have been able to function at a concrete level of

understanding, unable to relate the abstract similarities of computer troubleshooting

and problem solving.

In the hands-on problem solving test, each group showed a similar

improvement in the completion time of the easy problem from the pre-test to the

post-test. More of the low and medium problem solving ability students were able to solve

the difficult problem in the hands-on problem solving test. The researcher

hypothesizes that students rated as having low and medium ability levels may have

excelled on this portion due to the hands-on manipulation of the test. The hands-on

problem solving test produced results, but the conclusions were questionable due to

the small sample size.

Based on the findings, students rated as having high, medium and low

problem solving ability by their teachers improved their problem solving ability

within the experimental group as opposed to the control group. However, determining

which ability group improved its problem solving ability the most was difficult,

due to the small sample.

Hypothesis 7

Elementary students who participate in a computer troubleshooting curriculum

will demonstrate greater improvements in problem solving ability than students who

did not participate in the program.


The mean total POPS score was computed for each group. Comparing the

pre-test total score to the post-test total score, the experimental group improved by 20%,

while the control group only improved 3%. The students in the experimental group

demonstrated a larger improvement in all sections of the POPS test, except the

correctness of answer section. A paired samples t-test was run on the total POPS

score data, and a significant difference existed between the experimental group's

scores and the control group's scores (See Appendix S). The students in the

experimental group scored significantly higher than the control group.
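As an illustration of the computations described in this section, the following is a
minimal sketch (in Python, using the SciPy library) of how a group's percent
improvement in mean score and a paired samples t-test could be carried out. The
score lists below are hypothetical placeholders, not the study's data; the actual
SPSS output appears in Appendix S.

    # Minimal sketch of the analysis described above. The score lists are
    # hypothetical placeholders, not the study's data (see Appendix S).
    from scipy import stats

    pre_scores = [20, 25, 30, 22, 28, 27]    # hypothetical pre-test POPS totals
    post_scores = [26, 29, 36, 25, 33, 37]   # hypothetical post-test POPS totals

    # Percent improvement of the group mean from pre-test to post-test.
    pre_mean = sum(pre_scores) / len(pre_scores)
    post_mean = sum(post_scores) / len(post_scores)
    print(f"Mean improvement: {(post_mean - pre_mean) / pre_mean * 100:.0f}%")

    # Paired samples t-test: tests whether the mean of the paired pre/post
    # differences is significantly different from zero.
    t_stat, p_value = stats.ttest_rel(pre_scores, post_scores)
    print(f"t = {t_stat:.3f}, p (2-tailed) = {p_value:.3f}")

A paired test is used here because each student's post-test score is compared with
that same student's pre-test score.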

In the hands-on problem solving test, the experimental group increased their

average number of attempts during both problems from the pre-test to the post-test,

while the control group decreased in their average number of attempts. The difficult

problem was solved by 66% of the experimental group, while only 33% of the control

group was able to solve the difficult problem. The findings from the hands-on

problem solving test were inconclusive because, while the experimental group

contained more students who solved the difficult problem, the numbers were too

small to support any firm conclusion.

The group interview provided information about whether students believed

computer troubleshooting made an impact on their problem solving skills. Five of the

six students in the experimental group believed that the computer troubleshooting

training made a difference in their problem solving skills. Relationships were

established between key elements of problem solving and computer troubleshooting

by the students' verbal and written responses.


Based on the evidence found in the study, the researcher believes computer

troubleshooting training made a difference in the general problem solving skills of the

experimental students. The author concludes that the computer troubleshooting

sessions had a positive educational impact on elementary students' problem solving

abilities.

Recommendations for Further Research

The researcher encountered many positive occurrences throughout the thesis

process. The Human Subjects Institutional Review Board (HSIRB) at Western

Michigan University was extremely helpful in assisting in the research design

process. The Board offered to meet with the researcher on several different

occasions, constantly offering suggestions and educating the researcher on research

design. Western Michigan University also loaned three surplus computers to

the study, in order for the students to gain more hands-on experience with the internal

hardware components.

The Dearborn Public School District was helpful in offering Haigh

Elementary School as a setting for the research project. The building and resources

were available to the researcher, as well as the supervision of an educator in the

Dearborn Public Schools. Many students were interested in participating in the

research study. The principal, as well as several parents stated that the study was a

"wonderful educational opportunity for the children". The overall response to the

study was overwhelming and positive.


The setting for the research study, located inside a classroom at Haigh

Elementary School, was ideal and provided a chalkboard, desks, paper, pencils and

additional working computers. Students had constant hands-on opportunities to work

with the computers because of the additional computers from the University and the

computers already located in the classroom.

The curriculum and the instructor were additional positive factors in the study.

The curriculum was tailored to an elementary student level, incorporating extensive

amounts of hands-on activities and simplified instruction. The instructor was A+

certified and a certified elementary educator, meeting the instructional needs of the

students.

The researcher believes in the strong relationship between computer

troubleshooting and problem solving skills. The computer troubleshooting training

has multiple implications for the elementary curriculum, as well as for providing

assistance to teachers in the computer labs. However, there were many limitations

within the research project that should be addressed in future research.

The sample for this study was selected for convenience. With any

study, the more subjects that participate, the greater the ability to generalize and the

more credible the statistical tests are in determining the findings. Since the student

sample in this study was small, only twelve students, the findings could not be

generalized widely. The study did not allow for a completely random sample, and

therefore, cannot be generalized to the entire fifth grade population in Dearborn. The

location, demographics of the students, and other variables also limited the
implications of the research. By limiting the current study to only six students in each

group, the researcher was able to provide ample amounts of hands-on, individualized

instruction, as well as manage the large amounts of data collected from each student.

The researcher also encountered difficulties gaining access to students, but was

grateful to the Dearborn Public Schools for allowing the research project to take

place. The time available to assemble and prepare the sample groups was limited

based on the researcher's need to complete the study before the end of the school

year. Future studies should consider implementing a sample size of one hundred

twenty subjects, more time to conduct the training, and more resources for the

collection of the data.

The matching of students based on the disaggregates within each group was

an additional challenge. The students were matched according to gender, attendance

records and problem solving ability. Gender was not found to be a significant

variable in the findings of the study. Attendance, used to exclude participants from

the research project, was not a factor in the study. The researcher assigned students

as having high, medium, or low problem solving ability, based on ratings from their

classroom teachers. The sample was taken from three separate fifth grade

classrooms. Since there was no set definition of high, medium or low ability,

teachers created the ratings based on their own classroom observations and

assessments. Pre-testing students before assigning groups would help

to control this variable. Through pre-testing, findings from the research

could be focused more on directly comparing total scores, rather than comparing the

improvement in points for each form of assessment. The researcher would

recommend in future studies that an appropriate problem solving test be used to

categorize students.

Problem solving skills were difficult to monitor, assess and evaluate. The

most common form of analysis requires subjects to verbally state every step they use

during the problem solving process and the reasoning for those processes. However,

due to time constraints and the difficulty elementary students have articulating their

processes and reasoning, the researcher believed videotaping a hands-on problem

solving test would be a more valid and effective measure of problem solving skills.

The researcher was unable to locate many problem solving tests, and the testing

methods retrieved from ETS (Educational Testing Services) were not endorsed.

The computer troubleshooting curriculum used in this study also presented

additional limitations. Because the curriculum was designed and taught by the

researcher, certain biases of the researcher were likely to be a part of the study. In

future studies, care should be taken to eliminate all possible researcher biases. The

researcher feels that curriculum revisions based on suggestions and videotaping of the

instructional sessions could be used to improve the curriculum.

A reliable, cost-efficient hands-on problem solving test was the most difficult

test to identify. The researcher developed a hands-on problem solving assessment

involving the use of tangrams. Hands-on intelligence quotient tests were available,

but due to the cost of the tests and lack of funding, alternate tests were developed by
the researcher. The researcher would recommend that additional effort be made for

finding a more appropriate hands-on problem solving test for future research.

The IOWA Basic Skills test and the Profiles of Problem Solving test were

retrieved from ETS (Educational Testing Services) but were not endorsed. The

researcher attempted to select tests that had been used in multiple dissertations, theses

and research projects, to improve the credibility of the study. The researcher would

recommend that additional effort be made to find more appropriate standardized tests

for future research.

The researcher was also under time constraints, such as the end of the school

year, HSIRB deadlines and University deadlines for the project. Future research

projects should expand the time frame to four weeks. If students were provided a

longer training period and more tailored instruction, they might be able to

achieve scores on standardized tests indicating a greater increase in problem solving.

Final Conclusions

Based on the evidence presented in this study, elementary students are capable

of learning how to troubleshoot common computer problems. The most

difficult part of the problem solving process for elementary students is

"understanding the question" or the problem. When students are provided with a

concrete hands-on problem, it is much easier for the student to establish a problem

and develop strategies and methods to solve the problem. The computer becomes a

valuable tool for the student to manipulate, simplifying the establishment of the

problem and the problem solving procedure. The computer troubleshooting process
also provides students with immediate feedback as to whether they successfully

solved the problem. The researcher believes there is a definite relationship between

computer troubleshooting skills and general problem solving skills. The similarities

of identifying the problem, devising a solution and fixing the problem successfully

exist in both computer troubleshooting and other forms of problem solving. The

computer troubleshooting training has multiple implications for the elementary

curriculum, as well as providing assistance to teachers in the computer labs. With

further research, computer troubleshooting has the potential to enhance the

problem solving learning experiences in elementary schools.


BIBLIOGRAPHY

Bodner, G. & Domain, D. (2000). Mental Models: The Role of Representations in


Problem Solving in Chemistry. University Chemistry Education, 4 (1) 22-28.

Burge, K. (2001). UCI Computer Arts: Building Gender Equity while Meeting !STE
NETS. Paper presented at the In Building the Future, NECC 2001, Chicago, IL.

Casey, B. & Tucker, E. (1994). Problem-Centered Classrooms. Phi Delta Kappan,


76 (2), 139-143.

Chaika,G. (1999). Technology in Schools: Some say it doesn't compute! Education


World. Retrieved April 1, 2003, from http://www.education-
world.com/a admin/admin12 l .shtml

Chang, S., Hanson, B., & Harris, D. (2001). A Comparison of the Standardization
and IRT Methods of Adjusting Pretest Item Statistics Using Realistic Data. Paper
presented at the Annual Meeting of the Educational Research Association, Seattle,
WA

Chapman, B. & Allen, R. (1994). Teaching Problem Solving Skills Using Cognitive
Simulations in a PC-Environment. Journal of Interactive Instruction Development,
Spring, 24-30.

Coleman, C., King, J., Ruth, M., & Stary, E. (2001). Developing Higher-Order
Thinking Skills through the Use of Technology. Master of Arts Action Research
Project, Saint Xavier University and Skylight Professional Development Field-Based
Mater's Program.

Coley, R. J. (1997) Computers and Classrooms: The Status of Technology in US.


Schools. Educational Testing Services.

Custer, Rodney. (1999). Design and Problem Solving in Technology Education.


NASSP Bulletin, 83 (608), 24-33.

Du, Y. (2002). Sampling Theory and Confidence Intervals for Effect Sizes: Using
ESCJ to Illustrate "Bouncing" Confidence Intervals. Paper presented at the Annual
Meeting of the Southwest Educational Research Association, Austin, TX.

Eisenberg, M., & Johnson, D. (2002). Leaming and Teaching Information


Technology- Computer Skills in Context. US.; New York; Office of Educational
Research and Improvement. 6.

159
160

Fey, Marion Harris. (2001). Gender and Technology: A Question of Empowerment.


Reading and Writing Quarterly: Overcoming Learning Difficulties, 17 (4), 357-61.

Frantom, C., Green, K. & Hoffinan, E. (2002). Measure Development: The


Children's Attitudes toward Technology Scale (CATS). Journal of Educational
Computing Research, 26_ (3), 249-63.

Goldin, G. (1992). Meta-analysis of Problem Solving Studies: A Critical Response.


Journal for Research in Mathematics Education, 23, (3) 274-283.

Goldman, S., Cole, K. & Syer, C. (1999). The Technology/Content Dilemma. Paper
presented at the Secretary's Conference on Educational Technology-1999. Retrieved
April 2, 2002 from
http://www.ed.gov/Technology/TechConf/1999/whitepapers/paper4.html

GreatSchools Inc. (2003) School information for Haigh Elementary School in


Dearborn Michigan. Retrieved April 1, 2003 from http://www.greatschools.net/

Hembree, R. (1992). Experiments and relational studies in problem solving: A meta­


analysis. Journal for Research in Mathematics Education, 23, (3) 242-273.

Howard, B., McGee, S., & Shin, N. (2001). The triarchic theory of intelligence and
computer-based inquiry learning. Educational Technology Research and
Development, 49 (4), 49-69.

Jereb, Janez. (1996). The Technical Problem and Its Didactic Function. Paper
presented at the Jerusalem International Science and Technology Education
Conference on Technology for a Changing Future: Theory, Policy and Practice.

Johnson, Scott. (1995). Understanding Troubleshooting Styles to Improve Training


Methods. Paper presented at the American Vocational Association Convention.

Jonassen, David. (2000). Toward a design theory of problem solving. Educational


Technology Research and Development, 48 (4), 63-85.

Kids Domain Computer Connections (2002). Retrieved February 6, 2003 from


http://www.kidsdomain.com/brain/computer/lesson.html

Kirkpatrick, H. & Cuban, L. (1998) Computers make kids smarter - right? Technos
Quarterly, 7 (2).

Koszalka, T., Song, H, & Grabowski, B. (2002). Examining Learning


Environmental Design Issues for Promoting Reflective Thinking in Web-Enhanced
PBL. Paper presented at the Annual Meeting of the American Educational Research
Association.
161

Krebs, D. & Clark, B. (2000). Camp Invention: connects to classrooms. Gifted Child
Today, 23 (3), 28-32, 52.

Kurland, D. (1986) A Study of the Development of Programming Ability and


Thinking Skills in High School Students. Journal of Educational Computing
Research, 2 (4), 429-458.

Lavonen, J., Meisalo, V. & Lattu, M. (2001). Problem Solving with an Icon
Oriented Programming Tool: A Case Study in Technology Education. Journal of
Technology Education, 12 (2), 21-34.

Lee, Lung-Sheng Steven. (2000). Technology Education Reform in Taiwan. Paper


presented at the Annual Meeting of the Australian Council of Education through
Technology and Biennial International Conference on Technology and Education.

Lee, Lung-Sheng Steven. (1996). Problem-solving as Intent and Content of


Technology Education. Paper presented at the Annual Meeting of the International
Technology Education Association.

MacPherson, Randall T. (1998). Factors Affecting Technological Trouble Shooting


Skills. Journal of Industrial Teacher Education, 35 (4), 5-28.

McGrath, D. (1988). Programming and problem solving; Will two languages do it?
Journal of Educational Computing Research, 4 (4), 467-484.

Michael, K. (2001). The Effect of a Computer Simulation Activity versus a Hands­


on Activity on Product Crativity in Technology Education. Journal of Technology
Education, 13, (1), 31-43.

Olkun, S. (2003). Comparing Computer versus Concrete Manipulatives in Leaming


2D Geometry. Journal of Computers in Mathematics and Science Teaching, 22, (1),
43-56.

Pibum, M. (1993). Evidence from Meta-Analysis for an Expertise Model of


Achievement in Science. Paper presented at the Annual Meeting of the National
Association for Research in Science Teaching, Atlanta, GA.

Poris, Steven. (2000). Effects of Computer-Based Cooperative Leaming on the


Problem Solving Skills of Grade Six Students. Dissertation.com. (ISBN: 1-58112-
101-6). Retrieved March 20, 2003, from
http://www.dissertation.com/library/1121016z.htm

Sherwood, R. (2002). Problem-Based Multimedia Software for Middle Grades


Science: Development Issues and an Initial Field Study. Journal of Computers in
Mathematics and Science Teaching, 21, (2), 147-165.
162

Suomala, J.& Alajaaski, J. (2002). Pupils' Problem-Solving Processes in a Complex


Computerized Leaming Environment. Journal of Educational Computing Research,
26 (2) 155-76.

Taconis, R., Ferguson-Hessler, M., & Broekkamp, H. (2000). Teaching Science


Problem Solving: An overview of experimental work. Journal of Research in
Science Teaching, 38, (4), 442-468.

The Computing Technology Industry Association, Inc. (2003) The Computing


Technology Industry Association. Retrieved February 26, 2003 from
http://www.comptia.org/certification/Ndefault.asp.

United States Department of Commerce. (2003). Census of Dearborn Michigan.


Retrieved April 13, 2002, from http://www.census.gov.

Waetjen, Walter. (1993). Entropy and Technological Leaming: A Cognitive


Approach. Journal of Technology Studies, 19 (2), 29-40.

Wenglinsky, H. (1998) Does it Compute? The Relationship Between Educational


Technology and Student Achievement in Mathematics (Report no. ED 425 191).
Educational Testing Services.

Williams, D., Liu, M., & Benton, D. (2001). Analysis of Navigation in a Problem­
Based Learning Environment. Association for the Advancement of Computing in
Education.

Wilson, P. Fernadez, M. & Hadaway, N. (1993). Research Ideas for the Classroom:
High School Mathematics. New York: MacMillian.

Wonocott, Michael. (2001). Technology Literacy. ERIC Clearinghouse on Adult,


Career, and Vocational Education, Columbus, OH.

Zuber-Skerritt, 0. (2002). The concept of action learning. The Learning


Organization, 9 (3), 114-124.
Appendix A

Human Subjects Institutional Review Board


WESTERN MICHIGAN UNIVERSITY

1903-2003 Celebration

Date: May 7, 2003

To: Howard Poole, Principal Investigator
    Ann Ottenbreit, Student Investigator for thesis

From: Mary Lagerwey, Chair

Re: HSIRB Project Number 03-04-04

This letter will serve as confirmation that your research project entitled "Effects of
Computer Troubleshooting on Elementary Students' Problem Solving Skills" has been
approved under the full category of review by the Human Subjects Institutional Review
Board. The conditions and duration of this approval are specified in the Policies of
Western Michigan University. You may now begin to implement the research as
described in the application.

Please note that you may only conduct this research exactly in the form it was approved.
You must seek specific board approval for any changes in this project. You must also
seek reapproval if the project extends beyond the termination date noted below. In
addition, if there are any unanticipated adverse reactions or unanticipated events
associated with the conduct of this research, you should immediately suspend the project
and contact the Chair of the HSIRB for consultation.

The Board wishes you success in the pursuit of your research goals.

Approval Termination: April 16, 2004

Walwood Hall, Kalamazoo, MI 49008-5456
Phone: (269) 387-8293  Fax: (269) 387-8276
Appendix B

Student Assent Form


Student Assent Readings

You have been asked to participate in a study entitled "Computer Troubleshooting

and Effects on Elementary Problem Solving." The purpose -of the study is to see if the

training with the Computer Troubleshooting and Repair program will help you with your

problem solving skills and math problem solving skills.

Before the training starts, you will be tested using 2 standardized tests, 1 hands-on

problem solving activity and a survey to tell how you feel about problem solving. You

will also be tested after the program using the 2 standardized tests again, the hands-on

problem solving test and an interview, by Mrs. Ottenbreit, to see if you improved in

problem solving. Even if you agree today to participate by signing this form, you can

change my mind at any time when we begin training or at any time during the training.

You are volunteers and you are free to stop participating whenever you want and there

will be no penalties for quitting.

Your name will not be on any of the forms or videotapes. The researcher will use

a code number instead. The researcher will keep a list of names and code numbers that

will be destroyed once the researchers have recorded the important information. If you

have any questions or concerns about this study, you may contact either Anne Ottenbreit

at 313-516-6217 or Dr. Howard Poole at 269-387-6050 or Sharon Ottenbreit at 313-730-

3130.
Appendix C

Parental Permission Slip (Letter A)


WESTERN MICHIGAN UNIVERSITY


H. S. I. R. B.
Approved for use for one year from this date:

APR 16 2003

HSIRB Chair
Dear Parents,

The fifth grade students at Haigh Elementary School have the opportunity to participate
in a research project conducted by a graduate assistant in educational technology from
Western Michigan University during the month of May. This research project will
include 12 5th grade students who volunteer to be in the study. However, only six
students will be selected for the experimental group, which are the students who will be
assigned to receive the training, and 6 students will be selected to participate in the
control group, which will not participate in the training sessions. It is important to have
some students who are trained and some who are not trained so we can assess the value
of the training. A selection process will be used to choose the six students to participate
in the project if more than 12 students volunteer. The selection process will be performed
by the researcher, based on the following criteria: gender, attendance records, and
problem solving ability. The control group will be an important part of the project,
measuring against the experimental group to see if the training program can make a
difference.

The researcher believes the training sessions will have an effect on the students' problem
solving skills and math ability. These children will learn about the different parts of a
computer and their functions and will be taught to troubleshoot if the computer is not
working. They will be using a hands-on approach and working with an actual computer.
The children will be instructed by the Western Michigan University graduate assistant
and supervised by Mrs. Ottenbreit. It will require that these students attend a two-week
course, five days a week from May 19th through May 30th, 2003. The computer training
will be done in Room 4, Mrs. Ottenbreit's room, from 8:00 A.M. until 8:45 A.M. Prior to
and at the end of the training sessions, these children will be tested to see if the computer
training has improved their problem solving skills as opposed to those that did not receive
the training. Your child must be able to attend all ten sessions.

All students who volunteer for the study will be provided with a Creative Computer
Night, making their own animated/narrated story on CD. Each student will be able to
keep the CD. This should be a wonderful opportunity and learning experience for your
child!

The 12 students who participate in the study will be assessed on their problem solving
skills and math problem solving skills. The pre- and post-tests will take about one hour
and will be done in a relaxed and positive testing climate. The testing results will be kept
confidential and the students' names will not be used in the research's report. The results

WESTERN MICHIGAN UNIVERSITY


H. S. I. R. B.
Approved for use for one year from this date:

APR 16 2003

HSIRB Chair

of the training sessions and the project will be shared with the classroom teachers;
however, no individual student results will be released to the teacher.

If you have any questions, please contact Mrs. Ottenbreit at 730-3130. Please return the
permission slip below by Friday, May 16th, 2003. Thank you.

Sincerely,

---------------------------------------------------------------------------------------------------
Problem Solving Research Project
Western Michigan University

Student

__ I give my child permission to participate in this research


project. I realize that my child will need to participate in testing and attend all ten
training sessions from 8:00 AM until 8:45 AM, Mondays through Fridays,
beginning the week of May 19th through the week of May 30th, 2003 in Room 4 at
Haigh Elementary School.

__ I do not want my child to participate in this project.

____________
Date

__ My child has safety


patrol duty during
these two weeks.
Please try to make
arrangements so my
child can participate.
Appendix D

Parental Consent Form (Letter B)


WESTERN MICHIGAN UNIVERSITY


H. S. I. R. B.
Approved for use for one year from this date:

APR 16 2003

HSIRB Chair
Dear Parent/Guardian,

Your child has been invited to participate in a research project entitled "Effects of Computer
Troubleshooting and Repair on Elementary Problem Solving Skills." The purpose of the study is
to determine the usefulness of computer troubleshooting and repair curriculum in preparing
elementary students in problem solving skil1 development. This project is being conducted to
fulfill Anne Ottenbreit's thesis requirement.

Your permission for your child to participate in this project means that your child will be
administered the IOWA Basic Math Skills Test and the Profiles of Problem Solving
Standardized Test. The testing will take place during May and will involve about one hour. Your
child will also be administered a hands-on problems solving test twice as a pre- and post-test.
The process will be videotaped in order to document problem solving skills. All tests will be
conducted in a positive testing environment by Sharon Ottenbreit. Your child will also be taken
through computer training sessions, which will last 2 weeks starting on May 19th. Your child
will be free at any time -- even during the test administration -- to choose not to participate. If
your child refuses or quits, there will be no negative effect on his/her school programming. The
test results will be used to establish a baseline data collection providing the researcher with
information on current levels of problem solving. The results will help with subject selection for
the control group and experimental group. Although there may be no immediate benefits to your
child for participating, there may eventually be benefits to the school district and subsequently to
students in technology education programs. The researcher believes the training sessions may
have a positive effect on problem solving skills. If the results of the actual project are found to
be useful, then the current technology education program could be modified to include repair and
troubleshooting within the curriculum.

All test data and information will remain confidential. That means that your child's name will be
omitted from all test forms and a code number will be attached. The principal investigator will
keep a separate master list with the names of the children and the corresponding code numbers.
If the researchers find that these two tests are useful for planning your child's programming, they
will share the results with your child's teacher, unless specified otherwise. Once the data are
collected and analyzed, the master list will be destroyed. All other forms will be retained for at
least three years in a locked file in the principal investigator's office. No names will be used if
the results are published or reported at a professional meeting.

The only risks anticipated are minor discomforts typically experienced by children when they are
being tested (e.g., boredom, mild stress owing to the testing situation). All of the usual methods
employed during standardized testing to minimize discomforts will be used in this study.
The other possible risks are those involved in computer repair (e.g., minor scrapes, and in
extremely rare cases, electrostatic shock). All computer safety methods will be exhibited by the
instructors. More information can be provided if requested on the exact safety measures taken by
Appendix E

Additional Information Requested by Parents


Risks to Subjects
Potential risks to the subjects are as follows:
1. Subjects could miss academic learning time in the classroom in order to
participate in brief testing.
2. Students could experience frustration or boredom during the testing and/or
training.
3. Subjects could feel uncomfortable about videotaping in the classroom.
4. Students could experience minor scrapes due to the hands-on nature of the
program.
5. Students, in extreme rare cases, could receive an electric static shock.

Protection of Subjects
Possible risks to subjects are extremely limited due to the extensive safety precautions
taken. Student risks could include slight boredom from the tests. Other risks could
include minor scrapes from computer edges and, in severe cases, slight electric static
shocks. However, the instructor has carefully designed the curriculum to reflect safety
precautions expressed in the A+ technician requirements. The instructor is a certified
elementary educator and a certified computer technician who has built her own computer,
and is able to monitor the safety of the small experimental group. There will also be an
additional elementary school teacher to assist with safety during the training
sessions and testing sessions.

In order to minimize the potential risks listed above, the following precautionary measure
will be taken:
1. The researcher will attempt to limit testing time, or utilize a period in the day
when students would be conducting non-learning activities. The training will be
taking place before school, so training will not use academic learning time in the
classroom.
2. The subjects will be reminded that the test needs to be completed to the best of
their ability. The teacher will reiterate: "This is not a grade; I just want to find out
what you already know, so I don't teach you something you already know."
The teacher will also monitor subjects for signs of boredom or frustration during
test-taking procedures. Students experiencing these problematic feelings will be
allowed to take a break and try again. If this does not succeed, students may stop
taking the test. The instructor will monitor students experiencing boredom or
frustration during the training procedures. The difficulty of the program will be
adapted accordingly. The students are broken up into pairs and this is easily
manageable, plus the grouping may keep otherwise bored/frustrated students
engaged.
3. The videotape will be set up prior to taping so as not to disrupt the training. If
the student continues to feel uncomfortable, the instructor will move the video
camera to a hidden location.
4. Students will be instructed on the first day how to handle the computer
equipment. Every time before they are allowed to interact with the machine, they
will be reminded of the metal parts and how to safely manipulate the equipment.

The instructor will model careful manipulation of the materials, to illustrate to the
students how to correctly handle the equipment. Band-aids and first aid kits will
be kept on hand at all times.
5. Students will be instructed on the first day how to handle the equipment. Students
will remember to ground themselves, which requires touching a table or an object
which is not electrostatic charged and to put the ESD strap on, which grounds
them to a table, before working on the computer. There is no supposed risk if the
computer is not plugged in. The instructor will remove all power cords prior to
the beginning of the training and place them in a safe container in the room.
When students reach the point of training to anticipate whether they can
troubleshoot the hardware problems, the instructor will then place the power cord
back into the computer, model the safety techniques, ensure the students complete
the same procedure and then work together. In case of extreme rare
circumstances, a cellular phone will be kept on person at all times, as well as
student emergency slips, in case of emergency.
The instructor is a certified computer technician through CompTIA A+ certification
training and testing. The A+ curriculum learning objectives included:

• This domain requires the knowledge of safety and preventive maintenance. With regard to safety,
it includes the potential hazards to personnel and equipment when working with lasers, high
voltage equipment, ESD, and items that require special disposal procedures that comply with
environmental guidelines. With regard to preventive maintenance, this includes knowledge of
preventive maintenance products, procedures, environmental hazards, and precautions when
working on microcomputer systems.
• Identify the purpose of various types of preventive maintenance products and procedures and
when to use and perform them: Liquid cleaning compounds; Types of materials to clean contacts
and connections; Non-static vacuums (chassis, power supplies, fans).

Identify issues, procedures and devices for protection within the computing environment, including people,
hardware and the surrounding workspace: UPS (Un-interruptible Power Supply) and suppressors;
Determining the signs of power issues; Proper methods of storage of components for future use; Potential
hazards and proper safety procedures relating to lasers, high-voltage equipment, power supplies, and CRTs.
Special disposal procedures that comply with environmental guidelines: Batteries; CRTs; Toner kits/cartridges;
Chemical solvents and cans; MSDS (Material Safety Data Sheet). ESD (Electrostatic Discharge)
precautions and procedures: What ESD can do, how it may be apparent, or hidden; Common ESD
protection devices; Situations that could present a danger or hazard.
(http://www.comptia.org/certification/A/default.asp)
Appendix F

Cover Sheets for Tests


Student Name:

Code Number:

Circle Test:

IOWA Math Skills


Pre-Test / Post-Test

Problem Solving Standardized Test

Pre-Test / Post-Test

Hands-On Problem Solving Test

Pre-Test / Post-Test

Survey

Interview
Appendix G

Survey


What do you think about problem solving?

ID Number: -----------------

Date: ________________

Directions: Below are some statements about your own thinking about problem solving.
There are no right or wrong answers. Use the following scale and write the
number which best describes how much the statement is like you. Please
answer honestly and do not skip any statements.

1 = Not me at all    2 = Very little like me    3 = A little like me
4 = Kind of like me    5 = A lot like me    6 = Describes me perfectly

_ _ 1. Before trying to solve a problem I try to compare it to one that I have solved
before.

__ 2. Before trying to solve a problem I identify as many pieces of information that


might be needed for problem solution.

3. I can figure out how to solve a problem without making a plan.

4. I have trouble in solving a problem when I do not know what information is


important from what is not.

__ 5. Before trying to solve a problem I say the information over again in my own
words.

__ 6. Before trying to solve a problem I think of a strategy that might lead to a


problem solution.

__ 7. When I am stuck on a problem I ask myself, "Did I look at all of the important
information in the question?"

Questions 8, 9 and 10 will help me figure out how you solve problems. Answer them in
your own words.

8. What are some of the ways you solve a problem? ____________



9. What is the hardest part about solving a problem? _____________

10. When you don't know what the solution is, what can you do? --------
Appendix H

Interview


What do you think about problem solving?


The Interview

ID Number:

Date: ------------- ----

INTERVIEW FORMAT

1 = Not me at all    2 = Very little like me    3 = A little like me
4 = Kind of like me    5 = A lot like me    6 = Describes me perfectly

The interviewer will fill in this information for the student.

__ 1. Before trying to solve a problem I try to compare it to one that I have solved
before.

__ 2. Before trying to solve a problem I identify as many pieces of information that


might be needed for problem solution.

3. I can figure out how to solve a problem without making a plan.

4. I have trouble in solving a problem when I do not know what information is


important from what is not.

_ _ 5. Before trying to solve a problem I say the information over again in my own
words.

__ 6. Before trying to solve a problem I think of a strategy that might lead to a


problem solution.

__ 7. When I am stuck on a problem I ask myself, "Did I look at all of the important
information in the question?"

8. What are some of the ways you solve a problem?


9. What is the hardest part about solving a problem?
10. When you don't know what the solution is, what can you do?
11. Do you think that learning how to troubleshoot a computer helped you with your
problem solving skills? Why or why not?
Appendix I

Code Sheet


Code Sheet
Access Allowed to Dr. Howard Poole, Anne Ottenbreit and Sharon Ottenbreit

Students will be placed into the slots as they turn in their permission slips to participate in
the project. The control group will be the last six, and will be decided after the testing.
The testing information will be kept in a separate database, assigning information to the
student identification numbers only.

Student Name Student Identification E=Experimental Group


Number C=Control Group
12 E

10 E

6 E

8 E

4 E

2 E

7 C

5 C

1 C

11 C

9 C

3 C
Appendix J

Observational Rubric of Hands-On Problem Solving Test


Observation of Hands-On Problem Solving Test


(Form ____)
Circle: Pre-Test/ Post-Test
Student Code Number:

First Problem (Easy)


NUMBER OF
ATTEMPTS 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15+

How many times did


the student clear the
board and try a new
approach?

AMOUNT OF TIME

How long did the 0:15 0:30 0:45 1:00


student take to fully 1:15 1:30 1:45 2:00
solve the problem? 2:15 2:30 2:45 3:00
3:15 3:30 3:45 4:00
(Measured in minutes) 4:15 4:30 4:45 5:00
Students receive 5
minutes for each
problem. Did not solve at all --

Did not solve correctly __

Solved correctly __

METHODS

List some methods the student


was using to solve the problem.

Second Problem (Difficult)


NUMBER OF
ATTEMPTS 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15+

How many times did


the student clear the
board and try a new
approach?

AMOUNT OF TIME
0:15 0:30 0:45 1:00
How long did the 1:15 1:30 1:45 2:00
student take to fully 2:15 2:30 2:45 3:00
solve the problem? 3:15 3:30 3:45 4:00
4:15 4:30 4:45 5:00
(Measured in minutes)
Students receive 5
minutes for each Did not solve at all --
problem.
Did not solve correctly __

Solved correctly __

METHODS

List some methods the student


was using to solve the problem.
Appendix K

Group Interview and Survey Response


GROUP INTERVIEW: Transcribed Conversation

RESEARCHER (To the group): The first question is: Read. So... if that's kind of like you,
go ahead and fill it out like it says on the top. Do you guys want to fill out the first seven
yourself and then we'll talk about the last few together?
(To another student) You're all done... I have one more thing for you.
(To the group) When you're done, can you go ahead and flip it over so I know you're all
done?
Talk about bathroom ...
Can you wait just a couple more minutes?
11 walks in.
RESEARCHER: You made it just in time.
2: Why did you come in now?
11: I forgot.
The noise made the talking between the students difficult.
RESEARCHER (to the second graders): Second graders, can you do me a big favor?
Can you keep your voices down because we are trying to do a videotape back here? So
can you do whatever work you are supposed to do, you're supposed to be studying your
spelling words? Thank you.
RESEARCHER (to the group): So turn it over now. Number 8, if you turn to the first
page, the back of the first page. It's right on the top line there. Go ahead and write down
a few things. Then we're going to share them as a group. Go ahead and flip it over when
you're done and then we'll talk about it later. So what are some different ways that you
guys try to solve problems?
4: I look for all of the things I need and then I try and figure out the problem
You guys can just put your pens and stuff down. We're just talking right now.
8: I try and find out what strategies to use.
RESEARCHER: Ok what kind of different strategies do you use?
8: Like um whether to multiply, or add ...
RESEARCHER: Ok so you're trying to extract out all the information.
2: I try and find out all the information that I need and I sometimes ...

9: Find out number sentences


RESEARCHER:
1: Reading
3: Reading
11: Um... I look at all the information and see what it is telling me to do
4: I look at the problem and see what information I need to solve the equation or to solve
the information
6: Reading
RESEARCHER: Alright, sounds good. Anybody else... What are some ways that you
solve problems? Anything different?
9: Guess and check. Trial and error. I try to work the problem backwards
RESEARCHER: Alright, so you've got lots of different ways to solve problems. Alright.
What about number 9? Read it. What do you think the hardest part is? Go ahead and
write it down and then we'll share.
RESEARCHER: Did you already finish this one?
2: (Shakes her head.)
RESEARCHER: Alright. So what do you guys think the hardest part about solving a
problem is?
4: Maybe the problem doesn't give enough information so you can solve it
5: Not being able to figure out what the problem is asking for and sometimes making a
plan to figure out what strategies to use
?: Not enough information; figuring out what it's asking for. Figuring out what it's asking
you to do.
RESEARCHER: Alright so just figuring out what kind of strategy you are going to use
11: Not enough information.
RESEARCHER: Not enough information. Is it usually that there's just not enough
information, or it's hard to figure out what you're using the information for
11: Hard to figure out
RESEARCHER: What the heck they want you to do, right?
2: What they want you to find out of the information?
RESEARCHER: Ok so sometimes the question at the end just doesn't make sense

8: You don't know what the question is asking you to do?


RESEARCHER: So we've all got pretty much the same. It's just the questions are hard
to figure out exactly what they want us to do. After that we can do it; no problem right?
We make a number sentence or draw a picture. Then we can solve it. Go ahead and go
to number 10.
11: I have 2 8 and 9's, I don't have a number 10.
RESEARCHER: Here you can have mine.
RESEARCHER: So when you don't know what the solution is, what can you do?
Are you all done? So when you don't know what the solution is, what can you do?
5: Make a plan to figure out
RESEARCHER: OK. What kind of plan would you make?
Then multiply or ... ?
RESEARCHER: Ok. So just try and take as much information as you can.
You can make
4: Read.
Sounds good
9: Read the problem over and over again.
RESEARCHER: OK
3: Ask for help
2: You can guess and check.
RESEARCHER: How about you 7?
RESEARCHER: The last one is number 11. Read. This is just for the kids who did the
actual troubleshooting, the actual computer training, but if you want to, tell me what you
think, if you think it would make a difference. If you want to. You don't have to.
9: Can we put a line through it?
RESEARCHER: Yeah. You can just put a line through it.
7: No.
RESEARCHER: Why do you think that?
7: Because you were just taking apart a computer it wouldn't really help problem
solving.

4: I think yes because we actually learned like, cause you didn't help us that much. You
just kind of took apart the computer and we had to think of all the parts that were missing
and stuff.
8: I had to figure out what was wrong with the computer. And it was a lot like trying to
figure out the problem.
2: Yes, because we had to figure out what the problem was. (reading)
12: No, not really, because it was just taking apart the computer.
6: I think that it would help with like strategies and stuff because like we had to use
different strategies. (reading)
RESEARCHER: OK so different strategies that you had to use. Why don't you guys try
to tell them a little bit about what we did?
4: The first couple days we were just studying like what parts of the computers were the
computers. We took apart the computers and um ... and put them back together. Then
like the last day, she took apart a computer and we had to put it back together with all the
parts.
6: We had to go to like stations and we had to figure out what it was and fix it.
RESEARCHER: You had to fix it. You had to figure out what it was first and then you
had to fix it. So you had to identify the problem.
2: We had to install and uninstall a program.
RESEARCHER: Did the people who went through the computer troubleshooting, did
you guys have fun doing that?
All: Yeah.
RESEARCHER: Do you think that if your teacher had a problem with the computer that
you could fix it?
All: Yeah.
RESEARCHER: So now you can help your teacher out in lab?
4: We did it in like art. The printer wasn't working; it was the same problem here. So we
pressed the button and it worked.
RESEARCHER: Alright there you go. That's fantastic guys! Thanks a lot. I really
appreciate it.
Appendix L

Survey and Interview Rubric


Answer category                        Frequency of students' answers

Identifying important information                    3
Understanding the question                           8
Looking back                                         2
Lack of information                                  2
Method or strategy                                   3
Appendix M

IOWA Test Review


Test Name: Iowa Tests of Basic Skills Forms KL and M


Test Author: Hoover, H. D.; Hieronymus, A. N.; Frisbie, D. A.; Dunbar, S. B.
Publication Date: 1955-1996
Scores: Vocabulary, Listening, Language, Language Total, Mathematics, Core Total, Word Analysis
(optional), Mathematics Advanced Skills, Mathematics Total, Reading Advanced Skills, Reading Total,
Reading, Listening Language, Mathematics Concepts, Mathematics Problems, Mathematics Computation
[ optional], Social Studies, Science, Sources of Information, Composite, Language Advanced Skills,
Mathematics Advanced Skills, Survey Battery Total, Reading Comprehension, Spelling, Capitalization,
Punctuation, Usage and Expression, Mathematics Concepts and Estimation, Mathematics Problem Solving and
Data Interpretation, Mathematics Total, Maps and Diagrams, Reference Materials, Sources of Information
Total, Composite
Reviewer: Brookhart, Susan M.; Cross, Lawrence H.
Review Indicator: 2 reviews available
Publisher: The Riverside Publishing Company 8420 Bryn Mawr Ave Chicago IL 60631
Acronym: ITBS
Mental Measurements Yearbook: 13 Mental Measurements Yearbook
Accession Number: 13012057

Purpose
"To provide a comprehensive assessment of student progress in the basic skills."
Population
Grades K.1-1.5, K.8-1.9, 1.7-2.6, 2.5-3.5, 3, 4, 5, 6, 7, 8-9; Levels: 5, 6, 7, 8, 9, 10, 11, 12, 13, 14.
Scores
Vocabulary, Listening, Language, Language Total, Mathematics, Core Total, Word Analysis (optional),
Mathematics Advanced Skills, Mathematics Total, Reading Advanced Skills, Reading Total, Reading,
Listening Language, Mathematics Concepts, Mathematics Problems, Mathematics Computation [optional],
Social Studies, Science, Sources of Information, Composite, Language Advanced Skills, Mathematics
Advanced Skills, Survey Battery Total, Reading Comprehension, Spelling, Capitalization, Punctuation, Usage
and Expression, Mathematics Concepts and Estimation, Mathematics Problem Solving and Data Interpretation,
Mathematics Total, Maps and Diagrams, Reference Materials, Sources of Information Total, Composite.
Time
(130-310) minutes for Complete Battery; (100) minutes for Survey Battery
Comments

Part of Riverside's Integrated Assessment System; Braille and large-print editions available
Appendix N

POPS Test Review


Test Name: Profiles of Problem Solving


Test Author: Stacey, Kaye; Groves, Susie; Bourke, Sid; Doig, Brian
Publication Date: 1993
Scores: 5: Correctness of Answer, Method Used, Accuracy, Extracting Information, Quality of
Explanation
Reviewer: McLellan, Mary J.; Medina-Diaz, Maria
Review Indicator: 2 reviews available
Publisher: Australian Council for Educational Research 19 Prospect Hill Road Private Bag 55
Camberwell Victoria 3124 Australia
Acronym: POPS
Mental Measurements Yearbook: 13 Mental Measurements Yearbook
Accession Number: 13081532

Purpose
"An assessment of mathematical problem solving designed for children in upper primary school".
Population
Grades 4-6.
Price
1993 price data: $75 per manual and photocopiable masters.
Administration
Group
Scores
5: Correctness of Answer, Method Used, Accuracy, Extracting Information, Quality of Explanation.
Manual
Manual, 1993, 64 pages
Time

[32]40 minutes
Comments
Appendix O

Hands-On Problem Solving Test

Appendix P

Curriculum for Computer Troubleshooting Training Sessions


Week 1:
Day 1: Pre-Testing
Day 2: Pre-Testing/Getting to Know You and the Computer
Day 3: Lesson 1: Outer Hardware, Intro to Hardware on the Inside
Day 4: Lesson 2: Hardware on the Inside
Day 5: Lesson 3: Storage, Files and Folders, The Windows Desktop
Week 2:
Day 1: Lesson 4: Knowing Your System, Programs, Operating Systems, Computer Care
and Safety
Day 2: Lesson 5: Troubleshooting Real Problems
Day 3: Post-Testing
Day 4: Post-Testing
Appendix Q

Procedure to Obtain Consent


Procedure to Obtain Consent Signature

I. Researcher informs students and parents of project through the initial letter.

II. Parents/students submit permission slip to participate in the testing, participate

in the program, or to participate in neither.

III. Researcher agrees to answer any questions parents or students may have.

IV. If parent wishes to sign the Parental Consent Form:

A. Researcher provides a copy to sign and a copy for the parent to keep.

B. Researcher discusses early morning training and how it is pertinent that

the student be on time everyday at 8:00am.

V. If parent does not wish to sign the Parental Consent Form, there is no further

action taken.
Appendix R

Master's Thesis Timeline


Master's Thesis Timeline

Task                                                      Date

1. Literature Review                                      January 2003 - March 2003

2. Summaries of Articles                                  March 15th, 2003

3. Research Grant Funding Proposal                        March 17th, 2003

4. Proposal                                               March 31st, 2003

5. Graduation Audit                                       March 31st, 2003

6. HSIRB                                                  April 1st, 2003

7. Meeting with Thesis Committee                          April 7th, 2003

8. Obtain Approval for Master's Research Class            April 7th, 2003

9. Final Development of Curriculum                        April 17th, 2003

10. Distribute permission slip for testing                May 12th, 2003
    (Subject Recruitment Letter should be attached)

11. Consent form/permission slip returned for             May 13th - May 16th, 2003
    participation in computer training

12. Pre-assessment of Control and Experimental Groups     May 19th, 2003

13. Begin 1st week of training with experimental group:   May 19th, 20th, 21st, 22nd,
    Day 1: Testing/Getting to Know You and the Computer   and 23rd, 2003
    Day 2: Lesson 1: Outer Hardware, Intro to Hardware on the Inside
    Day 3: Lesson 2: Hardware on the Inside
    Day 4: Lesson 3: Bits and Bytes, Storage, Files and Folders, The Windows Desktop
    Day 5: Lesson 4: Knowing Your System, Programs, Operating Systems

14. Begin 2nd week of training with experimental group:   May 26th, 27th, 28th, 29th,
    Day 1: Lesson 5: Computer Care and Safety             and 30th, 2003
    Day 2: Lesson 6: Intro to Troubleshooting Real Problems
    Day 3: Lesson 7: Troubleshooting Real Problems
    Day 4: Final Review
    Day 5: Final Testing

15. Post-assessment of Control and Experimental Groups    May 30th, 2003

16. Chapter 1 - Introduction, Chapter 2 - Literature      June 3rd, 2003, 10:30 am
    Review, Chapter 3 - Methodology
    (Committee Meeting with Dr. Poole, Dr. Bosco and Dr. Leneway)

17. Chapter 4 - Results, Chapter 5 - Discussion and       June 17th, 2003, 10:30 am
    Implications
    (Committee Meeting with Dr. Poole and Dr. Leneway)

18. Meeting with Dr. Poole on Final Wrap-Up #1            June 19th, 2003

19. Meeting with Dr. Poole and Dr. Leneway for            July 3rd, 2003
    Final Wrap-Up #2

20. Thesis Defense                                        July 9th, 2003,
                                                          11:00 am - 1:00 pm, Room 3208

21. Thesis Due Date                                       July 18th, 2003


Appendix S

Paired Samples T-Test Results


All Students T-Test Pre and Post


Paired Samples Statistics

Std. Error
Mean N Std. Deviation Mean
Pair POPS: Total: Pre 25.4167 12 7.26709 2.09783
1 POPS: Total: Post 28.4167 12 9.89452 2.85630

Paired Samples Correlations

N Correlation Sig.
Pair POPS: Total: Pre &
1 POPS: Total: Post 12 .814 .001

Paired Samples Test

Paired Differences
95% Confidence Interval
Std. Error of the Difference
Mean Std. Deviation Mean Lower Upper t
Pair POPS: Total: Pre -
1 POPS: Total: Post -3.0000 5.79969 1.67423 -6.6849 .6849 -1.792

Paired Samples Test

df Sig. (2-tailed)
Pair POPS: Total: Pre -
1 POPS: Total: Post 11 .101

Control T-Test Pre and Post


Paired Samples Statistics

Std. Error
Mean N Std. Deviation Mean
Pair POPS: Total: Pre 25.5000 6 8.24015 3.36403
1 POPS: Total: Post 25.8333 6 9.74508 3.97841

Paired Samples Correlations

N Correlation Sig.
Pair POPS: Total: Pre &
1 POPS: Total: Post 6 .793 .060

Paired Samples Test

Paired Differences
95% Confidence Interval
Std. Error of the Difference
Mean Std. Deviation Mean Lower Upper t
Pair POPS: Total: Pre -
1 POPS: Total: Post -.3333 5.95539 2.43128 -6.5831 5.9165 -.137

Paired Samples Test

df Sig. (2-tailed)
Pair POPS: Total: Pre -
1 POPS: Total: Post 5 .896

Experimental T-Test Pre and Post


Paired Samples Statistics

Std. Error
Mean N Std. Deviation Mean
Pair POPS: Total: Pre 25.3333 6 6.94742 2.83627
1 POPS: Total: Post 31.0000 6 10.21763 4.17133

Paired Samples Correlations

N Correlation Sig.
Pair POPS: Total: Pre &
1 POPS: Total: Post 6 .924 .008

Paired Samples Test

Paired Differences
95% Confidence Interval
Std. Error of the Difference
Mean Std. Deviation Mean Lower Upper t
Pair POPS: Total: Pre -
1 POPS: Total: Post -5.6667 4.63321 1.89150 -10.5289 -.8044 -2.996

Paired Samples Test

df Sig. (2-tailed)
Pair POPS: Total: Pre -
1 POPS: Total: Post 5 .030

Means
Case Processing Summary

Cases
Included Excluded Total
N Percent N Percent N Percent
POPS: Total: Pre *
GROUP * GENDER 12 100.0% 0 .0% 12 100.0%
POPS: Total: Post *
GROUP * GENDER 12 100.0% 0 .0% 12 100.0%

Report

POPS: POPS:
GROUP GENDER Total: Pre Total: Post
C F Mean 23.3333 27.0000
N 3 3
Std. Deviation 8.14453 8.54400
M Mean 27.6667 24.6667
N 3 3
Std. Deviation 9.45163 12.66228
Total Mean 25.5000 25.8333
N 6 6
Std. Deviation 8.24015 9.74508
E F Mean 29.6667 38.0000
N 3 3
Std. Deviation 7.57188 7.54983
M Mean 21.0000 24.0000
N 3 3
Std. Deviation 2.64575 7.54983
Total Mean 25.3333 31.0000
N 6 6
Std. Deviation 6.94742 10.21763
Total F Mean 26.5000 32.5000
N 6 6
Std. Deviation 7.84219 9.39681
M Mean 24.3333 24.3333
N 6 6
Std. Deviation 7.20185 9.33095
Total Mean 25.4167 28.4167
N 12 12
Std. Deviation 7.26709 9.89452

Both Pre and Post Test: Experimental vs. Control: T-Test


Paired Samples Statistics

Std. Error
Mean N Std. Deviation Mean
Pair PRE_CONT 25.5000 6 8.24015 3.36403
1 PRE_EXP 25.3333 6 6.94742 2.83627
Pair POST_CON 25.8333 6 9.74508 3.97841
2 POST_EXP 31.0000 6 10.21763 4.17133

Paired Samples Correlations

N Correlation Sig.
Pair 1 PRE_CONT & PRE_EXP 6 -.583 .224
Pair 2 POST_CON & POST_EXP 6 -.538 .271

Paired Samples Test

Paired Differences
95% Confidence Interval
Std. Error of the Difference
Mean Std. Deviation Mean Lower Upper
Pair 1 PRE_CONT - PRE_EXP .1667 13.52652 5.52218 -14.0285 14.3619
Pair 2 POST_CON - POST_EXP -5.1667 17.50905 7.14804 -23.5413 13.2079

Paired Samples Test

t df Sig. (2-tailed)
Pair 1 PRE_CONT - PRE_EXP .030 5 .977
Pair 2 POST_CON - POST_EXP -.723 5 .502
Appendix T

Michigan Curriculum Frameworks


Michigan Curriculum Frameworks for Technology

OVERVIEW OF TECHNOLOGY
CONTENT STANDARDS
All students will:
Using and Transferring: Use and transfer technological knowledge and skills for life roles (family member, citizen, worker, consumer, lifelong learner);

Using Information Technologies: Use technologies to input, retrieve, organize, manipulate, evaluate, and communicate information;

Applying Appropriate Technologies: Apply appropriate technologies to critical thinking, creative expression, and decision-making skills;

Employing Systematic Approach: Employ a systematic approach to technological solutions by using resources and processes to create, maintain, and improve products, systems, and environments;

Applying Standards: Apply ethical and legal standards in planning, using, and evaluating technology; and

Evaluating and Forecasting: Evaluate the societal and environmental impacts of technology and forecast alternative uses and possible consequences to make informed civic, social, and economic decisions.

More detailed information concerning the technology standards can be found at:
http://www.michigan.gov/documents/Technology_11594_7.htm
Standard I.2 Variability and Change

Students describe the relationships among variables, predict what will happen to one variable as another
variable is changed, analyze natural variation and sources of variability, and compare patterns of change.

Variability and change are as fundamental to mathematics as they are to the physical world, and an
understanding of the concept of a variable is essential to mathematical thinking. Students must be able to
describe the relationships among variables, to predict what will happen to one variable as another variable
is changed, and to compare different patterns of change. The study of variability and change provides a
basis for making sense of the world and of mathematical ideas.

Strand III. Data Analysis and Statistics


We live in a sea of information. In order not to drown in the
data that inundate our lives every day, we must be able to
process and transform data into useful knowledge. The ability
to interpret data and to make predictions and decisions based
on data is an essential basic skill for every individual.

Standard III.1 Collection, Organization and Presentation of Data


Students collect and explore data, organize data into a useful form, and develop skill in representing and
reading data displayed in different formats.

Knowing what data to collect and where and how to collect them is the starting point of quantitative
literacy. The mathematics curriculum should capitalize on students' natural curiosity about themselves and
their surroundings to motivate them to collect and explore interesting statistics and measurements derived
from both real and simulated situations.
Once the data are gathered, they must be organized into a useful form, including tables, graphs, charts and
pictorial representations. Since different representations highlight different patterns within the data,
students should develop skill in representing and reading data displayed in different formats, and they
should discern when one particular representation is more desirable than another.

Standard III.3 Inference and Prediction


Students draw defensible inferences about unknown outcomes, make predictions, and identify the degree of
confidence they have in their predictions.

Based on known data, students should be able to draw defensible inferences about unknown outcomes.
They should be able to make predictions and to identify the degree of confidence that they place in their
predictions.

Standard V.1 Operations and Their Properties


Students understand and use various types of operations (e.g., addition, subtraction, multiplication,
division) to solve problems.
The ultimate reason for mastering the operations of arithmetic and algebra is to solve problems. To that
end, understanding the basic computational operations and their algorithms is essential for competence in
mathematics, but the emphasis must be on understanding and using the operations, not on memorizing
algorithms. In computation, understanding and accuracy are always more important than speed.
Understanding the operations requires the concomitant understanding and application of the properties of
those operations; it involves the ability to represent computations with manipulatives and
geometric models, and the discernment of which computational method to use in a given situation.
Computational methods also involve estimating and assessing the reasonableness of the results of a
computation.

Standard V.2 Algebraic and Analytic Thinking


Students analyze problems to determine an appropriate process for solution, and use algebraic notations to
model or represent problems.

Mathematical representations allow us to visualize and understand problems. These representations may
be numerical, literal, symbolic, graphical, pictorial or physical. Facility with multiple representations of
numerical and algebraic concepts and relationships is essential to mathematical competence. This includes
the development of "symbol sense" as well as "number sense" and the understanding that the notion of
solution involves a process as well as a product. Thus, the solution of a mathematical problem requires
both an understanding of the question for which an answer is sought and the development of a strategy to
obtain that answer. The context of the problem determines the nature and the degree of precision of the
required solution. The increasing use of quantitative methods in all disciplines has made algebra the
fundamental tool for mathematical applications. Algebraic thinking is learned most effectively when it is
studied in the context of applications, both mathematical and real-world, that reveal the power of algebra
to model real problems and to generalize to new situations. Students should use algebraic techniques to
analyze and describe relationships, to model problem situations, and to examine the structure of
mathematical relationships. The algebra curriculum should employ contemporary technology, including
spreadsheets and graphical analysis, to emphasize conceptual understanding of algebra and analytic
thinking as sophisticated means of representation and as powerful problem-solving tools.

Standard VI.2 Discrete Mathematics


Students investigate practical situations such as scheduling, routing, sequencing, networking, organizing
and classifying, and analyze ideas like recurrence relations, induction, iteration, and algorithm design.
Discrete (discontinuous) mathematics has grown in significance in recent years and today has applications
in many important practical situations such as scheduling, routing, sequencing, networking, organizing
and classifying. Important ideas like recurrence relations, induction and algorithm design also have
practical applications in a variety of fields. Computers, which are finite, discrete machines, require an
understanding of discrete mathematics for the solution of problems using computer methods.
Appendix U

Typical 5th Grade Problem Solving by NCTM


Typical 5th Grade Problem Solving Ability

Reflecting on different ways of thinking about and representing a problem solution allows comparisons of strategies and consideration of different representations. For example, students might be asked to find several ways to determine the number of dots on the boundary of the square in figure 2.1a and then to represent their solutions as equations (Burns and McLaughlin 1990).

[Figure 2.1a: The "dot square" problem, a square array of dots]

Students will likely see different patterns. Several possibilities are shown in figure 2.1b. The teacher should ask each student to relate the drawings to the numbers in their equations. When several different strategies have been presented, the teacher can ask students to examine the various ways of solving the problem and to notice how they are alike and how they are different. This problem offers a natural way to introduce the concept and term equivalent expressions.

In addition to developing and using a variety of strategies, students also need to learn how to ask questions that extend problems. In this way, they can be encouraged to follow up on their genuine curiosity about mathematical ideas. For example, the teacher might ask students to create a problem similar to the "dot square" problem or to extend it in some way: If there were a total of 76 dots, how many would be on each side of the square? Could a square be formed with a total of 75 dots? Students could also work with extensions involving dots on the perimeter of other regular polygons. By extending problems and asking different questions, students become problem posers as well as problem solvers.

[Fig. 2.1b: Several possible solutions to the "dot square" problem, represented by the equations 4 x 8 + 4 = 36, 4 x 10 - 4 = 36, and 10 + 8 + 10 + 8 = 36]

Appendix V

Email Documentation from KidsDomain.com


Thank you for your request. By all means, please feel free to model your computer curriculum after the lessons on Kids Domain. Please include acknowledgement of our website if you reprint and hand out any of our content.

Continue to enjoy all that the Kaboose Network has to offer!

http://www.kaboose.com
The Kaboose Network - Get On Board!
Funschool.com - Kidsdomain.com - Zeeks.com
Appendix W

Spreadsheets of Answers for Worksheets and Worksheet Samples


Station #6: Laptop Trauma

I lost a file! I saved a file, but I can't remember what I called it.
I know the title at the top of the page was:

Lost Dog

Can you find it for me? Where was it?

While you're finding my file, I need my background changed. Is there any way you can change my background on the desktop? How did you do it?

Station #1

Team #1: "Plug in printer and put in paper."

Team #2: "First it wasn't plugged in and the USB wasn't plugged in. There was no paper. The printer was not plugged in to the [monitor]."



Team #3: "What's wrong with your printer is the plug wasn't in and there was no paper. So you need to put some in."

Station #2

Team #1: "Plug in any plugs, put in CD and pushed yes. You put the CD in and clicked on yes to install it."

Team #2: "One problem we had was the mouse was not plugged in. We first went to installer then we pushed continue and it installed them. We restarted the computer."

Team #3: Did not complete due to time constraints.

Station #3

Team #1: "Monitor won't turn on because it had no power and it was not plugged in to the power tower."

Team #2: "The monitor is not working. It is not working because the [monitor] is not plugged into the power tower."

Team #3: "[The problem is the] monitor won't turn on because the power cable isn't plugged into the monitor."

Station #4

Team #1: "The sound cable is not plugged in. The ram is missing. The trackball for the mouse isn't in. Plug in [the] mouse, keyboard and monitor. The power cable is not plugged in."

Team #2: "The ram is missing (1). The wire is not plugged in. The p5 (the internal power cord) wire is not plugged in. The [monitor] is not plugged into the power tower. The mouse is not plugged in. The trackball is missing. The power button is not there. The power cord."

Team #3: "Sound cable. More ram. Trackball. Plug in mouse and keyboard and monitor and power cable plug in."

Station #5

Team #1: "You could not do anything or hear anything."

Team #3: "You could [not do] anything or [hear] anything."

Station #6

Team #1: "It is under Microsoft Word."

Team #2: "Yes we found it. We went to search and pushed files and folders then we typed in lost dog."

Team #3: "It was in Microsoft Word."

Team #1: "You right click anywhere then you click properties. Then you go to desktop and change the background."

Team #2: "Yes. We right clicked then we went to properties. Then we clicked on desktop and changed it."

Team #3: "Yes. I went under desktop and found it."
