Learning Programming by Solving Problems
Amruth N. Kumar
Ramapo College of New Jersey, Mahwah, NJ 07430-1680
1. ABSTRACT
We have been developing tutors to help students learn programming concepts by solving
problems. In this paper, we will discuss the use of problem-solving in
Computer Science, the effectiveness of using problem-solving tutors to learn
programming concepts, and the pedagogical relationship between solving
problems and learning to write programs. We will also present the design of
one of our tutors and the results of its evaluation.
Our tutor on C++ pointers presents C++ programs and asks the user to
indicate whether the program contains any dangling pointers, lost objects,
semantic errors (printing values of uninitialized variables), syntax errors, etc.
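To make the problem format concrete, the following is a hypothetical sketch of the kind of short C++ program such a problem might present (our illustration, not an actual problem generated by the tutor); it contains both a lost object and a dangling pointer, which the student would be asked to identify.

    // Hypothetical illustration of a lost object and a dangling pointer.
    #include <iostream>

    int main() {
        int *p = new int(5);    // first heap object
        int *q = new int(7);    // second heap object

        p = q;                  // lost object: the int holding 5 is now unreachable

        delete q;               // the second object is deallocated here ...
        std::cout << *p;        // ... so p is now dangling; this dereference is an error

        return 0;
    }

On such a problem, the student classifies the defects (dangling pointer, lost object, or neither) rather than running the program.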
We have evaluated this tutor in several sections of our Computer Science II
course. In this section, we will present the results of these evaluations,
addressing both cognitive and affective aspects of learning with the tutor.
Tutor Versus Printed Workbook: In Spring 2001, we again tested the tutor
in two sections (N=33 combined), using the pretest-practice-posttest
protocol. We conducted a controlled test: between the pre-test and the post-test, the control
group practiced with printed workbooks, whereas the test group practiced
with the tutor. The author was not the instructor in the sections. The pre-test
and post-test scores were out of 40. The results are presented in the Table
below.
Practicing with the tutor appeared to be slightly better than practicing with
the printed workbook, although the difference is not statistically significant.
The results seem to indicate that simulative demand feedback may not be
any better than minimal feedback. Informal comments from the students
seemed to suggest that simulative feedback is too verbose. All the same, it is
encouraging to see that using the tutor did help the students improve their
performance.
Question 1 indicates that the tutor was easy to learn if we use the control
group’s score as the baseline, since, presumably, students do not need to “learn”
how to use a printed workbook designed like a typical textbook. The
problems in the printed workbook were themselves generated by the tutor,
and the results for Question 2 validate this. Questions 3 and 4 seem to
indicate a slight Hawthorne effect in that students using the online tutor felt
the problems were more interesting and less repetitive and boring, although
the types of problems were the same for both the tutor and the printed
workbook. Question 5 clearly indicates the superiority of the tutor, which
provided detailed problem-specific feedback whereas the printed workbook
just listed the correct answer for each problem. Questions 6 and 7 indicate
that the tutor facilitated better affective learning than the printed workbook,
which is encouraging. Questions 7 through 10 clearly indicate the students’
preference for the tutor over the traditional printed workbook.
6. FUTURE WORK
It is clear from the improvement from pre-test to post-test scores that
students learn how to solve problems using our tutors. We would like to test
whether this improvement in problem-solving ability translates to better
ability to write programs.
Pennington [32] found that a cross-referenced mental representation,
containing a balanced mix of the program model and the domain model, is associated with
better program comprehension. She also found that modification tasks
promoted the development of a cross-referenced mental representation. Our
tutors currently do not ask the users to modify the programs, only to debug
or predict their output. We may include program modification as another
activity in our tutors in the future.
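For illustration only (the task wording and code below are ours, not part of the current tutors), such a modification activity might present a short program that loses an object and ask the student to repair it:

    // Hypothetical modification task: "Modify this function so that the
    // object allocated first is no longer lost."
    #include <iostream>

    void presented_version() {
        int *p = new int(10);
        p = new int(20);        // the first object becomes unreachable here
        std::cout << *p << '\n';
        delete p;
    }

    // One acceptable modification: release the first object before reassigning p.
    void modified_version() {
        int *p = new int(10);
        delete p;               // free the first object so it is not lost
        p = new int(20);
        std::cout << *p << '\n';
        delete p;
    }

    int main() {
        presented_version();
        modified_version();
        return 0;
    }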
Self-generated elaborations are better than text-supplied elaborations for
learning [34]. In other words, if the user is provided with an environment in
which the user can construct his/her own explanation for a program, the user
will benefit more than if the tutor generates all the explanations. It would be
an interesting exercise to incorporate this meta-cognitive reasoning into our
tutors.
7. ACKNOWLEDGMENT
Partial support for this work was provided by the National Science
Foundation’s Course, Curriculum and Laboratory Improvement Program
under grant DUE-0088864.
This work was supported in part by a grant from the Ramapo College
Foundation.
8. REFERENCES
[1] Anderson, J.R., Corbett, A.T., Koedinger, K.R., Pelletier, R. “Cognitive Tutors: Lessons
Learned”. The Journal of the Learning Sciences. Vol 4(2), 167-207, 1995.
[2] Arnow, D. and Barshay, O., WebToTeach: An Interactive Focused Programming Exercise
System, In Proceedings of FIE 1999, San Juan, Puerto Rico (November 1999), Session
12a9.
[3] Baldwin, D., Three years experience with Gateway Labs, Proceedings of ITiCSE 96,
Barcelona, Spain, June 1996, 6-7.
[5] Bloom, B.S. and Krathwohl, D.R. Taxonomy of Educational Objectives: The
Classification of Educational Goals, by a committee of college and university examiners.
Handbook I: Cognitive Domain, New York, Longmans, Green, 1956.
[6] Bloom, B.S.: The 2 Sigma Problem: The Search for Methods of Group Instruction as
Effective as One-to-One Tutoring. Educational Researcher, Vol 13 (1984) 3-16.
[7] Bridgeman, S., Goodrich, M.T., Kobourov, S.G., and Tamassia, R., PILOT: An
Interactive Tool for Learning and Grading, Proceedings of the 31st SIGCSE Technical
Symposium, Austin, TX, March 2000, 139-143.
[8] Bridgeman, S., Goodrich, M.T., Kobourov, S.G., and Tamassia, R., SAIL: A System for
Generating, Archiving, and Retrieving Specialized Assignments Using LaTeX,
Proceedings of the 31st SIGCSE Technical Symposium, Austin, TX, (March 2000), 300-304.
[9] Brown, D.J., Writing Web-Based Questions with Mallard, Proceedings of FIE 1997,
Pittsburgh, PA (November 1997).
[10] Campbell, J.O., Evaluating Costs and Benefits of Distributed Learning, Proceedings of
FIE 1997, Pittsburgh, PA (November 1997).
[12] Corritore, C.L. and Wiedenbeck, S. What do Novices Learn During Program
Comprehension? International Journal of Human-Computer Interaction, 1991, 3(2), 199-
222.
[14] Haajanen, J., Pesonius, M., Sutinen, E., Tarhio, J., Terasvirta, T., and Vanninen, R.,
Animation of User Algorithms on the Web, IEEE Symposium on Visual Languages, 1997,
360-367.
[16] Johnson-Laird, P.N. Mental Models: Towards Cognitive Science of Language, Inference
and Consciousness. Cambridge University Press, Cambridge, 1983.
[17] Kashy, E., Sherrill, B.M., Tsai, Y., Thaler, D., Weinshank, D., Engelmann, M., and
Morrissey, D.J., CAPA, An Integrated Computer Assisted Personalized Assignment
System, American Journal of Physics, Vol 61(12), 1993, 1124-1130.
[18] Kashy E., Thoennessen, M., Tsai, Y., Davis, N.E., and Wolfe, S.L. Using Networked
Tools to Enhance Student Success Rates in Large Classes. In Proceedings of FIE ‘97
(Pittsburgh, PA, November 1997), IEEE Press.
[19] Kohne, G.S., An Autograding (Student) Problem Management System for the
Compeuwtir Illittur8, Proceedings of ASEE Annual Conference, June 1996 (CD ROM).
[20] Krishna A. and Kumar A. A Problem Generator to Learn Expression Evaluation in CS I
and Its Effectiveness. The Journal of Computing in Small Colleges, Vol 16(4), 2001, 34-
43.
[21] Kumar A. Learning the Interaction between Pointers and Scope in C++, Proceedings of
The Sixth Annual Conference on Innovation and Technology in Computer Science
Education (ITiCSE 2001), Canterbury, UK, (June 2001), 45-48.
[22] Kumar A.: Dynamically Generating Problems on Static Scope, Proceedings of The Fifth
Annual Conference on Innovation and Technology in Computer Science Education
(ITiCSE 2000), Helsinki, Finland, (July 2000), 9-12.
[23] Kumar, A., Schottenfeld, O. and Obringer, S.R. Problem Based Learning of 'Static
Referencing Environment' in Pascal, Proceedings of the Sixteenth Annual Eastern Small
College Computing Conference (ESCCC 2000), University of Scranton, PA, 10/27-
28/2000, pp 97-102.
[24] Singhal N., and Kumar A., “Facilitating Problem-Solving on Nested Selection
Statements in C/C++”, Proceedings of FIE ’00, Kansas City, MO, October 2000, IEEE
Press.
[25] Shah, H. and Kumar, A., “A Tutoring System for Parameter Passing in Programming
Languages”, Proceedings of The Seventh Annual Conference on Innovation and
Technology in Computer Science Education (ITiCSE 2002), Aarhus, Denmark, (June
2002).
[26] Littman, D.C., Pinto, J., Letovsky, S., and Soloway, E. Mental Models and Software
Maintenance. In E. Soloway and S. Iyengar (Eds.), Empirical Studies of Programmers,
1986, Ablex Publishers, Norwood, NJ, 80-98.
[27] Liu, M.L., and Blanc, L., On the retention of female Computer Science students,
Proceedings of the 27th SIGCSE Technical Symposium, Philadelphia, PA, March 1996,
32-36.
[28] Mann, P., Suiter, P., and McClung, R., A Guide for Educating Mainstream Students,
Allyn and Bacon, 1992.
[29] McConnell, J., Active Learning and its use in Computer Science, Proceedings of ITiCSE
96, Barcelona, Spain, June 1996, 52-54.
[30] Naps, T.L., and Stenglein, J., Tools for Visual Exploration of Scope and Parameter
Passing in a Programming Languages Course, The Proceedings of 27th SIGCSE
Technical Symposium on Computer Science Education, February 1996, 305-309.
[31] Naps, T.L., Eagan, J.R., and Norton, L.L. JHAVE – An Environment to Actively Engage
Students in Web-Based Algorithm Visualizations. Proceedings of the 31st SIGCSE
Technical Symposium, Austin, TX, March 2000, 109-113.
[34] Reder, L., Charney, D., and Morgan, K. The Role of Elaborations in Learning a Skill
from Instructional Text. Memory and Cognition. 14: 64-78, 1986.
[36] Rodger, S., and Gramond, E., JFLAP: An Aid to Study Theorems in Automata Theory,
Proceedings of ITiCSE 98, Dublin, Ireland, August 1998, 302.
[37] Schmalhofer, F. and Glavanov, D. Three Components of Understanding a Programmer’s
Manual: Verbatim, Propositional and Situational Representations. Journal of Memory and
Language, 1986, 25, 295-313.
[40] Tilbury, D., and Messner, W., Development and Integration of Web-based Software
Tutorials for an Undergraduate Curriculum: Control Tutorials for MATLAB, Proceedings
of FIE 97, Pittsburgh, PA, November 1997.
[41] Van Dijk, T.A. and Kintsch, W. Strategies of Discourse Comprehension. Academic
Press, New York, 1983.
[42] Wiedenbeck, S., Ramalingam, V., Sarasamma, S., and Corritore, C.L. A Comparison of
the Comprehension of Object-Oriented and Procedural Programs by Novice Programmers.
Interacting with Computers. 11(3): 255-282, 1999.