Calculus Machina
Douglas QUINNEY
University of Keele
Keele, Staffordshire, ST5 5BG, UK
e-mail: [email protected]
ABSTRACT
Students arriving at University are far from homogeneous and there is a growing need to assess their
active mathematical ability on entry to any course and provide suitable support materials when necessary.
This paper explores how emerging technologies can provide an environment for diagnostic testing and
follow up support material for such students. In particular, it discusses a new Computer Algebra System,
called Calculus Machina. Although many Computer Algebra Systems are excellent at "doing" mathematics,
they leave something to be desired when it comes to teaching and supporting learning in first-year
undergraduate mathematics, as many of the intermediate steps involved in basic calculus are not revealed.
Calculus Machina is capable not only of solving many of the problems that arise in the standard Calc I and
II sequence, but also of disclosing the steps and processes by which these results are obtained. Calculus
Machina can also function in a tutorial mode in which students are required to take an active role in
learning, and in which the program can "look over the shoulder" of a student as the steps in a calculation
are performed, checking each step and offering help when required. Finally, there is always a certain
element of inertia when considering the adoption of any new teaching material, so we conclude this paper
with an evaluation of Calculus Machina in a teaching environment.
Keywords: Innovative Teaching, Technology, Computer Algebra Systems (CAS), Teaching Calculus,
Diagnostic Testing of mathematics skills.
1. Introduction
When students enter Higher Education courses in Science and Engineering, instructors
frequently have to make assumptions relating to their ability in a range of topic areas and
mathematical skills. (See Kitchen (1996), Hirst (1997), and Lawson (1997).) Such courses also
tend to recruit large numbers of students with a rich diversity of intake qualifications and prior
experiences. In addition, over the last decade the nature and background of the students who arrive
at our universities each September have changed markedly. The structure of the modular A-level
curriculum, the main entry vehicle for students in the U.K., and in particular Curriculum 2000, has
meant that students have a considerable range of mathematical experience and limited exposure to
mathematical ideas that were once taken for granted. (See Porkess (2001).) Furthermore, there is
substantial evidence to suggest that schools are being selective in which A-level modules they opt
for in order to maximise the overall performance of the student cohort. As a consequence of all
these factors, students arriving at University are far from homogeneous, and the need to assess each
student's current active mathematical ability on entry to a course is crucial.
In a previous paper, one possible approach that uses technology for diagnostic testing and
follow-up support was described. (See Quinney (2001).) This paper explores how emerging
technologies can provide support material for students at a time when they most need it and in a
form that may encourage them to become independent learners.
2. Diagnostic Testing
The need to provide suitable diagnostic testing of mathematical skills is taken for granted in a
wide variety of institutions, for two distinct but inter-related reasons.
(i) To provide students with useful individual feedback before problems escalate.
(ii) To provide teaching and tutorial staff with a global assessment of the current active
ability of each student on a chosen range of topics.
The aim of the test is not simply to return a numerical mark; it is, above all, to identify skills
that might be lacking. The test is designed to give partial credit by grading the skills whose
misapplication might lead a student to select each of the incorrect answers, and rewarding partial
understanding accordingly. A student can decide to abstain from a question, in which case they are
not penalised for selecting a wrong answer; such a decision nevertheless indicates a deficiency in a
particular skill, and this is reflected in the final diagnostic report. Each student's responses are
analysed to determine their capabilities in 10 distinct skills, and the results are presented on a
diagnostic screen as shown in figure 2.2.
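To make this concrete, the following sketch shows one way a partial-credit scheme with an abstain
option could be implemented. The question, skill names, and credit weights are hypothetical
illustrations only; the test's internal mark scheme is not published here.

# A minimal sketch of partial-credit diagnostic scoring with an abstain option.
# The question, skill names, and weights below are hypothetical.

from collections import defaultdict

# Each answer option records how much evidence it gives for each tested skill.
QUESTIONS = {
    "q1": {  # e.g. differentiate sin(x**2)
        "a": {"chain rule": 1.0, "standard derivatives": 1.0},  # correct
        "b": {"chain rule": 0.0, "standard derivatives": 1.0},  # typical slip
        "c": {"chain rule": 0.0, "standard derivatives": 0.0},
    },
}

def diagnose(responses):
    """Return (total mark, per-skill profile); None means the student abstained."""
    total = 0.0
    evidence = defaultdict(list)                 # skill -> observed credits
    for qid, choice in responses.items():
        options = QUESTIONS[qid]
        skills = next(iter(options.values())).keys()
        if choice is None:                       # abstain: no penalty, but the
            credits = {s: 0.0 for s in skills}   # skill gap is still recorded
        else:
            credits = options[choice]
        total += sum(credits.values()) / len(credits)   # partial credit
        for skill, c in credits.items():
            evidence[skill].append(c)
    profile = {s: sum(v) / len(v) for s, v in evidence.items()}
    return total, profile

print(diagnose({"q1": "b"}))  # (0.5, {'chain rule': 0.0, 'standard derivatives': 1.0})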
During the academic year 2000-2001, in an attempt to discover whether the diagnostic test
described above provides a realistic indicator of individual students’ capabilities, students were
asked to take both the diagnostic test and a written paper, and the results were compared. All 87
students who entered Principal Mathematics completed both the diagnostic test and a written test
involving a large number of differentiation problems at various levels of difficulty. A statistical
comparison of the written and diagnostic tests showed that the scores are highly correlated (r=0.75,
p<0.001) and that a simple linear regression model accounts for 55% of the variation in the marks.
We conclude that the diagnostic test is a good predictor of an individual student's skills in
differentiation. (See Quinney (2001).) This is significant both because the automation provided by
the CBL diagnostic test can markedly reduce marking workload and, more importantly, because the CBL
gave immediate feedback to each student.
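As a check on the arithmetic: for simple linear regression the variation explained is r², and
0.75² = 0.5625, consistent with the reported 55%. The snippet below illustrates how such a
comparison is computed; the paired scores are made up for illustration, since the study's raw marks
are not reproduced here.

# Illustrative comparison of diagnostic-test and written-test scores.
# The data below are fabricated for the example only.

import numpy as np

diagnostic = np.array([62, 45, 78, 55, 83, 40, 70, 66, 52, 74], float)
written    = np.array([58, 50, 80, 49, 88, 35, 65, 72, 47, 70], float)

r = np.corrcoef(diagnostic, written)[0, 1]             # Pearson correlation
slope, intercept = np.polyfit(diagnostic, written, 1)  # simple linear regression
predicted = slope * diagnostic + intercept
ss_res = np.sum((written - predicted) ** 2)
ss_tot = np.sum((written - written.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot                        # variation explained

print(f"r = {r:.2f}, r^2 = {r_squared:.2f}")   # for simple regression, r^2 = r*r
# In the study, r = 0.75 gives r^2 = 0.5625, i.e. about 55% of the variation.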
The diagnostic test described above operated in the Mathematics Department at Keele University
during 1996-2001; figure 2.3 illustrates the skills profiles for the student cohort in five
successive years. The wide discrepancy from year to year indicates that simply providing common
remedial courses will not be suitable. It seems appropriate, therefore, to look at the microscopic
scale: to focus on individual students and attempt to assign each one suitable support material.
Providing individualised programmes of study, using computer-based self-study materials selected on
the basis of the diagnostic test results, may provide a solution to this problem.
The first of these reasons is by far the more attractive, and the availability of a large bank of
reliable test problems can be extremely beneficial when coupled with immediate marking and
feedback.
Figure 4.2: Calculus Machina output revealing the steps in finding a derivative
Figure 4.3: Context-sensitive help file – note that the example reflects the current problem being
solved.
Since Calculus Machina is able to differentiate almost all functions met in first- and second-year
mathematics and documents all the steps involved, it might be thought that this will encourage
students to take a very passive role and allow the computer to do the work. However, Calculus
Machina has a second, more educational, mode in which the student has to take a much more active
part in the process. This mode, called Udo, is illustrated in figure 4.4. Once again Calculus
Machina has been asked to differentiate sin(x²), but now the student has to supply the requisite
substitution, which is then checked before they are permitted to proceed. In this mode Calculus
Machina can play the part of an individual tutor, checking each step and allowing students as much
practice as they need.
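Calculus Machina's checking engine is not documented in this paper, but the behaviour described
above can be sketched with an open-source CAS. The fragment below is a hypothetical illustration
using sympy: it accepts or rejects a student's proposed substitution and inner derivative for
d/dx sin(x²).

# Hypothetical sketch of tutor-style step checking for d/dx sin(x**2),
# using sympy in place of Calculus Machina's proprietary engine.

import sympy as sp

x, u = sp.symbols("x u")
target = sp.sin(x**2)                      # the problem posed to the student

def check_substitution(student_u):
    """Accept the step only if sin(u) with u = student_u rebuilds the target."""
    return sp.simplify(sp.sin(u).subs(u, student_u) - target) == 0

def check_derivative(student_du_dx, student_u):
    """Check the student's derivative of the inner function u(x)."""
    return sp.simplify(student_du_dx - sp.diff(student_u, x)) == 0

# A correct dialogue: u = x**2, du/dx = 2x, so d/dx sin(x**2) = 2x cos(x**2).
assert check_substitution(x**2)            # step 1 accepted
assert check_derivative(2*x, x**2)         # step 2 accepted
assert not check_substitution(x)           # a wrong substitution is rejected

# Final answer checked against the CAS result.
assert sp.simplify(2*x*sp.cos(x**2) - sp.diff(target, x)) == 0
print("all steps verified")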
Finally, the software can generate further problems closely related to the current one, giving
additional practice; one plausible mechanism is sketched below.
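The generation mechanism is not described in the paper; a common approach, assumed here, is to
randomise the parameters of a problem template, as in this hypothetical sketch.

# Hypothetical problem-template randomisation for "further practice" variants.
# Calculus Machina's actual generation scheme is not documented here.

import random
import sympy as sp

x = sp.Symbol("x")

def variant_of_sin_chain():
    """Produce a fresh chain-rule problem of the same shape as sin(x**2)."""
    a = random.choice([2, 3, 4, 5])        # randomised coefficient
    n = random.choice([2, 3])              # randomised inner power
    f = sp.sin(a * x**n)
    return f, sp.diff(f, x)                # problem and its worked answer

problem, answer = variant_of_sin_chain()
print(f"Differentiate {problem}; answer: {answer}")
# e.g. "Differentiate sin(3*x**2); answer: 6*x*cos(3*x**2)"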
Figure 4.4: Calculus Machina in tutorial (Udo) mode
Table 5.1: Results of comparative trials using Calculus Machina and Mathwise: Rules of
Differentiation.
It must be noted that a direct comparison between Calculus Machina and Mathwise: Rules of
Differentiation is a little unfair, as they are several generations of software apart and Calculus
Machina is designed specifically for calculus whereas Mathwise covers a wider remit. The Mathematics
Department at Keele University has invested substantially in its use of Mathwise, and there is
considerable inertia against changing to a new system; nevertheless, the evidence of this study
lends some support to a change to Calculus Machina. A similar experiment conducted during the
academic year 2001-2002 produced very similar results. The major advantage of Calculus Machina is
its ability to accept problems entered by the student and to disclose and document how the
derivative or integral is found.
(2) Although the department has made use of several modules from Mathwise over the last 5
years and invested quite heavily in such materials, there is sufficient evidence to show that
the capabilities of more recent software, such as Calculus Machina, are more beneficial.
Accordingly, we aim to build it into the week that the Department has set aside for
developing students' skills in introductory calculus from the academic year 2002-2003.
REFERENCES
- Anton H., Bivens I., & Davis S. (2002). Calculus, 7th Edition. John Wiley and Sons Inc. ISBN 0-471-38157-8.
- Beevers C.E., Bishop P., & Quinney D.A. (1998). Mathwise, diagnostic testing and assessment. Information
Services & Use, Volume 1, 1-15.
- Brydges S. & Hibberd S. (1994). Construction and Implementation of a Computer-Based Diagnostic Test.
CTI Maths and Stats, 5/3, pp. 9-13.
- Hibberd S., Looms A., & Quinney D.A. (2001). Computer based diagnostic testing and support in
mathematics. To appear in Innovative Teaching Ideas in Mathematics, Ed. Mohammad H. Ahmadi.
- Hirst K. (1997). Changes in A-Level mathematics from 1996. University of Southampton.
- Hughes Hallett D., Gleason A., et al. (2002). Calculus, 3rd Edition. John Wiley and Sons Inc. ISBN
0-471-40827-1.
- Kitchen A. (1996). A-Level Mathematics isn't what it used to be; or is it? Mathematics Today, 32/5,
pp. 87-90.
- La Rose G. (2001). http://www.math.lsa.umich.edu/~glarose/projects/gateway/egrade.html
- Lawson D. (1997). What can we expect of A-level mathematics students? Teaching Mathematics and its
Applications, 16/4, pp. 151-156.
- Lawson D., Halpin M., & Croft A. (2001). After the diagnostic test – what next? Evaluating and
Enhancing the Effectiveness of Mathematics Support Centres. MSOR Connections, Volume 1, Number 3,
pp. 19-24.
- Porkess R. (2001). Mathematics in Curriculum 2000: What will students know? What will students not
know? MSOR Connections, Volume 1, Number 3, pp. 35-39.
- Quinney D.A. (2001). Computer based diagnostic testing and student centred support. Maths CAA Series:
Nov 2001, LTSN Mathematics and Statistics Support Network.
http://ltsn.mathstore.ac.uk/articles/maths-caa-series/nov2001/index.htm
- Wiley eGrade. http://jws-edcv.wiley.com/college/egrade/