
CALCULUS MACHINA:

AN INTELLIGENT TUTOR PROVIDING COMPUTER BASED SUPPORT FOR


TEACHING UNDERGRADUATE CALCULUS.

Douglas QUINNEY
University of Keele
Keele, Staffordshire, ST5 5BG, UK
e-mail: [email protected]

ABSTRACT
Students arriving at University are far from homogeneous and there is a growing need to assess their
active mathematical ability on entry to any course and provide suitable support materials when necessary.
This paper explores how emerging technologies can provide an environment for diagnostic testing and
follow up support material for such students. In particular, it discusses a new Computer Algebra System,
called Calculus Machina. Although many Computer Algebra Systems are excellent at "Doing" mathematics,
they leave something to be desired when it comes to teaching and supporting learning in first year
undergraduate mathematics, as many of the intermediate steps involved with basic calculus are not revealed.
Calculus Machina is capable not only of solving many of the problems that arise in the standard Calc I and II
sequence, but also of disclosing the steps and processes by which these results are obtained. Calculus
Machina can also function in tutorial mode, where students are required to take an active role in learning
and where the program can “look over the shoulder” of a student as the steps in a calculation are performed,
checking each step and offering help when required. Finally, there is always a certain element of inertia
when considering the adoption of any new teaching material, so we conclude this paper with an evaluation of
Calculus Machina in a teaching environment.

Keywords: Innovative Teaching, Technology, Computer Algebra Systems (CAS), Teaching Calculus,
Diagnostic Testing of mathematics skills.
1. Introduction
When students enter Higher Education courses in Science and Engineering, instructors
frequently have to make assumptions relating to their ability in a range of topic areas and
mathematical skills. (See Kitchen (1996), Hirst (1997), and Lawson (1997).) Such courses also
tend to recruit large numbers of students with a rich diversity of intake qualifications and prior
experiences. In addition, over the last decade the nature and background of the students who arrive
at our universities each September has changed markedly. The structure of a modular A-level
curriculum, the main entry vehicle for students in the U.K., and in particular Curriculum 2000, has
meant that students have a considerable range of mathematical experience and limited exposure to
mathematical ideas that were once taken for granted. (See Porkess (2001).) Furthermore, there is
substantial evidence to suggest that schools are being selective in which A-level modules they opt
for in order to maximise the overall performance of the student cohort. As a consequence of all
these factors, students arriving at University are far from homogeneous, and the need to assess
each student's current active mathematical ability on entry to any course is crucial.
In a previous paper, one possible approach that uses technology for diagnostic testing and
follow-up support was described. (See Quinney (2001).) This paper explores how emerging
technologies can provide support material for students at a time when they most need it and in a
form that may encourage them to become independent learners.

2. Diagnostic Testing
The need to provide suitable diagnostic testing of mathematical skills is taken for granted in a
wide variety of different institutions for two distinct but inter-related reasons.

(i) To provide students with useful individual feedback before problems escalate.
(ii) To provide teaching and tutorial staff with a global assessment of the current active
ability of each student on a chosen range of topics.

The Heads of Departments of Mathematical Sciences in the UK (HoDoMS) funded a WWW site in 1996,
http://www.keele.ac.uk/depts/ma/diagnostic/, giving information, contacts and case studies of
existing diagnostic tests. This site contains links to the diagnostic tests used at a number of
universities and a selection of case studies which give details of how diagnostic testing is
carried out and, just as importantly, how students are supported thereafter.
Diagnostic testing is now being introduced in many universities: some use paper-based tests
that are frequently optically marked to minimise staff overheads, while others have opted for
computer-based testing, often in the form of Multiple Choice Questions (MCQs). MCQs are
attractive largely because they are easy to mark automatically, making them well suited to a
computer-based form of assessment. (See Brydges & Hibberd (1994) and Beevers, Bishop & Quinney
(1998).) At Keele University we have used an MCQ diagnostic test for a number of years in order
to identify any students who may need particular attention. The test consists of 20 MCQs selected
randomly from a bank of about 50 questions, each of which is randomised. A typical question is
shown in figure 2.1.
Figure 2.1: Sample Question

The aim of the test is not simply to return a numerical mark; its primary aim is to identify skills
that might be lacking. The test is designed to give partial credit by grading the skills that might
lead a student to select one of the incorrect answers and rewarding them accordingly. A student
can decide to abstain from a question, in which case they are not penalised for selecting a wrong
answer; however, such a decision indicates a deficiency in a particular skill and this is reflected
in the final diagnostic report. Each student's responses are analysed to determine the student's
capabilities in 10 distinct skills, and the results are presented on a diagnostic screen as shown
in figure 2.2.

Figure 2.2: Student Diagnostic Report
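In outline, the marking scheme can be expressed as a short program. The following is a minimal
sketch in Python; the question bank, skill labels and partial-credit weights shown here are
hypothetical illustrations of the scheme described above, not the actual test implementation.

    import random

    # Hypothetical question record: each answer option carries a
    # partial-credit weight reflecting the skills a student choosing it
    # has demonstrated.
    QUESTION_BANK = [
        {
            "id": "diff-chain-01",
            "skills": ["chain rule", "powers"],
            "credit": {"A": 1.0,   # correct answer: full credit
                       "B": 0.4,   # right method, minor slip: partial credit
                       "C": 0.0, "D": 0.0},
        },
        # ... in the real test, a bank of about 50 such questions
    ]

    def mark_test(responses, bank, n_questions=20):
        """Select questions at random and build a per-skill profile.

        `responses` maps question id -> chosen option, or None for an
        abstention: abstaining is not penalised as a wrong answer, but the
        missing credit still shows up in the skill profile.
        """
        chosen = random.sample(bank, min(n_questions, len(bank)))
        earned, possible = {}, {}
        for q in chosen:
            credit = q["credit"].get(responses.get(q["id"]), 0.0)
            for skill in q["skills"]:
                earned[skill] = earned.get(skill, 0.0) + credit
                possible[skill] = possible.get(skill, 0.0) + 1.0
        # The diagnostic report is a per-skill fraction, not a single mark.
        return {s: earned[s] / possible[s] for s in possible}

The essential design point is that each answer option contributes evidence about particular
skills, so the output is a profile over skills rather than a single mark.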

During the academic year 2000-2001, in an attempt to discover whether the diagnostic test
described above provides a realistic indicator of individual students' capabilities, students were
asked to take both the diagnostic test and a written paper, and the results were compared. All 87
students entering Principal Mathematics took the diagnostic test and completed a written test
involving a large number of differentiation problems at various levels of difficulty. A statistical
comparison of the written and diagnostic tests showed that the scores are highly correlated
(r=0.75, p<0.001) and that a simple linear regression model accounts for 55% of the variation in
the marks. We conclude that the diagnostic test is a good predictor of an individual student's
skills in differentiation. (See Quinney (2001).) This matters for two reasons: the automation
provided by the CBL diagnostic test significantly reduces staff workload, and, more importantly,
the CBL gives immediate feedback to each student.
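As a check on the arithmetic, r = 0.75 gives r² ≈ 0.56, consistent with the reported 55% of
variation explained by the regression. The comparison itself is straightforward to reproduce;
the sketch below, in Python with SciPy, uses placeholder score pairs rather than the study data.

    from scipy import stats

    # Placeholder paired scores; the actual study compared 87 students.
    diagnostic = [62, 48, 71, 55, 80, 44, 67, 59, 73, 51]
    written = [58, 45, 75, 50, 82, 40, 70, 61, 69, 49]

    res = stats.linregress(diagnostic, written)
    print(f"r = {res.rvalue:.2f}, p = {res.pvalue:.3g}")
    # r squared is the fraction of variation explained; c. 55% when r = 0.75
    print(f"variation explained: {res.rvalue ** 2:.0%}")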
The diagnostic test described above has been operating in the Mathematics Department at Keele
University during 1996-2001; figure 2.3 illustrates the skill profiles of the student cohort in
five successive years. The wide discrepancy from year to year indicates that simply providing
common remedial courses will not be suitable. It seems appropriate, therefore, to look at the
microscopic scale, focus on individual students, and attempt to assign each student suitable
support material. Providing individualised programmes of study, using computer-based self-study
programmes based on the results of the diagnostic test, may provide a solution to this problem.

Figure 2.3: Cohort Profiles 1997-2000


The results of the diagnostic test between 1996 and 2000 were sufficiently encouraging that it
was decided to integrate the process of diagnosis and support into the first year programme. The
response from students has been exceptionally positive: students have requested similar material
extending the diagnostic process to cover integration in more detail.

3. Online Web Support


At the end of the diagnostic test students were asked to reflect on the result and say whether
they considered it fair. Many did, excusing their poor performance on the grounds that it was
several months, over the summer vacation, since they had actually done any mathematics. In order
to remedy this in future years, students who have been accepted onto the course at Keele will be
given access to WWW-based mathematical quizzes that will enable them to hone their skills before
they arrive at university.
There are a large number of WWW-based tutorial systems currently available, but we shall be
encouraging students to use eGrade. (Published by John Wiley (2002).) This system provides a
large number of prepared tests and, in addition, gives instructors facilities to enter their own
questions and manage the delivery of both quantitative and technical problems. The questions can
be either multiple choice or free text, and the software provides facilities for students to
preview answers in “pretty print”, i.e. mathematical layout. The eGrade system has been
class-tested for several years at the University of Michigan, where in excess of 8000 students
have used the system. (See LaRose (2001).) Students can access banks of problem sets and view
example problems, which are integrated with some of the better-known texts. The software provides
immediate scoring of student work and individualised feedback.
The advantages of such WWW-based systems are manifold.
(a) Students can practise their skills and enhance their confidence prior to any formal
testing.
(b) The questions are available anywhere and anytime and are therefore more
attractive to a generation of students who delight in the availability of the WWW.
(c) The performance of individual students can be tracked and analysed, though in
some cases this can be a deterrent if students believe their every mistake is being
recorded.

The first of these reasons is by far the most attractive, and the availability of a large bank of
reliable test problems can be extremely beneficial when coupled with immediate marking and
feedback.

4. Computer based support material


Gains made from the implementation of diagnostic testing or the provision of on-line
preparatory quizzes are limited without suitable learning support material. Such support
material needs to be tailored to each student's individual needs and yet cover the broad range of
core mathematical knowledge at this level. This can be accomplished through human tutors, drop-in
clinics, supplementary lectures, mathematics resource centres, etc. (Lawson, Halpin & Croft
(2001).) However, experience has shown that even though the weaknesses of individual students
can be detected using diagnostic testing, the restrictions of individual and teaching timetables
make it difficult to allot specific times when students can be supervised to ensure that any
remedial work is carried out.
During 1996-1999 the mathematics department at Keele University pioneered the use of the
TLTP material, Mathwise, to provide individual study profiles which were automatically allocated
following the diagnostic test. (Hibberd, Looms & Quinney (2001).) However, many students are
becoming familiar with computer algebra systems (CAS) such as Mathematica, Maple, Derive,
etc. Although these systems are excellent at “Doing” mathematics, they leave much to be desired
for teaching and learning mathematics. To this end we have been investigating the use of a CAS
that concentrates on teaching and learning, and how such a system can be integrated to provide
the student support needed to follow up a diagnostic test.
A new software package called Calculus Machina has been developed, designed to offer a full
range of computer algebra skills in basic calculus while also being capable of revealing the
steps that are required to evaluate derivatives and integrals. Furthermore, the interface
between the student and the software has been designed to be as simple as possible and yet
remain very versatile. Students are able to type in their own expressions and see them displayed
immediately in a “pretty print” form, or select and edit the current expression using “point and
click”. Alternatively, mathematical expressions can be entered using simple templates. (See figure
4.1.)

Figure 4.1: Calculus Machina’s input tool


Once a function has been defined, the software can display the steps required to determine its
derivative, as shown in figure 4.2, where Calculus Machina has been asked to differentiate
sin(x²). Notice that it recognises that it is necessary to use the Chain Rule (flagged by the
text Derivative of Composite Function) and then reveals the steps needed to continue. These flags
also provide a hypertext link to context-sensitive help that allows the student to “drill down”
and gain additional help, as shown in figure 4.3. These pages are derived from “Calculus”,
Hughes Hallett et al (2002), or “Calculus”, Anton (2002). Future versions of the software will
enable an instructor to add links to alternative texts and additional material. The advantage of
Calculus Machina is that students can type in their own problems, or the software can generate
practice problems for the student to attempt, reinforcing their skills in this topic.

Figure 4.2: Calculus Machina output revealing the steps in finding a derivative
Figure 4.3: Context sensitive help file – note that the example reflects the current problem being
solved.
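The step-revealing behaviour described above can be imitated, in outline, with a general-purpose
CAS. The sketch below uses Python with the sympy library and handles only the single
composite-function case needed for sin(x²); it illustrates the idea and is not Calculus Machina's
own engine.

    import sympy as sp

    x, u = sp.symbols("x u")

    def differentiate_with_steps(expr):
        """Differentiate, printing the intermediate steps for a composite."""
        if isinstance(expr, sp.Function) and expr.args[0] != x:
            inner = expr.args[0]                      # e.g. x**2
            outer = expr.func                         # e.g. sin
            print("Derivative of Composite Function (Chain Rule):")
            print(f"  let u = {inner}, giving {outer(u)}")
            outer_d = sp.diff(outer(u), u)            # d/du of the outer part
            inner_d = sp.diff(inner, x)               # du/dx
            print(f"  d/du {outer(u)} = {outer_d},  du/dx = {inner_d}")
            result = outer_d.subs(u, inner) * inner_d
            print(f"  result: {result}")
            return result
        return sp.diff(expr, x)

    differentiate_with_steps(sp.sin(x**2))   # prints 2*x*cos(x**2) via the steps

The point of the illustration is the flagged rule name and the exposed intermediate quantities,
which are precisely what a conventional CAS call such as diff(sin(x^2), x) hides.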

Since Calculus Machina is able to differentiate almost all functions met in first and second year
mathematics and documents all the steps involved, it might be thought that this will encourage
students to take a very passive role and allow the computer to do the work. However, Calculus
Machina has a second, more educational, mode in which the student has to take a much more
active part in the process. This mode, called Udo, is illustrated in figure 4.4. Once again
Calculus Machina has been asked to differentiate sin(x²), but now the student has to supply the
requisite substitution, which is then checked before they are permitted to proceed. In this mode
Calculus Machina can play the part of an individual tutor, checking each step and allowing
students as much practice as they need.
Finally, the software includes the ability to generate further problems that are closely related to
the current problem to give further practice.
Figure 4.4: Calculus Machina in tutorial (Udo) mode
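A tutor of this kind must accept any algebraically equivalent form of a correct step, not just a
textual match, and it needs a supply of closely related practice problems. The sketch below, in
Python with sympy, shows one plausible shape for both ingredients; the helper names are
hypothetical illustrations, not the Machina's API.

    import random
    import sympy as sp

    x = sp.symbols("x")

    def check_step(student_input: str, expected) -> bool:
        """Accept any algebraically equivalent form of the expected answer."""
        try:
            attempt = sp.sympify(student_input)
        except sp.SympifyError:
            return False
        return sp.simplify(attempt - expected) == 0

    def related_problem(template=sp.sin(x**2)):
        """Vary the coefficients of the current problem to make a new one."""
        a, n = random.randint(2, 5), random.randint(2, 4)
        return template.func(a * x**n)                # e.g. sin(3*x**4)

    print(check_step("x**2", x**2))    # True
    print(check_step("x*x", x**2))     # True: equivalent form accepted
    print(check_step("2*x", x**2))     # False: wrong substitution
    print(related_problem())           # a fresh, closely related problem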

5. A Case Study 2000-2001


To investigate the effectiveness of the Calculus Machina, the students studying Principal
Mathematics at Keele University during the academic year 2000-2001 were divided into two
groups. Those scoring in excess of 65% on the diagnostic test were asked to look at a Mathwise
module called Applications of Mathematics. (See Beevers et al (1998).) The remaining students
were randomly sub-divided into two further groups (B1 and B2). Group B1 was asked to study the
Mathwise module Rules of Differentiation and Group B2 was asked to use Calculus Machina. The aim
of the project was to compare the performance of groups B1 and B2 to see if there was any
statistical difference between the two groups. To do this, Groups B1 and B2 were asked to retake
the diagnostic test at the end of their study and also complete a paper-based questionnaire.
5.1 Results
28 students completed the pre- and post-diagnostic tests, though somewhat fewer also completed
the questionnaire. The students in Group B1 had a mean baseline score of 49.53 whilst those in
Group B2 scored slightly less, 43.3, though this difference was not statistically significant
(p=0.23 using a t-test). Two students in Group B2 were not included in the analysis, as they
would have skewed the result even further in favour of the Calculus Machina. To investigate the
effectiveness of the packages allocated to the two groups, the mean paired absolute differences
of the two groups were analysed.
The results of this trial are given in Table 5.1 and suggest that Group B2 improved
significantly more than Group B1 (p=0.005), even though their pre-test score was slightly
poorer. Analysing the relative improvement in diagnostic score after using the software gives a
similar result. Even though there is substantial variation in the results observed and the
sample sizes are relatively small, we can conclude that, based on these results, the Calculus
Machina appears to be the more effective software when used in this context.

Group   Number   Software   Pre-test score   SD      Mean Difference   SD
B1      13       Mathwise   49.53            14.61   5.38              10.39
B2      13       Machina    43.30            10.94   22.4              17.02

Table 5.1: Results of comparative trials using the Calculus Machina and Mathwise: Rules of
Differentiation.
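In outline, the significance test reported above is a two-sample t-test on the per-student
improvements. The sketch below, in Python with SciPy, uses illustrative gain scores chosen to
match the group means in Table 5.1 (5.38 and 22.4); the actual per-student data are not
reproduced here.

    from scipy import stats

    # Illustrative post-minus-pre improvements for the 13 students per group.
    gains_b1 = [2, -5, 11, 8, 0, 14, 3, -7, 9, 12, 6, 5, 12]        # Mathwise
    gains_b2 = [18, 25, 40, 9, 31, 15, 22, 28, 12, 35, 20, 17, 19]  # Machina

    t, p = stats.ttest_ind(gains_b1, gains_b2)
    print(f"t = {t:.2f}, p = {p:.4f}")   # the trial reported p = 0.005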

It must be noted that a direct comparison between the Calculus Machina and Mathwise: Rules
of Differentiation is a little unfair, as they are several generations of software apart and the
Calculus Machina is designed specifically for the Calculus whereas Mathwise covers a wider
remit. Nevertheless, although the mathematics department at Keele University has invested
substantially in its use of Mathwise and there is substantial inertia in changing to a new
system, the evidence of this study provides some credence for changing to Calculus Machina. A
similar experiment was conducted during the academic year 2001-2002 and the results were very
similar. The major advantage of the Calculus Machina is its ability to accept problems entered
by the student and to disclose and document how the derivative or integral is found.

5.2 Questionnaire Results


18 completed questionnaires were returned: 9 from Group B1 and 9 from Group B2.
Respondents reported a wide range of reasons for studying Mathematics or Statistics and a wide
variety of topics in which they had perceived strengths and weaknesses. Most of the students
regarded the diagnostic test as accurate. Students varied widely in their attitudes to the use of
computers in teaching and learning. Some appreciated the fact that the computer allows them to
work at their own pace, provides instant feedback, and can lead them step-by-step through
methods; others found the experience somewhat stressful. A similar questionnaire in 2002 found
fewer students in the latter category; further investigation has shown that, as might be expected,
students are becoming more acclimatised to using courseware.
6. Conclusion
Courseware is now available to help detect areas of mathematical weakness at the individual
student level, provide individual testing at the convenience of the student, and provide
individualised support. In particular we have shown:
(1) That the simple diagnostic test that we have used is a good predictor of student
performance and may thus be used to support differentiated teaching. Although
discussions with course tutorial support staff are vital, the computer-based profiles provide
a pro-active mechanism for the early identification of student weaknesses. Of course, the
basis of this paradigm is dependent on the development of study skills by individual
students, and the inclusion of both summative and formative assessment can help
reinforce this. The same software can also be used to gather information on the cohort as a
whole and to track the performance of students on a year-by-year basis.

(2) Although the department has made use of several modules from Mathwise over the last 5
years and invested quite heavily in such materials, there is sufficient evidence to show that
the capabilities of more recent software, Calculus Machina, are more beneficial.
Accordingly, we aim to build it into the week that the Department has set aside for
developing students' skills in Introductory Calculus from the academic year 2002-2003.

REFERENCES
- Anton H., Bivens I. & Davis S. (2002). Calculus, 7th Edition. John Wiley and Sons Inc. ISBN 0-471-38157-8.
- Beevers C.E., Bishop P. & Quinney D.A. (1998). Mathwise, diagnostic testing and assessment. Information Services & Use, Volume 1, pp 1-15.
- Brydges S. & Hibberd S. (1994). Construction and Implementation of a Computer-Based Diagnostic Test. CTI Maths and Stats, 5/3, pp 9-13.
- Hibberd S., Looms A. & Quinney D.A. (2001). Computer based diagnostic testing and support in mathematics. To appear in Innovative Teaching Ideas in Mathematics, ed. Mohammad H. Ahmadi.
- Hughes Hallett D., Gleason A., et al. (2002). Calculus, 3rd Edition. John Wiley and Sons Inc. ISBN 0-471-40827-1.
- Hirst K. (1997). Changes in A-Level mathematics from 1996. University of Southampton.
- Kitchen A. (1996). A-Level Mathematics isn't what it used to be; or is it? Mathematics Today, 32/5, pp 87-90.
- LaRose G. (2001). http://www.math.lsa.umich.edu/~glarose/projects/gateway/egrade.html
- Lawson D. (1997). What can we expect of A-level mathematics students? Teaching Mathematics and its Applications, 16/4, pp 151-156.
- Lawson D., Halpin M. & Croft A. (2001). After the diagnostic test – what next? Evaluating and Enhancing the Effectiveness of Mathematics Support Centres. MSOR Connections, Volume 1, Number 3, pp 19-24.
- Porkess R. (2001). Mathematics in Curriculum 2000: What will students know? What will students not know? MSOR Connections, Volume 1, Number 3, pp 35-39.
- Quinney D.A. (2001). Computer based diagnostic testing and student centred support. Maths CAA Series: Nov 2001, LTSN Mathematics and Statistics Support Network. http://ltsn.mathstore.ac.uk/articles/maths-caa-series/nov2001/index.htm
- Wiley eGrade (2002). http://jws-edcv.wiley.com/college/egrade/
