2015 Physics Chief Assessor's Report
Overview
Chief Assessors’ reports give an overview of how students performed in their school
and external assessments in relation to the learning requirements, assessment
design criteria, and performance standards set out in the relevant subject outline.
They provide information and advice regarding the assessment types, the application
of the performance standards in school and external assessments, the quality of
student performance, and any relevant statistical information.
School Assessment
The best tasks provided a variety of opportunities for students to achieve against the
performance standards by assessing them on more than one occasion. The better
tasks also allowed students to demonstrate their strengths by being flexible so that
different students could approach the task in different ways. Students were limited in
their responses when, for example, all students in a class completed the same
design practical investigation or answered the same issues investigation question.
The best design practicals and issues investigations were not over-structured, but were designed in a way that allowed students to show initiative.
The vast majority of teachers followed the subject outline by having students focus
on an issue in their issues investigation, rather than an application. These teachers
provided students with the opportunity to address, in particular, specific feature AE1.
When a question is generated and answered by the student, the student can more
fully address the specific feature.
It is helpful when moderators can easily see that assessment conditions are clearly
indicated and followed, with thought given about how to balance the need for
teachers to verify student work with the need to give students adequate time to
produce work at the required standard. Moderators also need to be clear what advice and support were provided to students before they commenced the task.
A variety of strategies were used successfully to enable students to comply with the word limit in the issues investigation. Efficient ways of evaluating the bias, credibility, accuracy, and suitability of the sources enabled students to devote the majority of the allowed word count to the analysis of the issue.
Most teachers provided students with tasks that had an appropriate amount of
scaffolding. This gives students sufficient guidance to meet the performance
standards and also allows them to demonstrate their achievement at higher grade
bands. A task that is too scaffolded removes the opportunity for a capable student to
demonstrate that the student ‘designs logical, coherent, and detailed physics
investigations’, ‘critically and systematically analyses data … to formulate logical and
perceptive conclusions’, and ‘critically and logically evaluates procedures and
suggests a range of appropriate improvements’ (as in the performance standards).
The nature of some tasks and the scaffolding provided limited some classes to a
maximum B grade or even a C grade.
Sets of skills and applications tasks (SATs) were almost exclusively topic tests. In most
cases, teachers made good use of a variety of previous examination questions or
questions of a similar style. They generally made it easier for moderators to confirm
their grades by clearly identifying the specific features assessed in each task. This
was done either by annotating each question with the appropriate specific feature or
by providing a summary of student achievement against the features on the front page of the task.
Most tasks included a mixture of questions with a variety of demands. They allowed
all students to access some questions and demonstrate their knowledge,
understanding, and skills. They also allowed some students to demonstrate their
ability to work in the A grade band. Some teachers set tasks that did not assess
experimental skills, limiting achievement against the investigation criterion and the analysis and evaluation criterion. Some tasks also limited the ability of students
to give extended answers in which they could describe and explain physics concepts,
phenomena, and applications necessary to achieve in the high grade bands for
knowledge and understanding.
Although there is no requirement in the subject outline for the SATs to assess all
topics, almost all teachers did so, allowing students to show breadth as well as depth
of knowledge and understanding.
Again this year, some teachers set more than five tasks in this assessment type,
which results in the total number of assessments across the subject being greater
than the maximum of ten permitted in the subject outline. Chief Assessor’s reports
have advised against setting more than five SATs in previous years. It should be
noted that dividing a task that has been described as a single task in an approved
learning and assessment plan (LAP) so that the task is completed at two separate
sittings changes this task from one to two separate tasks. As such, this practice is
likely to lead to the total number of tasks in the LAP exceeding the maximum number
permitted by the subject outline and hence the LAP would not be approved. In effect,
then, these teachers provided student work for moderation that was contrary to their
approved LAP. In much the same way as word-count is dealt with in the issues
investigation, moderators had to consider only the number of SATs present in the
approved LAP. Teachers who follow this practice risk their students having limited
opportunity to show the breadth and depth of their knowledge and understanding, as
well as their ability to use terminology and solve problems.
Most teachers provided their students with topic tests to assess them against the
performance standards. This is preferable to using mid-year or trial examinations as
SATs, where the demands of the task at that time of the year tend to limit student
achievement against the performance standards.
External Assessment
Questions asked in different contexts often showed that students had rote-learned
specific responses and could not adapt them to the given circumstances.
Question 1
To obtain the first 2 marks of the examination, the students were required to show
knowledge that the accelerations of the two projectiles are equal in size, and directed
straight downwards. While most students were able to show the required knowledge,
disappointingly, too many students were unable to distinguish between acceleration
and velocity, with many students drawing tangential velocity vectors and their
horizontal and vertical components (rarely with appropriate labels).
The second part of the question used the knowledge that complementary angles
would result in the same range. Students who did not know this often guessed 45° or
attempted to measure the angle from the diagram.
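For example, the expected reasoning can be sketched from the standard range formula (assuming a level landing and negligible air resistance):

\[
R = \frac{u^{2}\sin 2\theta}{g}, \qquad \sin\bigl(2(90^{\circ}-\theta)\bigr) = \sin(180^{\circ}-2\theta) = \sin 2\theta,
\]

so launch angles of \(\theta\) and \(90^{\circ}-\theta\) give the same range.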
Question 2
Question 2 was the most successfully answered question in the paper. Neither the calculation nor the drawing of the vectors provided much challenge. Failure to label at
least one vector resulted in students losing a mark that they should easily have been
able to gain.
Question 3
Both the explanation and calculation were completed well by many students. Often
the best explanations were done by students who used a diagram within their
answer, which showed that the horizontal component of the normal force was
directed towards the centre of the circle — the information most commonly missed
from the explanations.
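A sketch of that key step, assuming a banked surface with negligible friction: resolving the normal force \(N\) shows that its horizontal component provides the centripetal force,

\[
N\sin\theta = \frac{mv^{2}}{r}, \qquad N\cos\theta = mg.
\]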
Question 4
This question contained two routine calculations, which students were usually able to
complete well. A common misconception in part (b) was that the speed of the satellite
depends upon its mass rather than the mass of the Earth. Incorrect answers to
part (c) usually had correct physics (such as launching west to east, or that the
centre of orbit corresponds to the centre of the Earth), but that information did not
answer the question.
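The expected reasoning for the satellite's speed can be sketched by equating the gravitational and centripetal forces, which shows the satellite's mass cancelling:

\[
\frac{GM_{E}m}{r^{2}} = \frac{mv^{2}}{r} \quad\Rightarrow\quad v = \sqrt{\frac{GM_{E}}{r}},
\]

so the orbital speed depends on the mass of the Earth \(M_{E}\) and the orbital radius \(r\), not on the mass of the satellite.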
Question 5
A failure to correctly draw the vector diagram led to this question being the worst
answered in the examination. Students who showed additional working before drawing the vector diagram (such as expressing the change as the vector subtraction \(\Delta v = v_{\text{final}} - v_{\text{initial}}\)) were able to easily answer the question correctly.
Question 7
The rearranging of Coulomb’s law proved problematic for many students (particularly
dealing with the constant), often resulting in unrealistic answers. Answers with three or more significant figures were penalised in this question; however, many students used the 'clue' in the question (the distance was given to two significant figures) to correctly decide how many significant figures to include.
For the second part of this question, many students determined that the force was
one-quarter as large, but failed to properly communicate how they used the
proportionality to obtain this result.
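A sketch of the proportionality argument, assuming (as the quartered force suggests) that the separation was doubled:

\[
F = \frac{kq_{1}q_{2}}{r^{2}} \quad\Rightarrow\quad F' = \frac{kq_{1}q_{2}}{(2r)^{2}} = \frac{F}{4}.
\]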
Question 8
Most students had knowledge of the direction and shape of the electric field, but
students often lost marks due to a lack of care or through rushing. Markers penalised
answers that did not clearly show an understanding that the vectors touch the
surface of the conductor, and that the vectors should be perpendicular to the surface
of the conductor.
Question 9
Question 10
Students’ inability in part (a) to describe the uniform circular motion of a charge in a
magnetic field is frustrating, because the three required points are well known and
because this concept has been frequently assessed in recent examinations.
A common misconception was that the particle’s motion changes ‘due to the right
hand rule’. The remaining parts of the question were handled well by the majority of
students.
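The required points can be linked in a short derivation: the magnetic force is always perpendicular to the velocity, so it changes the direction of motion but not the speed, and a constant-magnitude force perpendicular to a constant speed produces uniform circular motion. For motion perpendicular to the field,

\[
qvB = \frac{mv^{2}}{r} \quad\Rightarrow\quad r = \frac{mv}{qB}.
\]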
Question 11
The first two parts of the question were successfully completed by many students.
The vague answer to part (a) of ‘up’ could not be rewarded, and markers commented
that ‘too many students said into the page’. The correct abbreviation of ‘seconds’ is
‘s’, not ‘secs’.
Part (c) showed many students' inability to communicate their deductions; many determined that the frequency would increase but were unable to give a satisfactory explanation.
Question 13
Given that the loudspeaker is one of the applications prescribed in the course, it was very surprising that 12% of students did not provide an answer about its operation, and that 20% provided an answer that received no marks. It was common for students to
discuss the compressions and rarefactions of a sound wave without referring to the
magnetic interactions that cause the movement of the cone. The student responses
to this question were the most disappointing of the examination for the examiners.
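A minimal sketch of the magnetic interaction that answers should contain: the voice coil carries the signal current in the field of the permanent magnet, so each length \(\ell\) of wire experiences a force

\[
F = I\ell B,
\]

and an alternating current therefore produces an alternating force that vibrates the cone.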
Question 14
Explanation questions involving 3 marks are usually very challenging for students,
and Question 14 was no different. The question required knowledge of the production
of light through incandescence, and the concepts of coherence and
monochromaticity. The majority of students were unable to explain and/or link these
concepts (with coherence being surprisingly poorly understood).
Question 15
Question 15 was a positive way for most students to finish Booklet 1. In part (c),
there were many examples of poor communication, particularly the vague use of
terminology such as ‘the waves are in phase’.
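A more precise statement links the phase relationship to path difference; for example, for constructive interference at a point,

\[
\Delta l = m\lambda \quad (m = 0, 1, 2, \ldots),
\]

so the waves arrive crest on crest and reinforce.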
Question 16
The calculation in part (a) was not a problem for most students, although some used m = 5 in their calculations (possibly from seeing five maxima in the image). The failure
by students to correctly round off their answer (taking 1.2266 from the calculator and
writing 1.22) demonstrates a lack of numeracy skills that is unexpected in Stage 2
Physics.
Knowing that blue light has a smaller wavelength than red light allowed many
students to successfully describe how the different wavelength would affect the angle
of the first-order maxima. A misconception about how the wavelength of light relates to its frequency was evident in a number of student answers. Most students could describe how a change in wavelength would affect the angle, even if they did not know how the wavelength was changed.
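The successful descriptions were, in effect, applications of the two-slit equation for the first-order maximum:

\[
d\sin\theta = m\lambda \quad\Rightarrow\quad \sin\theta_{1} = \frac{\lambda}{d},
\]

so the smaller wavelength of blue light gives a smaller first-order angle.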
Question 17
Question 17 was a routine calculation, which the majority of students did correctly.
Question 18
The most successful way to answer part (a) was by using the terms ‘work function’
and ‘threshold frequency’. Where students attempted to describe these (rather than
just state them), their communication often lacked clarity.
Many answers to part (b) suggested that students had not analysed graphs of data
from photoelectric effect experiments. It was common for students to rearrange the given equation for h, making no reference to the graph (or its gradient).
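The expected graphical analysis, assuming the usual plot of maximum kinetic energy against frequency, can be sketched as

\[
E_{k,\max} = hf - W,
\]

a straight line whose gradient is Planck's constant \(h\) and whose intercept on the frequency axis is the threshold frequency.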
Question 19
The 3-mark explanation for part (a) was fully completed by less than 7% of students.
Many struggled to use the law of conservation of energy to discuss the energy
change into X-ray photons when the electrons strike the target, and the link between
frequency and energy was often omitted. Many students made the connection
between the energy of the emitted photon and the electron’s proximity to nuclei in the
target material, but few mentioned that the remaining energy is converted to heat.
The rearrangement and calculation in part (b) was done well by the majority of
students, although many answers were given as 58012.5 V, and this inappropriate
number of significant figures was penalised.
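A sketch of the rearrangement in part (b), equating the kinetic energy gained by an electron across the accelerating potential difference with the maximum photon energy:

\[
eV = hf_{\max} \quad\Rightarrow\quad V = \frac{hf_{\max}}{e},
\]

with the result then rounded appropriately (for example, \(5.80\times10^{4}\ \text{V}\) rather than 58012.5 V).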
Question 20
Part (a) was considered sufficiently routine that the result could be used in parts (b)
and (c). Most students did the calculation correctly, but often the conversion of
478 nm into metres was omitted or the conversion to eV was done incorrectly,
leading to unrealistic answers. Some students used the wavelength as the frequency in the equation E = hf, again yielding an unrealistic answer. The knowledge that only
transitions upwards from the ground state of hydrogen will occur at room temperature
was rarely shown or rarely communicated well in part (c). Questions that instruct
students to draw and label require students to draw and label to gain full credit.
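The conversion chain that caused difficulty can be sketched using the 478 nm value quoted above:

\[
E = \frac{hc}{\lambda} = \frac{(6.63\times10^{-34})(3.00\times10^{8})}{478\times10^{-9}} \approx 4.16\times10^{-19}\ \text{J} \approx 2.60\ \text{eV}.
\]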
Question 21
It is clear from this question that many students do not know what fluorescence is, or
do not understand the process to a satisfactory standard. About 14% of students did
not attempt this question, and a further 25% attempted it, but received no marks.
Question 22
Question 24
As expected, a number of students confused the role of control rods with the role of
moderators. The best answers referred to control rods controlling the rate of
reactions through the absorption of neutrons. Half-life calculations are usually done
well by the majority of students, and that pattern continued in 2015. There was
improvement over previous years in the way that students communicated their
problem-solving method in determining the time for the activity to drop to a given level.
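A well-communicated method makes the logic explicit; a minimal sketch (the values in the question are not reproduced here):

\[
A = A_{0}\left(\tfrac{1}{2}\right)^{t/T_{1/2}} \quad\Rightarrow\quad t = T_{1/2}\log_{2}\frac{A_{0}}{A},
\]

so, for example, an activity falling to one-eighth of its initial value takes three half-lives.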
Question 25
A surprising number of students spelled ‘fusion’ as ‘fussion’, and this mistake was
penalised. The problem-solving required to determine the mass of the products from
the energy released was challenging for most students. In part (c), the misconception
that fusion requires energy to overcome the nuclear force was common. Rarely were
the key properties of the nuclear force stated clearly, so applying the ideas in an
unfamiliar context was very challenging for students.
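The problem-solving for the mass of the products rests on mass–energy equivalence; a sketch:

\[
E = \Delta m\,c^{2} \quad\Rightarrow\quad \Delta m = \frac{E}{c^{2}},
\]

so the total mass of the products is the mass of the reactants less \(\Delta m\).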
Question 26
There were many students who completed the data table in part (a) with answers to
an incorrect number of significant figures. The graphs drawn in part (b) were usually
done well, with the most common loss of marks being for unsuitable scales on the
vertical axis and for incorrect plotting (which was often a result of an unsuitable scale
on the vertical axis). The students who calculated a gradient did it well, but the
gradient’s units were often incorrect or omitted. The use of the gradient is a skill that
students find challenging, with only the most capable students being able to
successfully determine an answer and communicate their reasoning. Some students
were able to show their understanding of ‘accuracy’ in part (e) despite having failed
to complete part (d), and were appropriately rewarded. Many students demonstrated
some knowledge of how the magnetic force depends upon the angle between the
current element and the magnetic field, but the link to the results of the experiment
was not communicated well.
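The link to the experiment that was communicated poorly is the angle dependence of the magnetic force; a sketch (the plotted variables are assumed here for illustration):

\[
F = I\ell B\sin\theta,
\]

so if the force is plotted against \(\sin\theta\), a straight line through the origin is expected, with gradient \(I\ell B\).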
Question 27
The marking of both of the dot points for Question 27 focused on seeking three
content points for each. In the first dot point there were two common approaches.
The approach of students describing the acceleration being caused by the force was
generally done better than the approach of discussing the energy gained by the work
done as the ions move across the potential difference; however both approaches
yielded some very good answers. The best answers were concise, with a logical
progression of ideas. A number of students derived an expression for the final speed of the ions.
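A sketch of that derivation, equating the work done on an ion of charge \(q\) and mass \(m\) crossing a potential difference \(V\) with its gain in kinetic energy:

\[
qV = \tfrac{1}{2}mv^{2} \quad\Rightarrow\quad v = \sqrt{\frac{2qV}{m}}.
\]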
The best answers to the second dot point correctly showed how the law of
conservation of momentum applied to the context, rather than just stating it. An
approach of equating the total initial momentum with the total final momentum
yielded the best results. Despite many students failing to link the increased
momentum of the spacecraft with a gain of speed, the marks for the second dot point
were better than for the first.
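A sketch of the momentum argument in the spacecraft's initial rest frame (an illustrative choice of frame): for a spacecraft of mass \(M\) expelling an ion of mass \(m_{\text{ion}}\) backwards at speed \(v_{\text{ion}}\),

\[
0 = M\,\Delta v - m_{\text{ion}}v_{\text{ion}} \quad\Rightarrow\quad \Delta v = \frac{m_{\text{ion}}v_{\text{ion}}}{M},
\]

so each expelled ion increases the speed of the spacecraft.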
The average mark for Question 27 was higher than the average mark for most
extended-response questions in the past few years. Also pleasing was that the proportion of students presenting an answer (92%) was higher than in recent years.
Question 28
The need for students to identify a variable and then design an experiment was
different from many extended-response questions in the past, but this did not prevent
students from answering it. In the same way as for Question 27, the number of
students presenting an answer for Question 28 was higher than the number of
answers for the final questions of recent examinations.
When suggesting variables, a significant number of students stated that the equation d sin θ = mλ would apply, whereas better answers suggested that it may apply. Other good answers focused on the fact that this equation is known to apply to two slits and to diffraction gratings, and so is likely to apply to three slits.
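A hypothesis with an anticipated proportionality then follows directly; for example, for a fixed order \(m\) and slit separation \(d\),

\[
d\sin\theta = m\lambda \quad\Rightarrow\quad \sin\theta \propto \lambda,
\]

which could be tested graphically by plotting \(\sin\theta\) against \(\lambda\).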
The marking of this question focused on the students’ ability to devise and
communicate an effective procedure. There were many additional ways that students
could obtain marks, including correctly identifying and justifying independent,
dependent, and controlled variables; stating a hypothesis which contained an
anticipated proportionality; and describing how the data could be analysed
graphically. Discussions of safe use of lasers, particularly the requirement that the
room be well lit (as described in the Australian Government’s Safety Guide for the
Use of Radiation in Schools), were rewarded. Good practical skills, such as repeating
measurements and averaging or measuring across a number of bright bands (and
dividing appropriately), earned students marks.
Disappointing was the number of students who did not understand a ‘variable that
could affect the pattern’, erroneously discussing things like the ambient light in the
room. Similarly disappointing was the number of students who did not design an experiment for their chosen variable, but instead described an experiment they had (probably) done during the year, using different diffraction gratings to determine the wavelength of a laser.
Operational Advice
Teachers should ensure that there is a copy of the approved LAP, including an
addendum where appropriate. There should also be a Variations — Moderation
Materials form for all instances where student work in the moderation sample differs
from that specified in the LAP (including the addendum).
A teacher pack containing a copy of the approved LAP and copies of each of the
tasks, showing what was provided for students, should be included. Teachers should
also ensure that the process that they used to determine the student grades is clear
to the moderators, allowing them to review the work and confirm the grade.
Originals rather than photocopies of student work should be provided, with teacher
annotations in a different colour ink to student work. Hard copies of student work
should be provided where possible, rather than electronic copies of text documents.
Physics
Chief Assessor