
NAPLAN

2018 State report – Year 9


February 2019
© The State of Queensland (Queensland Curriculum & Assessment Authority) 2019
Queensland Curriculum & Assessment Authority
PO Box 307 Spring Hill QLD 4004 Australia
154 Melbourne Street, South Brisbane
Phone: (07) 3864 0299
Email: [email protected]
Website: www.qcaa.qld.edu.au
Contents
Preface ______________________________________________ 1
Who should use this State report? ............................................................................ 1
About the tests .......................................................................................................... 1
Marking and scoring the tests ................................................................................... 2
Using NAPLAN reports to inform teaching and learning ........................................... 2
Other NAPLAN reports ............................................................................................. 4

Literacy______________________________________________ 6
Writing ................................................................................................................ 6
Stimulus (writing prompt) Years 7 & 9 ....................................................................... 6
About the task ........................................................................................................... 6
Performance ............................................................................................................. 7
Sample script ............................................................................................................ 9
Commentary on sample script ................................................................................ 11
Language conventions ..................................................................................... 12
Spelling .............................................................................................................. 12
Results and item descriptions .............................................................................. 12
About the test ....................................................................................................... 13
Performance ........................................................................................................ 13
Implications for teaching ...................................................................................... 14
Grammar and punctuation ................................................................................. 16
Results and item descriptions .............................................................................. 16
About the test ....................................................................................................... 17
Performance ........................................................................................................ 18
Implications for teaching ...................................................................................... 18
Reading ............................................................................................................ 19
Results and item descriptions ................................................................................. 19
About the test .......................................................................................................... 21
Performance ........................................................................................................... 22
Implications for teaching ......................................................................................... 24

Numeracy ___________________________________________ 26
Results and item descriptions ................................................................................. 26
About the test .......................................................................................................... 28
Performance ........................................................................................................... 29
Implications for teaching ......................................................................................... 29
Preface
State reports are issued by the QCAA about the performance of Queensland students on the
National Assessment Program — Literacy and Numeracy (NAPLAN) paper tests. State reports
provide system-level information and are publicly available. This report for Year 9 students in
2018 contains:
• the Queensland performance on each item
• the national performance on each item
• the item descriptors
• a commentary on the state results
• some recommendations for teaching.

Who should use this State report?


The NAPLAN State reports help principals, teachers and other school personnel understand,
interpret and use information about student performance on NAPLAN.
School principals can use this report to provide information to the school community on aspects
of the tests. This would allow professional conversations with their teachers, curriculum leaders,
and department heads. Curriculum leaders can use this information to interpret the more specific
information given in their school and class reports. These other reports are explained below.
Since this report is publicly available on the QCAA website, it can also inform providers of teacher
training, special education services and educational research and policy.
Parents and carers can use this report to interpret the results on their child’s student report. They
are also able to judge how their child performed when compared with the whole population of
students. The item descriptors provide them with useful information about the scope of the tests.

About the tests


The purpose of the National Assessment Program (NAP) is to collect information that
governments, education authorities and schools can use to identify indicative literacy and
numeracy skills Australian students can demonstrate. As part of that program, the NAPLAN tests
are administered to full cohorts of students in Years 3, 5, 7 and 9 each year. These standardised
tests are sources of information about student learning that can be used to inform educational
policy and current educational practice.
The NAPLAN tests were initially developed using the nationally agreed Statements of Learning
for English and Statements of Learning for Mathematics, 2005. Since 2016, however, the tests have been
referenced to the Australian Curriculum. The NAPLAN tests are designed to assess student
understanding in the following areas:
 Language conventions: The test assesses the ability of students to independently
recognise and use correct Standard Australian English grammar, punctuation and spelling.
 Writing: The test assesses the ability of students to convey thoughts, ideas and information
through the independent construction of a written text in Standard Australian English.
 Reading: The test assesses the ability of students to independently make meaning from
written Standard Australian English texts, including those with some visual elements.

 Numeracy: The test assesses students’ knowledge of mathematics, their ability to apply
that knowledge in context independently, and their ability to reason mathematically.

Marking and scoring the tests

Marking the tests


Markers mark those test items that do not use a multiple-choice format. These markers apply
nationally agreed marking guides. There are marking guides for open-ended Reading items if any
such items are included. Marking guides allow consistent and reliable judgments by markers.
There are guides for the Writing test and one each for the constructed responses in Numeracy
and Spelling. For some Numeracy items, students may provide a correct response in different
forms. Professional officers decide on agreed scoring protocols for these items.

Calculating raw scores


The simplest calculation made in scoring the tests is the raw score — the number of questions
answered correctly. Each of the questions for the Language conventions, Reading and Numeracy
tests is marked as either correct or incorrect. Raw scores for the Writing test are sums of the
marks on each of ten criteria.
Raw scores have limited use. They enable the performance of students who have all completed
the same test at the same time to be placed in a rank order, but they do not provide information
about the level of difficulty of the test nor the relative differences between students.

Constructing scaled scores and bands


To make raw scores more useful, they are converted to scores on a common scale that reflects
how difficult it was to achieve each score. Each year ACARA publishes equivalence tables that
allow a student’s raw score to be located on the NAPLAN scale. The scale is comparable
between year levels for each assessment area. An equating process is also carried out on each
year’s test to enable scores to be compared between successive years of testing. For example, a
raw score of 20 on the Year 3 Reading test might be transformed to a scaled score of 354. This
will also represent the same achievement for a student with the same scaled score in Year 5, and
for a student with the same scaled score for Reading in a previous year.
Each NAPLAN scale is divided into ten bands used to report student progress.
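
The following short Python sketch illustrates the process described above: a raw score is looked up in an equivalence table and the resulting scaled score is assigned to a reporting band. The equivalence values and band cut-offs used here are invented for illustration only; the actual equivalence tables are published by ACARA each year.

# Illustrative only: the equivalence table and band cut-offs below are invented
# values, not the actual ACARA figures published each year.

# Hypothetical equivalence table for one test: raw score -> scaled score
EQUIVALENCE_TABLE = {18: 330, 19: 342, 20: 354, 21: 366, 22: 379}

# Hypothetical lower bounds of the reporting bands on the common NAPLAN scale
BAND_LOWER_BOUNDS = [(1, 0), (2, 270), (3, 322), (4, 374), (5, 426), (6, 478)]


def scale_raw_score(raw_score: int) -> int:
    """Convert a raw score (number of questions correct) to a scaled score."""
    return EQUIVALENCE_TABLE[raw_score]


def band_for(scaled_score: int) -> int:
    """Return the highest band whose lower bound the scaled score reaches."""
    band = BAND_LOWER_BOUNDS[0][0]
    for b, lower_bound in BAND_LOWER_BOUNDS:
        if scaled_score >= lower_bound:
            band = b
    return band


raw = 20
scaled = scale_raw_score(raw)  # 354 in this invented table
print(f"Raw score {raw} -> scaled score {scaled}, band {band_for(scaled)}")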

Using NAPLAN reports to inform teaching and learning

Using scaled scores


Scaled scores can be used to compare the results of different students. They provide a basis for
measuring and comparing students’ abilities across years of schooling, for example comparing a
student’s result in Year 3 in 2016 and Year 5 in 2018. The scales can thus help to monitor the
growth of groups of students over time. This enables the school to review and/or consolidate
special programs that may have been put in place.
Principals and teachers should take care when making comparisons between small groups of
students. For groups of fewer than ten students, differences may not be reliable, particularly small
differences.

Using item analysis


While the national and state reports provide the comparative data, class reports provide a school
with the information that can be used to inform teaching and learning and to build capacity in
schools. Analysis of the NAPLAN class data, in particular the performance on each item, will

provide teachers with information about the understandings and patterns of misunderstandings of
students.
Looking at the performance on the items and then analysing the error patterns allows teachers
and principals to make hypotheses about why groups of students make particular errors. As
mentioned below, more detailed analysis by QCAA staff is available from the QCAA website.
Steps for analysis might be as follows:
 Compare the facility rates (percentage correct) achieved by the school’s students with the
national and state results available in this document (a short sketch of this comparison follows the list). Is their performance consistent?
 Look at the common errors made by their students and compare them with the common
errors made in the state. Only errors from Queensland students are available and are found
in the item analyses that are part of SunLANDA Online.
 Form hypotheses about why students are making these errors, e.g.
- How did students think about this aspect of the curriculum?
- What misunderstandings might these errors represent?
- How might the structure of the test question have shaped the response?
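
As a minimal illustration of the first two steps, the following sketch computes a class facility rate for each item and compares it with the state and national figures. The class responses and class size are invented for illustration; in practice the data would come from the class report and SunLANDA Online, and the benchmark percentages from the tables in this report.

# Illustrative only: the class responses below are invented; the benchmark
# facility rates are taken from tables like those in this State report.

class_responses = {
    # item number -> 1 (correct) or 0 (incorrect) for each student in the class
    4: [1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1],
    16: [0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 0],
}

benchmarks = {
    # item number -> (QLD facility %, Aust. facility %)
    4: (16.15, 16.88),
    16: (24.20, 26.46),
}

for item, responses in class_responses.items():
    # Facility rate = percentage of students answering the item correctly.
    # Remember: differences for groups of fewer than ten students may not be reliable.
    class_facility = 100 * sum(responses) / len(responses)
    qld, aust = benchmarks[item]
    gap = class_facility - qld
    note = "review error patterns" if abs(gap) > 10 else "broadly consistent with the state"
    print(f"Item {item}: class {class_facility:.1f}%, QLD {qld}%, Aust. {aust}% -> {note}")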
Using a combination of the NAPLAN data, school data and professional judgment, teachers may
then test these hypotheses to see whether they are valid or whether there is more to be thought
about and investigated. Teachers can then plan lessons related to the general areas where
students seem to need help. Teachers can also make judgments about teaching approaches and
curriculum.
The professional conversations that are part of this process are the most effective and powerful
way to use the data, as they are the vehicle for developing shared understandings.

Placing the tests in the assessment context


The results from the NAPLAN tests should be seen as only one input into a school’s assessment
program. Various forms of assessment are needed to inform the different stages of the teaching
and learning cycle. Principals and teachers should keep in mind that NAPLAN is a point-in-time,
timed test that can only cover a few curriculum features.
The results from a school’s own assessments of students should be consistent with the NAPLAN
test results. If the test results are different from what was expected, consider possible reasons.
The results of the tests may indicate aspects of student performance that need further
investigation within the classroom, using other forms of assessment.
An item with a low facility rate (percentage correct) may not necessarily indicate a problem in
teaching and learning. It may be that this was simply a difficult item for all students in this cohort
across Australia.

Other NAPLAN reports
In addition to the State reports, the following reports are produced about the performance of
Queensland students who sit the NAPLAN paper tests:

SunLANDA Online
Since 2015, student data has been released on the QCAA School Portal using the SunLANDA
Online interface. Access to SunLANDA as application software is also still available on the QCAA
website.
SunLANDA Online provides class and school information in an electronic form that permits
customised spreadsheet generation by users. In addition, it shows representative samples of
students’ incorrect responses to constructed responses where applicable. Hyperlinks from within
SunLANDA Online lead to the QCAA’s test item analysis. Information on how to use this service
is available at: www.qcaa.qld.edu.au/p-10/naplan/test-reporting-analysis/sunlanda/accessing-navigating-sunlanda.

Test item analysis


These PDF documents contain an analysis of each test item. They can be downloaded directly
from the QCAA website: https://www.qcaa.qld.edu.au/p-10/naplan/test-reporting-analysis/test-item-analysis. A school Brief Identification Code (BIC) and password are required to access these
documents. The analysis reproduces each item followed by expert analyses of how the item
operated. It shows the distractors presented in multiple-choice items and explains students’
reasoning.

School and class reports


The NAPLAN school and class reports are supplied electronically on the secure section of the
QCAA website. These reports are accessible only with the school’s Brief Identification Code (BIC)
login and password. Individual student reports are distributed to schools as printed copies.

School reports
The QCAA issues NAPLAN school reports giving information about each school’s performance.
They provide a summary of year-level performance as well as performance by gender, language
background and Indigenous status in the following fields:
 distribution of scaled scores
 distribution of achievement bands
 school and state means
 participation of the group.
The school report positions a school’s performance within the state on a graph that is shaded to
show the range of performance for the middle 60% of Queensland students together with the
state mean.

Class reports
The QCAA issues NAPLAN class reports that show the performance of every student on every
item. The items each student answered correctly and incorrectly are recorded under their name. The reports
also show students’ responses to constructed-response items.
The class report also gives the:
 percentage correct for each item for the class and state, and by gender

 scaled scores for each student
 performance bands for each student.

Individual student reports


The QCAA issues individual student reports to schools after the tests. Schools receive one
printed report for each student to distribute to parents/carers.

ACARA reports
As well as the Queensland reports from the QCAA, national reports are available from the
website of the Australian Curriculum, Assessment and Reporting Authority (ACARA). The
NAPLAN National Summary Report and the NAPLAN National Report allow states and territories
to place the achievement of their students in relation to their peers across the nation. This is
system-level information and is publicly available.

Literacy
Writing
Stimulus (writing prompt) Years 7 & 9

About the task


In 2018, the NAPLAN Writing test was based on the persuasive genre. As has been the case
since 2015, two prompts were used: one for Years 3 & 5 and another for Years 7 & 9. The test
conditions and administration remained the same as in previous years, i.e. teachers delivered the
same spoken instructions and read the text aloud to students. Working independently, students
had to plan, compose and edit a written response. Students were allowed five minutes to plan,
thirty minutes to write their script, and a further five minutes to edit and complete the task. Three
pages were provided for students to write a response.
The 2018 prompt for Years 7 & 9 was titled New technology. Students were asked, in the textual
component of the prompt, to:
 Write a persuasive text about a piece of technology that has been or will soon be invented
that will make life so much better.
 Persuade a reader why this device or machine, such as a self-driving car, a drone
helicopter or a fingerprint lock for a bike will benefit people.
Students were told they could use their own idea, or an idea mentioned or pictured on the page.

Additional textual information was provided. This named the structural components, and further
defined these elements, e.g. Start with an introduction. An introduction lets the reader know what
you are going to write about. Other notes were also provided in relation to the conventions
associated with the writing task, e.g. write in sentences, check and edit your writing. Five images
were shown on the prompt sheet, covering a range of possible topics/ideas such as wind turbines
(technology for renewable energy), mechanical artificial limbs that use technology and an app for
a mobile phone.

The prompt was relatively open-ended, allowing students to base their writing on one or more of
the images provided, or compose their text around their own idea.
Markers for this Writing test were trained using the national persuasive writing marker training
package, delivered as part of ACARA’s national assessment program. Markers were recruited
and trained in accordance with national protocols. Registered Queensland teachers marked the
NAPLAN Writing test scripts. All markers applied the 10 criteria and related standards from the
marking rubric. Writing test scripts were marked on screen in all states and territories. Stringent
quality-control measures were applied to the marking of student scripts, including a prescribed
percentage of scripts to be double-marked, and the daily application nationally of control scripts
for all markers. As part of the Queensland marking operation for 2018, referee marking continued,
further ensuring marking reliability. There was also provision for appeal over individual Writing
test scores after the results were released.
An earlier version of the NAPLAN Persuasive writing marking guide is available at:
www.nap.edu.au/_resources/Amended_2013_Persuasive_Writing_Marking_Guide_-With_cover.pdf

Performance
The 2018 writing prompt was particularly suited to many students in Years 7 and 9 as technology
is something they know about, have an interest in and have seen develop throughout their lives.
For many students it would be something they have looked at in their STEM subjects and studies.
The prompt provided some examples of technology for students and also allowed them to
select a topic of personal interest with which they had some familiarity. It appears that few
students had difficulty in finding a subject on which their text could be based, and the majority of
students went with suggestions from the prompt, particularly self-driving cars and fingerprint locks
for bikes.
Students typically introduced their subject (e.g. a piece of technology that will make life better) in
an opening paragraph, stating broadly why this piece of technology would make life better, then
provided information regarding the uses and benefits of the technology. The danger here was that
following the introduction, some students tended to move into informative text rather than
persuasive text. Conclusions then tended to focus on simple re-statements of main points
referred to in the body of the text. Often there was a very close parallel between the wording of
the introduction and conclusion.
If students did follow the informative line, without taking a stance on the subject in question, they
were deemed to be ‘off genre’, an outcome that may have had considerable impact on their
scores. Some students, particularly Year 9 students, adopted a ‘review’-type response, tending to
provide information rather than persuasive argument, defending their choice of technology.
Typically, this information was sandwiched between an introduction and conclusion which
reflected some persuasive elements. These students were deemed to be ‘on-genre’, though their
final scores were impacted by the absence of persuasive elements throughout their texts.
In most cases, students in Year 9 showed competence with the persuasive form. Introductions
provided more natural orientation for the reader, with quite passionate statements of position
often presented.

For example:
Have you ever looked up into the sky and wondered, how did we get here? Have you ever
seen the utter smile on a child’s face when they gaze at the effervescent stars? The love of
the outside universe makes our world seem so wonderful. Imagine if we had a vehicle that
could take you there. Ten billion light years in a flash! We could travel anywhere, see
anything, and explore to our absolute limits. Would you like to spend every day like a
holiday? Zooming through the stars, fascinating your inner child? Well, in the future, you’re
in for a must-have treat! (Year 9 student)
As the Writing test is an ‘on-demand’ task, where students are given 5 minutes planning time, 30
minutes to write their persuasive text and 5 minutes for editing and completing their work,
students frequently lacked some detail about the technology they were referring to. Though time
in a demand writing task obviously affects the way in which student writing concludes, many
responses included only a brief summation or re-statement of the text. Those students who, in
conclusion, challenged the reader to fully consider the arguments presented, produced highly
effective closure to their writing, as the following extract demonstrates:
The importance of self-driving cars is undeniable. However, sometimes they can do more
harm than good. Self-driving cars would make the world a much safer place and ensure
that cars are used more for good than for bad. By assisting tired drivers, transporting
unwell passengers and preventing reckless driving, these cars will make the world a
better place. The world is always changing, sometimes for better, sometimes for worse,
and self-driving cars are one significant invention that could improve the amazing world
that we call home. (Year 9 student)
Though the criteria are scored independently, the strong connection between audience, persuasive
devices and ideas in this type of task rewards students who are prepared to take on ‘the big idea’
and explore this in a more engaging way. Alternatively, students who wrote with passion and
commitment about a subject they knew very well, produced credible responses.
The NAPLAN marking rubric also allocates significant score points to the skills areas of sentence
structure, punctuation and spelling. Persuasive writing almost encourages the use of adverbial
clauses and phrases indicating causation and condition. Stronger scripts showed variety and
control over complex sentence forms, with ‘punchy’ simple sentences (and even sentence
fragments) occasionally used for marked effect. Punctuation, particularly sentence boundary
punctuation, still indicates room for improvement. The use of ‘run-on’ sentence forms, often
associated with the use of ‘splice commas’ as breaks, is a significant factor. Much of this has to
do with the shift from oral to written language modes. The formalities associated with writing, in
test conditions and elsewhere, need constant attention in writing programs. Punctuation can
easily become a casualty of contemporary communication forms if it is not dealt with through a
rigorous and contextualised writing program, closely associated with reading good-quality
published texts.
In 2018, the use of improbable data and quotations was less evident than in previous years.
Teachers should be wary of approaches to writing that suggest the inclusion of implausible data
or dubious supporting evidence attributed to some ‘invented’ authority. Markers are trained to
accept at face value what is on the page, but irrelevant or erroneous information does little to
support a writer’s point of view, particularly at the higher end of the writing spectrum.
Undoubtedly this year, students were more acquainted with the topics of their choice and could
‘flesh out’ arguments with plausible supporting detail. Some students’ field knowledge led them to
write quite detailed texts designed to persuade readers to agree with the stance they were taking
regarding the type of technological improvements that could be made.

Sample script

Commentary on sample script
This persuasive text is focused on the development of virtual reality. It is authoritative and
impassioned. The writer adopts an enthusiastic, expert stance, setting the scene by first
mentioning technology and technological advances, leading into the main subject of the script,
the rapidly growing medium of virtual reality.
The script orients the reader and provides sufficient information for the reader to follow easily.
The introductory paragraph informs the reader that this unique platform … is perfect for a wide
variety of tasks, and then proceeds to mention some of the ways virtual reality can be used.
These tasks are then elaborated in the body of the text.
The language throughout the text is chosen to encourage the reader to continue
reading to understand fully what the writer is informing them of. The writer takes the stance of an
expert and enthusiastically imparts the knowledge she has of the uses of virtual reality.
Following the introduction, the writer describes the use of virtual reality as a powerful and
intriguing teaching device and provides examples of different ways it could be integrated into the
classroom. This paragraph is developed to show the reader that virtual reality is not a toy but
rather a valuable teaching tool as it can clearly show students a range of realistic environments
as they are learning about various topics.
The next paragraph informs the reader that virtual reality teaching is still in its very early stages
and that there is another side to virtual reality that’s far more popular. The writer then informs the
reader of the artistic side of the medium, especially gaming, painting and sculpting.
The tone of the text is lively and excited as the reader is informed of all the possibilities for virtual
reality. Throughout the text the writer’s position is clear. The student effectively uses some words
and phrases to make the point that virtual reality is continually developing and persuades the
reader that it has many extensive and varied attributes.
Structurally, all textual elements work in a coordinated and deliberate fashion as the argument is
built. The text is highly cohesive, and the choice of text connectives and conjunctions strongly
contribute to this, together with lexical chains that effectively link the stages of the text. The
vocabulary is well suited to the overall flavour of this text.
As with other aspects of the text, the skills areas of spelling, punctuation, paragraphing and
sentence structure are well controlled, making this script an excellent example of a demand
writing task.

Language conventions
Spelling
Results and item descriptions
The percentage columns give the facility rate (percentage correct).

Item Answer QLD % Aust. % Description

Proofreading — Error identified
1 garage 90.71 91.57 Corrects the spelling of a two-syllable word ending in -age.
2 research 89.20 88.78 Corrects the spelling of a two-syllable word with ear.
3 subscription 75.48 74.88 Corrects the spelling of a three-syllable word with scrip.
4 familiar 61.44 62.68 Corrects the spelling of a three-syllable word ending in -liar.
5 incredible 68.70 68.66 Corrects the spelling of a multisyllable word ending in -ible.
6 cereal 52.71 54.93 Corrects the spelling of a three-syllable homophone starting with ce-.
7 distressed 47.99 49.56 Corrects the spelling of a two-syllable word ending in -ssed.
8 monitored 42.19 43.79 Correctly spells a three-syllable word ending in -ored.
9 feud 35.53 38.94 Corrects the spelling of a single-syllable word with eu.
10 maintenance 26.05 26.63 Corrects the spelling of a three-syllable word ending in -enance.
11 adjoining 25.94 26.70 Corrects the spelling of a three-syllable word starting with adj-.
12 grievances 28.50 28.01 Corrects the spelling of a three-syllable word ending in -ances.

Proofreading — Error not identified
13 telescope 80.00 80.78 Corrects the spelling of a three-syllable word starting with tele-.
14 wisdom 71.77 75.05 Identifies and corrects an error in a two-syllable word with is.
15 specialist 66.77 67.15 Identifies and corrects an error in a three-syllable word ending in the suffix -ist.
16 infectious 57.63 59.87 Identifies and corrects an error in a three-syllable word ending in the suffix -ious.
17 casualty 54.67 55.71 Identifies and corrects an error in a three-syllable word ending in -ualty.
18 suburban 46.29 47.70 Identifies and corrects an error in a three-syllable word ending in -an.
19 scenario 35.95 36.56 Identifies and corrects an error in a multisyllable word starting with sc-.
20 juvenile 33.64 33.96 Identifies and corrects an error in a three-syllable word ending in -enile.
21 dissatisfaction 23.10 23.05 Identifies and corrects an error in a multisyllable word with the prefix dis-.
22 intrepid 7.87 8.70 Identifies and corrects an error in a three-syllable word ending in -id.
23 guarantee 19.55 20.67 Identifies and corrects an error in a three-syllable word starting with gu-.
24 indispensable 20.82 21.11 Identifies and corrects an error in a multisyllable word ending in -able.
25 plagiarism 2.59 3.57 Identifies and corrects an error in a multisyllable word with iar.

About the test


The 2018 Year 9 test focused on the following spelling features:
 affixes: familiar, incredible, monitored, maintenance, grievances, adjoining, specialist,
infectious, casualty, suburban, dissatisfaction, intrepid, indispensable
 affixes with consonant alternation: subscription, wisdom
 complex vowels and consonants: research, feud, guarantee, plagiarism
 Greek and Latin elements: telescope, scenario, juvenile
 open/closed syllable: garage
 homophone: cereal (serial)
 tense inflection: distressed.
Both sets of questions in the Spelling test use proofreading formats. The target words in the first
set are misspelt and identified by being circled. Those in the second set are misspelt but not
identified, so students need to find them inside sentences containing other words (distractors)
that are correctly spelt. These supplied misspellings may lead students to spell differently from
when they write to dictation or compose their own sentences.

Performance
Compared to the national average, Year 9 students in Queensland performed only slightly lower
on most words. They did marginally better on the words research, subscription, incredible and
grievances. Compared to previous Queensland cohorts, this year’s Year 9 students performed
slightly better in Spelling.
Between 10% and 15% of students omitted any response to the last eight words. The reading
load and vocabulary demand in these items was high.

Facility rates fell below 10% for the words plagiarism and intrepid, indicating that few students in
the cohort have knowledge of the words.
Common student errors show the following:
 Instead of treating suffixes as meaning chunks with set spellings, many students still
attempt to ‘sound them out’. This resulted in misspellings such as specialest, monitered,
protestor and familier.
 Many students still struggle with the suffixes -ible/-able and -ence/-ance (e.g. incredible,
indispensable, maintenance, grievances).
 There was a surprising difficulty with the word infectious (infectous 11%, infectus 3%).
 Many students misspelt the base word parts of the words grievance, casualty and
scenario (grieve, casual and scene).
 For some of the items in the second set (e.g. guarantee, dissatisfaction and specialist), a
very large percentage of students changed the correct spelling of the distractor words
(e.g. refrigerator, protesters and referred) instead of trying to spell the target word.
 Many students may have been influenced by the supplied misspelling of the word
distressed as distrest.

Implications for teaching

Test-wiseness
Ignore supplied errors: Students should avoid being influenced by the supplied misspelling of a
word. Instead, they should focus on their own spelling knowledge and strategies.

Word study
Students need to build a mature vocabulary. Word study belongs in all subjects, not just English.
Students should learn:
 the pronunciation of advanced words and their look on the page
 the meaning of their component morphemes (word parts)
 other words built from the same stem.
Students may learn to avoid inappropriate ‘sounding out’ strategies if they know that suffixes and
inflections have grammatical effects and stable spellings. For example, they should be able to
reason that the final syllable of specialist contains the agent noun suffix -ist, not the superlative
adjective suffix -est.
In the light of the poor facility with the word infectious, teachers in subject English might focus on
the regular patterns for suffixes. For example, the suffix -ous belongs with other suffixes that form
adjectives, such as -y, -able, -acious, -ic, -al, -ish, and -est. From another angle, -ous words can
be divided into those with a /sh/ sound, e.g. suspicious, and those without, e.g. indigenous.
Exceptions, rarities and oddballs should be noted as well as the general patterns.

Proofreading
Proofreading skills should be taught as an authentic writing skill. This will incidentally help
students read test questions carefully and avoid being misled by supplied errors.

QCAA resources
Full analysis of student performance and error patterns for each item is published in the
SunLANDA program: https://www.qcaa.qld.edu.au/p-10/naplan/test-reporting-analysis/sunlanda/accessing-navigating-sunlanda and as PDF documents:
https://www.qcaa.qld.edu.au/p-10/naplan/test-reporting-analysis/test-item-analysis. A school BIC
and password are required to access each year level document.

Grammar and punctuation
Results and item descriptions
The percentage columns give the facility rate (percentage correct).

Item Answer QLD % Aust. % Description

26 B 92.22 93.05 Identifies subordinating conjunction to complete a complex sentence.
27 D 83.96 85.15 Identifies a sentence from a review containing an opinion.
28 C 82.59 81.44 Identifies a simple sentence that needs quotation marks.
29 C 83.84 84.68 Identifies a complete sentence.
30 B 79.31 79.48 Identifies the reference for a pronoun in a short passage.
31 A 69.58 70.51 Identifies a sentence with an embedded clause error.
32 B 58.78 55.76 Identifies the correct punctuation of quoted speech with internal attribution.
33 D 64.59 64.29 Identifies the sentence expressing certainty.
34 A 59.44 60.47 Identifies the correct use of a colon.
35 A 71.39 74.23 Identifies the correct placement of commas in a list with single items.
36 D 61.82 63.01 Identifies a sentence that correctly combines information from three separate sentences.
37 A 53.82 53.45 Identifies a preposition in a simple sentence.
38 C 37.37 38.18 Identifies a sentence without a parallel construction error.
39 B 40.71 40.32 Identifies an adverb in a simple sentence.
40 B 35.59 35.42 Identifies the correct use of apostrophes of possession for plural nouns.
41 D 47.98 46.62 Identifies an adverb in a complex sentence.
42 D 20.50 20.16 Identifies a colon as the correct punctuation required in a complex sentence.
43 B 25.20 23.36 Identifies the correct use of dashes for a list in a simple sentence.
44 C 24.84 23.12 Identifies an embedded adjectival clause.
45 A 24.41 25.43 Identifies jargon in a compound sentence.
46 A 27.73 29.66 Identifies that a verb is missing in a simple sentence.
47 BD 11.84 11.47 Identifies semicolons as correct punctuation to separate extended items in a list.
48 C 19.03 18.52 Identifies the non-finite verb to complete a complex sentence.
49* 2,1,2,2 15.95 15.52 Identifies four sentences as being in active or passive voice.
50 D 13.61 10.47 Identifies a noun phrase in a complex sentence.

* Item 49: Answers in sequence: passive, active, passive, passive.

About the test


The NAPLAN Language conventions items test sentence-level, clause-level and word-level skills.
The test does not cover the whole curriculum. Instead, it tells how a large number of students perform
on a small range of tasks. Standardised tests can, however, suggest broad trends across a
cohort. At the level of individual students, NAPLAN results can supplement classroom
assessments and guide teachers to important points of grammar and punctuation that need
revisiting. The Language conventions test comprises 50 items in total: 25 Spelling items and 25
Grammar and punctuation items.

In 2018, about half of the questions focused on students’ knowledge of grammar and punctuation
terminology (the ‘metalanguage’). The number and complexity of
metalanguage questions has been increasing over the years.
Students needed to know the terms for the parts of speech to answer the following questions:
 Item 37 on prepositions; Item 39 and Item 41 on adverbs; Item 46 on verbs
 Item 50 on noun groups.
These questions sometimes also referred in the distractor options to the names of other parts of
speech, e.g. preposition, conjunction, adjective, verb, noun.
Students needed to know the terms for structural parts of sentences to answer the following:
 Item 31 on clause order (i.e. a relative clause follows the subject noun)
 Item 44 on adjectival (relative) clauses
 Item 49 on active/passive voice.
These questions sometimes also referred in the distractor options to the names of other structural
elements, e.g. adverbial clause, main clause, adverbial group and adjectival phrase.
Students needed to know the terms for punctuation marks to answer the following questions:
 Item 28 on quotation marks for direct speech and attribution
 Item 42, Item 34 and Item 47 on colons and on semicolons (in complex lists)
 Item 43 on dashes (i.e. em rules separating a parenthetical phrase).
The following questions could be answered without knowledge of metalanguage:
 Item 26 and Item 36 on conjunctions
 Item 29 on subordinate clauses (as fragments needing a main clause)
 Item 30 on pronoun reference
 Item 32 on punctuating and attributing direct speech
 Item 35 on list commas

 Item 38 on parallel structure
 Item 40 on punctuating possessive pronouns, plural possessives and contracting ‘it is’
 Item 46 on the subject–verb–predicate structure and on gerund phrases
 Item 48 on participial clauses.
For information about the full range of grammar knowledge that Year 9 students should have,
refer to the Australian Curriculum English.

Performance
Year 9 students in Queensland performed marginally higher than the national average in many
items. Compared to previous Queensland cohorts, Year 9 students performed a little better.
Many students, including many of the high-performing ones, struggled with the metalanguage
questions. The reasons for this vary. In the case of Item 42, the low facility is caused not by the
terms ‘colon’ and ‘semicolon’ themselves but by a lack of knowledge of how to use these punctuation
marks. However, the low facility for Item 49 is likely due to the special grammar analysis task it
sets. In the case of Item 50, even many students who understood the terms ‘noun group’ and
‘main clause’ did not know how to interpret this example. The low facility and the weak statistical
‘fit’ for Item 46 (on parts of speech) may also have been due to the item format. Few students
omitted responses to any items, except for Item 45.

Implications for teaching


Grammar and punctuation are not a separate area of reading and writing, but an essential
component of both. The contexts of language in use are pivotal to a full understanding of how syntactic,
grammatical and punctuation conventions operate. Wherever possible, meaning and purpose in
texts should be emphasised, noting how particular language conventions contribute to making
meaning.

The Australian Curriculum English allows for the progressive teaching of knowledge and
understanding of grammatical concepts and punctuation conventions. Teaching the curriculum
implies teaching the terminology of grammar (metalanguage) where this helps the effective use of
language to communicate in different contexts and for different purposes.

QCAA resources
Full analysis of student performance and error patterns for each item is published in the
SunLANDA program: https://www.qcaa.qld.edu.au/p-10/naplan/test-reporting-analysis/sunlanda/accessing-navigating-sunlanda and as PDF documents:
https://www.qcaa.qld.edu.au/p-10/naplan/test-reporting-analysis/test-item-analysis. A school BIC
and password are required to access each year level document.

Reading
Results and item descriptions
The percentage columns give the facility rate (percentage correct).

Item Answer QLD % Aust. % Description

Dear Mr Mitchell
1 C 95.53 95.97 Identifies the author of a persuasive letter.
2 C 85.69 88.98 Interprets a cohesive reference used in a persuasive letter.
3 B 96.79 97.33 Locates directly stated information in a persuasive letter.
4 B 16.15 16.88 Evaluates the purpose for the use of italics in a persuasive letter.
5 D 89.47 89.93 Interprets techniques used by an author to position a reader in a persuasive letter.
6 A 27.95 29.49 Interprets a rhetorical device in a persuasive letter.

Our world of weird and wonderful plants
7 D 79.85 78.01 Identifies the reader's expected reaction in an information text.
8 A 91.56 92.02 Identifies the way the writer introduces an information text.
9 C 64.24 65.56 Identifies the writer's focus in a section of an information text.
10 B 75.34 76.34 Identifies information not given in an information text.
11 D 86.16 87.30 Identifies an exception from comparisons in an information text.
12 B 43.56 45.01 Identifies a definition in an information text.

Graffiti – is it art?
13 A 88.32 89.57 Identifies the reasoning behind an author's point of view in an argument.
14 D 53.27 57.38 Identifies a persuasive technique in an argument.
15 B 63.19 65.22 Identifies evidence used to support a conclusion in an argument.
16 C 24.20 26.46 Identifies an example of emotive language in an argument.
17 B 67.92 70.24 Identifies a belief in an argument.
18 A 59.33 60.77 Compares attitudes represented in two arguments.

The high life
19 D 67.95 69.81 Identifies the reason for information in brackets in a factual description.
20 C 40.19 39.94 Interprets directly stated information in a factual description.
21 C 71.67 73.98 Interprets information in a factual description.
22 B 39.88 40.14 Identifies the effect of an intensifier in a factual description.
23 B 62.26 63.90 Interprets information in a factual description.
24 C 53.37 56.49 Interprets information in a factual description.

The forging
25 C 84.11 85.90 Interprets the purpose of text structures and language features in a narrative.
26 D 45.92 45.30 Interprets figurative language in a narrative.
27 B 52.31 53.78 Interprets technical language in a narrative.
28 D 69.64 69.92 Identifies a character's mood change across a narrative.
29 A 60.19 61.04 Synthesises information from dialogue and description in a narrative.
30 B 47.60 48.23 Interprets dialogue in the final two paragraphs of a narrative.
31 4,1,3,2 29.42 29.31 Identifies the sequence of events in a narrative.

Computer gaming: sport or not?
32 B 42.97 45.12 Evaluates the central argument in a discussion transcript.
33 C 67.28 69.22 Identifies supporting evidence in a discussion transcript.
34 D 72.59 74.57 Identifies a speaker's argument in a discussion transcript.
35 A 48.10 50.41 Analyses how a speaker uses persuasion in a discussion transcript.
36 A 52.20 53.28 Identifies how a speaker uses persuasion in a discussion transcript.
37 AC 6.88 7.40 Synthesises information from across a text in a discussion transcript.

Lost for words
38 AE 24.08 26.17 Identifies how a title reflects aspects of a speech with internal commentary.
39 BD 40.96 44.10 Infers a character's perspective in a speech with internal commentary.
40 C 74.50 76.87 Analyses the effect of language devices in a speech with internal commentary.
41 D 39.67 39.91 Interprets an allusion in the middle of a speech with internal commentary.
42 BCD 6.40 7.31 Identifies a character's perspective in a speech with internal commentary.
43 ABGH 29.69 32.22 Interprets an implied meaning in a speech with internal commentary.

Nonsense rules!
44 A 47.35 48.31 Analyses the effect of how the writer introduces a topic in a complex essay.
45 C 56.28 58.76 Extracts the meaning of less familiar vocabulary in the middle of a complex essay.
46 B 41.16 42.35 Extracts key information and ideas from the middle of a complex essay.
47 C 27.93 28.33 Identifies the writer's opinion of the subject of a complex essay.
48 AD 12.50 13.00 Applies understanding of complicated ideas from a complex essay.
49 T, F, T, T 25.58 26.66 Identifies the validity of key information from across a complex essay.
50 CDE 4.20 4.46 Infers multiple meanings from the title of a complex essay.

About the test


The 2018 Reading test consisted of 50 items based on eight reading magazine units spanning
several genres. There were no short-response items. Most items were in a standard four-option,
multiple-choice format. However, this year’s test included eight items (up from four last year)
which required multiple responses. Sometimes the student is instructed to Choose two,
sometimes to Choose all, which further increases the difficulty for students.
It is important to note that this type of item does not reward partial accuracy. As in previous years,
there was also an item that required students to indicate how information is sequenced in a text.
As is usually the case, there was a pattern of increasing difficulty as students progressed through
the test. Facility rates decline as students progress from the first reading unit to the last as the
reading demands generally increase, distractors appear to be more sophisticated and item
construction becomes more complex.
Rather than categorise the reading units by genre, it is more useful to categorise the items by
question type (see table below). This is partly because many of the reading stimulus texts contain
features of multiple genres. Furthermore, the error analysis of individual items sometimes shows
that students will select a distractor because it alludes to a genre. Even capable students will be
drawn to these distractors.
The Year 9 Reading test contained two link units with Year 7: The forging and Lost for words.

Categorising items by question type puts the focus on the more relevant reading strategies. For
example, an examination of the question types makes it apparent that simply underlining or
highlighting text may only be useful for one of the seven question types — Literal-recall
questions. See the Implications for teaching section in this text, and SunLANDA for a range of
reading strategies that are specific to the types of questions.

Question type: The reader is asked to:

Literal
- Recall: Recognise or recall information.
- Translation: Change information into a different form — it might involve paraphrasing the ideas or restating them in terms or forms other than those in the text.

Text-based inferential
- Interpretation: Identify the relationships among ideas, definitions, facts and values — these would involve such relationships as comparisons and cause and effect; they involve a minimum of higher-order thinking as the reader/learner needs only to respond to and manipulate ideas in the text.

Higher-order (Context-based) inferential
- Application: Solve real-life problems by extrapolating what is in the text — readers/learners need to combine ideas from the text with prior knowledge.
- Logical analysis: Analyse and judge the quality of the logic inherent in the text — readers/learners might, for example, identify fallacies or points of view represented in a text.

Creative
- Synthesis: Respond to a problem or idea with original and creative thinking.
- Evaluation: Make judgments with respect to specific criteria.

Performance
The level of difficulty increased as students progressed through the Reading test, with the first
units having slightly higher facility rates than the following units. This pattern was more pronounced in
other year levels than in the Year 9 Reading test. Performance in link items from The forging and
Lost for words ranged from 0.4% to 13.12% higher when compared to Year 7 performance.
The first two units, Dear Mr Mitchell and Our world of weird and wonderful plants, contained a
range of question types and were generally quite easy for many Year 9 students except for Item
4. Item 4 required students to evaluate the purpose of the use of italics, which required a context-
based inference; part of the difficulty is that it requires higher-order inference. Another
reason this was quite difficult for students is that one of the distractors was very difficult to
eliminate. Even though this item appeared to refer to one very specific part of the text, any
context-based inferential question requires an understanding of the whole text. This is a point
which many of the students appeared to miss.
The third unit, Graffiti – is it art?, was more difficult than the first two units. This unit had two items
that required creative thinking (question type). One of these, Item 16, proved to be the most
challenging in this unit. Several variables contributed to the difficulty of this item. Options were
drawn from both texts, Graffiti is art and Graffiti is vandalism. Students were required to know
what is meant by ‘emotive language’ and how to apply it in the context of the text. The item
analysis shows that able students struggled with this item. They had difficulty recognising that
identifying language as emotive requires reading that language in its context. Many students
attempted to apply this in a lexical sense without referring to the context. Once again, the
implication is that students should respond to Reading test questions using the text.

The high life and The forging had comparable average facility rates and question types. The
forging was a link unit from Year 7. On average, Year 9 students performed 7% better than Year
7 students. The lowest facility rate in these two units was for Item 31, which interestingly was a
literal-translation question type. This is likely because it is the first non-conventional item format in
the Reading test. It requires students to sequence four events. Many students were tempted to
sequence the events as they occurred in the text. These students did not understand that the first
event was one that appeared in the second paragraph of the text. This is often the case with
sequencing items in the Reading test.
The next unit, Computer gaming: sport or not?, was slightly more difficult for students. It
contained the third most difficult item in the Year 9 Reading test. Item 37 had a 6.88% facility rate
because it required higher-order inference and because it required students to select two keys.
Higher-order inference questions are generally more challenging because students are required
to synthesise information from across a text.
The next unit was Lost for words. Year 9 students appeared to find Item 42 very challenging, as they barely achieved a higher facility rate than Year 7 students on the same item. There were several difficulties with this item. One was that it required the selection of an unspecified number of correct options, three in this instance. Another was that readers had to identify multiple perspectives of the writer, and some struggled to distinguish the writer’s actual perspective from what she stated. The very low facility rate for this item suggests that many Year 9 students may not have developed the ability to identify a character’s perspective and/or to manage item types like this.
In the final unit, Nonsense rules!, Item 50 was the most difficult item in the entire test (facility rate 4.2%). The omission rate was barely higher by this point, so most students were still engaged throughout the test. The complexity of the item is increased by the requirement to supply an unknown number of responses; in this case, three of the five options constitute the key. Option E, which was selected by the fewest students, may have been missed because students found it difficult to combine ideas from the entire text. Also, as was seen with other items, many students attempted to respond using prior knowledge without referring adequately to the text.
In summary, Year 9 students had 50 questions in the Reading test: twelve of these were literal, twenty-two required text-based inference, fourteen required context-based inference and two were creative. The average facility rate for creative questions was slightly higher than for context-based inference questions, but it should be noted that there were only two questions of this type. Otherwise, item difficulty increases with the degree of inference required. The item analysis shows that complexity also depends on the reading difficulty of the stimulus text and the sophistication of the distractors.

Question type                                                Average facility rate   Number of items
Literal (recall and translation)                             72.10                   12
Text-based inference (interpretation)                        52.64                   22
Context-based inference (application and logical analysis)   39.01                   14
Creative                                                     41.77                   2

The items that involved purpose, tone and character generally had lower facility rates than literal and lower-order inferential items because they required higher-order reasoning and comprehension. Students need to form an understanding of the whole text as well as pay attention to the subtle clues in the text that help them make these inferences. Implications for teaching should reflect these demands. It is also worth noting that Year 9 students did not perform as well on all the Year 7 link items as might be expected.

Implications for teaching


The lower facility rates on non-literal items demonstrate the importance of giving students strategies to help them make inferences as they read, i.e. to make statements about the unknown based on the known.
The challenge for teachers is to get students to annotate texts in the classroom and discuss them in groups so that they can see how all the parts of the text contribute to the meaning of the whole. This is the time to discuss the patterns in the text (e.g. cause and effect), identify connections between ideas, and establish the two or three main parts of the text and how those parts contribute to the overall meaning. All of this should occur before students begin a close study of the text. Students will handle the distractors in the items much better if they are clear about the subject matter and the purpose of the text before they proceed to the items.
Teachers can encourage students to read for pleasure and recreation to extend their knowledge
of themselves and the world around them. Reading develops empathy for characters and people
in difficult situations. Students also need to be able to participate confidently in a close study of a
text, to check for fallacies and persuasive techniques, and to identify emotive language and
literary techniques. The goal should be for students to be discerning and capable readers and
confident speakers and writers about those texts.
The complexity of the reading process is made visible when students discuss texts and how they
arrive at their personal understanding of the text. Teachers are the facilitators of this process, not
the leaders. Their focus should be on:
• finding authentic texts which engage their students
• providing a range of genres and texts from classic or traditional texts to texts with postmodern elements
• promoting higher-order questioning of texts (both set texts for special study and unseen texts for close study)
• reading aloud to students to promote reading for pleasure (sometimes at Year 9 this is forgotten)
• developing an awareness of how the parts of the text combine to create a whole through both semantic (links between the ideas) and syntactic (grammatical links) cohesion
• encouraging students to make inferences as they read (an informed guess backed by evidence or a statement about the unknown based on the known)
• encouraging the link between reading and writing by asking students to regularly write analytical paragraphs about an aspect of what they have read (which includes a controlling central idea) in response to the question, e.g. Can this character be trusted? Is there a shift in tone in this text?
• encouraging students to look deeper into a text by drawing on analytical skills, e.g. explore gaps and silences, consider writer bias, look at contradictions within the text, look for themes and hidden purposes, and re-read the text from another perspective (e.g. a contemporary, feminist or eco-critical perspective).

QCAA resources
Full analysis of student performance and error patterns for each item is published in the SunLANDA program: https://www.qcaa.qld.edu.au/p-10/naplan/test-reporting-analysis/sunlanda/accessing-navigating-sunlanda and as PDF documents: https://www.qcaa.qld.edu.au/p-10/naplan/test-reporting-analysis/test-item-analysis. A school BIC and password are required to access each year level document.
Additionally, further advice and support can be found in QCAA 2015, Beyond NAPLAN: How to read challenging texts, Beyond NAPLAN series: www.qcaa.qld.edu.au/downloads/p_10/naplan_read_challenging_texts.pdf.

Numeracy
Results and item descriptions
The Numeracy strands from the Australian Mathematics Curriculum are abbreviated as follows:
Number and Algebra (NA); Measurement and Geometry (MG); Statistics and Probability (SP).
All items are worth one score point. However, there is a range of difficulty across the items.
The percentage columns give facility rates (percentage correct) for each item.

Item  Strand  Answer  QLD %  Aust. %  Description

Calculator-allowed items
1     MG      A       93.05  92.93    Identifies the top view of a collection of cylinders.
2     MG      A       81.59  82.48    Converts measurements to a common unit to compare lengths.
3     SP      B       77.23  78.13    Describes the effect of removing an outlier on the mean of a set of data.
4     SP      B       65.73  68.61    Selects the most likely results in a spinner experiment.
5     NA      50      72.01  72.52    Calculates the difference between two integers.
6     NA      A       70.16  72.93    Calculates the profit made in a simple financial plan.
7     MG      B       62.78  65.39    Solves a timetable problem involving 24-hour time.
8     NA      88.5    30.54  32.07    Divides and multiplies a decimal by whole numbers in context.
9     NA      A       78.98  79.04    Selects a graph showing a constant rate of decrease to match a real situation.
10    MG      B       77.41  79.22    Calculates the perimeter of a kite from one side length and the ratio of its sides.
11    NA      A       60.29  62.78    Uses multiples of whole numbers to solve a problem in context.
12    MG      E       58.43  59.48    Locates the relative position of a point on a grid given compass directions.
13    NA      D       52.16  55.02    Solves a division problem to determine the remainder.
14    NA      C       61.35  64.11    Subtracts decimals and multiplies by one million.
15    NA      C       71.18  71.94    Evaluates an algebraic expression by substituting a value into the equation.
16    MG      D       60.36  59.96    Identifies the face opposite a given face on the net of a cube.
17    MG      C       58.61  59.97    Calculates the size of angles using properties of angles on parallel lines.
18    SP      C       66.10  67.51    Identifies the most representative sampling strategy.
19    SP      B       47.89  49.36    Calculates the probability of the complement of an event using a Venn diagram.
20    NA      B       50.98  51.33    Adds and subtracts fractions with unlike denominators in context.
21    NA      C       39.73  41.94    Solves a problem using simple ratios and the addition and subtraction of decimals.
22    NA      B       37.74  37.76    Determines the rule for a linear relationship on the Cartesian plane.
23    NA      D       40.99  43.24    Selects the appropriate order of operations to give a correct expression.
24    SP      150     21.55  25.01    Identifies the median from a stem-and-leaf plot.
25    NA      D       34.19  36.39    Uses algebraic reasoning to solve a problem involving the four operations.
26    MG      95.8    23.01  27.28    Calculates the circumference of a circle given its diameter.
27    NA      C       34.64  35.90    Compares items of different size by determining the unit price of each.
28    MG      D       24.79  27.14    Identifies the triangle congruent to a given triangle and correctly labelled.
29    SP      D       44.40  48.62    Uses data from a sample to predict the most likely results for a population.
30    MG      693     6.62   9.06     Calculates the volume of a solid given the area of its faces.
31    MG      C       18.98  18.76    Calculates the capacity of a cylinder to solve a problem involving rates.
32    MG      AC      19.26  19.38    Rotates and reflects a kite on the Cartesian plane and identifies its properties.
33    MG      42      12.39  14.05    Calculates the length of a diagonal of a rectangle using multiplicative reasoning.
34    NA      25      8.87   9.58     Interprets a non-linear graph to solve a multistep problem.
35    SP      0.16    7.71   8.31     Calculates the probability of a two-step experiment using a tree diagram.
36    NA      CF      9.09   12.25    Applies the index laws to a numerical expression with integer indices.
37    SP      0.15    4.87   5.96     Uses complementary events and the sum of probabilities to solve a problem.
38    MG      144     6.77   8.29     Solves a problem using the perimeter of a trapezium and parallel lines properties.
39    NA      4       6.03   7.22     Calculates the simple interest rate given the principal, time and amount accrued.
40    NA      112.5   4.99   6.47     Uses direct proportion to solve a problem involving profit.

Non-calculator items

1     NA      C          76.37  78.61    Divides a decimal by a whole number.
2     NA      D          59.65  62.05    Converts between fractions and percentages.
3     NA      530        27.70  31.27    Calculates a simple fraction of a quantity where the result is a whole number.
4     NA      C          34.52  36.27    Solves a problem involving the combination of multiples of 3, 7 and 9.
5     NA      36.55      20.66  23.23    Multiplies, adds and subtracts decimals without digital technologies.
6     NA      3 213 000  15.26  16.92    Calculates the cost of an item after a percentage discount.
7     NA      CE         17.41  19.53    Applies the laws and properties of arithmetic to algebraic terms and expressions.
8     NA      1.6        8.19   10.25    Solves a problem involving dividing a quantity in a given ratio.

About the test


The Year 9 Numeracy test consisted of 48 items covering concepts and skills from the Australian
Mathematics Curriculum across two tests — a Calculator-allowed (CA) paper with 40 items and a
Non-calculator (NC) paper with 8 items. However, not all items in the Calculator-allowed paper
required the use of a calculator.
Student results for Numeracy in Year 9 are reported as a single score.
The distribution of the 48 items across the Australian Mathematics Curriculum strands was:
• Number and Algebra (26 items)
• Measurement and Geometry (14 items)
• Statistics and Probability (8 items).
All 8 items in the Non-calculator paper were from the Number and Algebra strand.
Over the entire test of 48 items, 32 (67%) were multiple-choice, with the remaining 16 items
(33%) requiring students to arrive at their own answers (constructed responses). Interpretation of
tables, graphs or diagrams was required in 27 of the 48 items (56%). The other 21 items (44%)
were word problems that usually incorporated numerals within the information and question
stems.

Performance
This year, 95.4% of Queensland Year 9 students scored at or above the national minimum standard (Band 6), compared with 95.5% of Australian students overall. Queensland's mean scale score shows a steady trend of improvement over the past decade.
The percentage of students who correctly answered Calculator-allowed items ranged from 93%
(Item 1) down to 5% (Items 37 and 40). Item 37 required students to use knowledge of the
properties of complementary events and subtract their probabilities to deduce the probability of a
stated event. The other most difficult question for students, Item 40, involved extrapolating data in
a table involving two separate rates, one decreasing and one increasing, to determine the profit
made on selling several items.
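The reasoning Item 37 called for can be illustrated with a hypothetical example (the actual test values are not reproduced in this report): because the probabilities of all possible outcomes of an experiment sum to 1, a missing probability can be deduced by subtraction.

\[
P(A) + P(B) + P(C) = 1 \quad \Rightarrow \quad P(C) = 1 - P(A) - P(B) = 1 - 0.45 - 0.40 = 0.15
\]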
For the Non-calculator items, the percentage of students answering correctly ranged from 76% (Item 1) down to 8% (Item 8). Item 8 required students to determine the number of grams of copper, expressed as a decimal, contained in 8 grams of an alloy whose composition was presented as a three-part ratio involving three metals.
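A worked sketch of this style of ratio reasoning, using an invented ratio rather than the actual test figures: if an alloy contained copper, tin and zinc in the ratio 1 : 2 : 2, copper would account for 1 part in 5, so in 8 grams of the alloy:

\[
\text{copper} = \frac{1}{1 + 2 + 2} \times 8\ \text{g} = \frac{8}{5}\ \text{g} = 1.6\ \text{g}
\]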
Twenty-one of the 40 Calculator-allowed items were answered correctly by 50% or more of
Queensland students. Similarly, four out of the 8 Non-calculator items were answered correctly by
50% or more of the Queensland students.
There was a difference of more than 4% between the facility rates (the percentage of correct responses) of the national cohort and the Queensland cohort on only two items (26 and 29). In Item 26, Queensland students performed 4.3% below the national cohort, and in Item 29 they were 4.2% below the national mean. Item 26 was from the Measurement and Geometry strand: students had to calculate the circumference of a drum, given its diameter in centimetres, to one decimal place. Item 29 was from the Statistics and Probability strand: students needed to take a proportion from a sample of students and apply it to the whole population.
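For Item 26, the required relationship is the standard circumference formula C = πd. With an assumed diameter of 30.5 cm (the actual test value is not reproduced in this report), the calculation would take the form:

\[
C = \pi d \approx 3.1416 \times 30.5\ \text{cm} \approx 95.8\ \text{cm} \ \text{(to one decimal place)}
\]

A quick check that the answer is roughly three times the diameter is the kind of reasonableness check discussed later in this section.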
Queensland students performed equal to or above the national cohort on four items of the
Calculator-allowed paper: Items 1, 16, 22 and 31. In the Non-calculator paper, Queensland
students performed below the national cohort on all eight items. However, the difference in facility
rates was small (less than 4%).
While the majority of students attempted to answer all questions, many students omitted the more
challenging items towards the end of each paper. The items with more than a 25% omission rate
were items 33, 35, 37, 38, 39, and 40 in the Calculator-allowed paper and Item 8 in the Non-
calculator paper. All of these items required a constructed response rather than selecting an
answer from given multiple-choice options. These items are designed to differentiate student
performance — to provide opportunities for higher performing students to demonstrate their ability
to reason mathematically. Item 35 had the highest omission rate, at 35%. It involved a tree diagram with four branches; the probabilities on each branch were stated as fractions with different denominators, and the answer had to be expressed as a decimal to two decimal places. The item required students to interrogate several pieces of written information about a swimming team with two coaches, the number of swimmers they each train, and the random selection of one swimmer into a swimming team.
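The underlying technique is to multiply the probabilities along each branch of the tree diagram and, if more than one branch satisfies the question, add the results. A minimal sketch of one branch, using invented fractions with unlike denominators rather than the actual test values:

\[
P(\text{trains with coach A and is selected}) = \frac{4}{10} \times \frac{2}{5} = \frac{8}{50} = 0.16
\]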

Implications for teaching


Over the entire Numeracy test of 48 items, there were only four items (22, 28, 32 and 36 of the Calculator-allowed paper) that could be considered purely mathematical. The other items presented problems set in real-life contexts. Therefore, it is important that teachers provide learning activities and tasks that, where possible, allow students to investigate mathematics and develop solutions to problems set in real-life contexts. It is also recommended that
teachers seek to combine mathematics learning from multiple strands. This will enable students
to make connections between different areas of mathematics.

Non-calculator items
In each of the 8 items in the Non-calculator paper, Queensland students performed below the
national cohort of Australian students. This suggests that working through mathematical problems
without using calculators is an area where Queensland students could improve.
There are times when the calculations required in a problem are too complex, or the values involved have too many decimal places, for mental or written methods to be practical; in these instances, a calculator may be essential. However, there are many situations where a calculator is not needed, for example in straightforward number problems and patterns, in some problems involving fractions and percentages, and in money problems where change or discounts have to be calculated.
Where appropriate, teachers should encourage their students to perform the calculations required
in solving mathematics problems without using calculators, at least initially. Students could then
be encouraged to use calculators as a means of checking their own manually obtained answers.
This practice would also be beneficial for students as a life-skill, as a means of estimating
answers or quantities and for those times when a calculator is not available. Another benefit from
manually calculating answers would be to help students decide on the reasonableness of
answers that are obtained when they do use a calculator.
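A simple illustration of this practice, with figures chosen for convenience rather than taken from the test: to find the sale price of an item priced at $80 with a 15% discount, a student can reason mentally that 10% of $80 is $8 and 5% is $4, so the discount is $12 and the sale price is $68. A calculator is then used only to confirm the result:

\[
80 - (0.15 \times 80) = 80 - 12 = 68
\]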

Tables, graphs, diagrams


In the 2018 Year 9 Numeracy test, more than half of the items (56%) required the interpretation of tables, graphs, diagrams or other images. Students’ visual literacy influences their ability to make sense of mathematical information presented in different representations.
It is important that students be given frequent opportunities to interpret and construct a variety of
graphical displays including pie graphs, column graphs and line graphs.
Students also need to be able to interpret various data displays, such as stem-and-leaf plots, data
tables and Venn diagrams. It is also recommended that students become familiar with using tree
diagrams and interpreting timetables.
Many of the diagrams in NAPLAN tests are geometric in nature. They often involve nets,
perspectives, lengths, perimeters, areas, volumes, grids, scales, angles, locations and directions.
Students need exposure to learning experiences that involve a wide variety of geometric
representations.
The ability to recognise and interpret the different ways that data and information can be
represented is an important element of numeracy. Teachers can lead class discussions about the
visual representations of mathematical concepts and provide opportunities to engage with a wide
variety of mathematical diagrams. This will help students develop the skills and experience
required to interpret them. Even questions without visual stimulus may be conceptualised by
sketches, diagrams or models.

Word problems
In the 2018 Year 9 Numeracy test, the Non-calculator items were mostly word problems, and word problems made up 44% of the test overall; this is a significant proportion of the test. Therefore, it is important that students have frequent exposure to mathematics problems presented in sentence form, interspersed with numbers. Students need to practise reading written text in numeracy contexts, interpreting the information and deciding what they are required to do.

Many students find word problems challenging as they require reading, interpreting and deciding
what procedures to use. Students need exposure to a wide range of word problems that involve
different combinations of mathematical operations. They need to become familiar with the
language associated with operations and the terminology that is used when dealing with word
problems. Money-related problems, for example, lend themselves to written descriptions.
Consequently, students need to have a good grasp of the language of money, e.g. terms such as
discount, profit, loss, sale price, percentage increase and decrease.
When dealing with a word problem, students should be encouraged to read the whole question more than once: the first time to get a general idea of what it is about, with subsequent readings to identify the important information and what the question is asking them to do. Circling, underlining or highlighting key information may help students to identify important details. Sorting written information into a more useful form, such as a sketch, diagram, table or list, can be a valuable habit for students to adopt. The ability to translate word problems into arithmetical or algebraic expressions or equations is also an important element in solving word problems.
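For example, using an invented problem rather than a test item: the statement ‘an adult ticket costs $12 more than a child ticket, and two adult tickets plus three child tickets cost $99’ can be translated into an equation in the child price c:

\[
2(c + 12) + 3c = 99 \quad \Rightarrow \quad 5c + 24 = 99 \quad \Rightarrow \quad c = 15
\]

The adult price is then c + 12 = $27, and the answer can be checked by substitution: 2 × 27 + 3 × 15 = 99.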

Problem solving
Students need frequent exposure to challenging problems to develop the confidence and skills to
find the solutions. Teachers should consider teaching and learning through presenting students
with problems that range from simple to complex, familiar to unfamiliar and single-step to
multistep. Problem solving should be incorporated into learning activities, not treated as a
separate or add-on activity. It should be included in a range of learning experiences, both in real-
life and purely mathematical contexts. Where possible, teachers are encouraged to pose maths problems for students that are derived from other curriculum areas.
It is possible to equip students with approaches to apply when they encounter a numeracy problem; this will give them the confidence to attempt and develop solutions. A problem-solving approach might enable students to:
• Identify the problem (What am I being asked to do?)
• Analyse the problem (What do I have to work with?)
• Take appropriate action (Select and apply procedures to solve the problem)
• Reflect on the answer (Check my answer — Does it work? Is my answer reasonable? Are there other correct answers?).
While all steps are important, ‘Does my solution work?’ and ‘Is my answer reasonable?’ are
particularly beneficial in test situations, especially when students are asked to construct a
response.
Students should be habitually asked to consider the reasonableness of their answers. See for
example Calculator-allowed Items 8, 26, 33 and 38 where the constructed responses could
readily be checked for reasonableness. Knowing a problem-solving strategy will help when
students are required to provide constructed responses. The items with the highest omission
rates in the 2018 Numeracy test were those involving constructed responses.

Test-wiseness
Teachers should instruct students to answer all multiple-choice items. Even if this involves
guessing some answers before the test time runs out, it is better practice to answer these items
than not answering at all. An educated guess may sometimes be correct.
As well as the multiple-choice questions, students should expect constructed-response items to
be included in the test. They need to have the confidence to commit to providing their own
numerical answers. Students can develop their mathematical confidence by regularly engaging
with a wide variety of tasks that require them to produce numerical responses, either
independently or in groups.
The low facility rates and high omission rates for word problems can be improved. The literacy features that are typical of numeracy contexts should not be unfamiliar or daunting to students. Literacy teaching should not be confined to the English class.

Difficult items
Teachers are encouraged to closely inspect the items that Queensland students found most difficult: generally, those where the Queensland facility rates were below 30%.
Teachers should consider whether it was the content covered and/or the style of these questions
that made them difficult and what the implications are for their students.

QCAA resources
Full analysis of student performance and error patterns for each item is published in the SunLANDA program: https://www.qcaa.qld.edu.au/p-10/naplan/test-reporting-analysis/sunlanda/accessing-navigating-sunlanda and as PDF documents: https://www.qcaa.qld.edu.au/p-10/naplan/test-reporting-analysis/test-item-analysis.
A useful reference for the teaching of spatial reasoning and geometric properties is given here: QSA 2005, Mathematics: About space, https://www.qcaa.qld.edu.au/downloads/p_10/kla_maths_info_space.pdf
