The Conversation
Helen Georgiou, Senior Lecturer in Science Education, University of Wollongong

Are Australian students really falling behind? It depends which test you look at

Ask anyone how Australian students are doing in school and they will likely tell you our results are abysmal and, more importantly, getting progressively worse.

This narrative has been reinforced by sustained reporting within academia and the media. It has only grown with the release of the 2022 Programme for International Student Assessment (PISA) results on Tuesday evening.

But is this accurate and fair?

This year we each independently published papers looking at Australian students’ results. Both papers reached the same conclusion: students’ scores on the vast majority of standardised assessments are not in decline.


Read more: Australian teenagers record steady results in international tests, but about half are not meeting proficiency standards


What tests do Australian students do?

Australian students sit multiple standardised tests. These are tests that are set and scored in a consistent manner. Importantly, scores from one assessment round are statistically “matched” with those from previous rounds, meaning comparisons of average scores over time are possible.

Australian students do NAPLAN in Year 3, Year 5, Year 7 and Year 9. This is a national test that looks at literacy and numeracy skills.

Australian students also sit several international tests. PISA aims to measure 15-year-old students’ application of knowledge in maths, science and reading.

They also sit the Progress in International Reading Literacy Study (PIRLS), which looks at Year 4 students’ reading comprehension skills, and the Trends in International Mathematics and Science Study (TIMSS), which assesses curriculum-based maths and science knowledge in Year 4 and Year 8.

The National Assessment Program – Science Literacy (NAP-SL) measures students’ science literacy in Year 6 and Year 10. NSW students also complete Validation of Assessment for Learning and Individual Development (VALID) assessments in science, based on the NSW syllabus, in Year 6, Year 8 and Year 10.


Read more: Australia’s Year 4 students have not lost ground on reading, despite pandemic disruptions


Sally’s research

Sally’s research documented average scores in the four major standardised assessments in which Australia’s students have participated since 1995.

All but one assessment program (PISA) showed improvements or minimal change in average achievement.

In particular, primary school students’ scores in some of the standardised literacy and numeracy tests, including NAPLAN, PIRLS and TIMSS, have notably improved since the start of testing in each program.

For example, for PIRLS, which tests Year 4 reading skills, the average score for Australian students increased from 527 in 2011 to 544 in 2016, and was 540 in 2021 (the difference between 2016 and 2021 is negligible).

Since NAPLAN testing began in 2008, average Year 3 reading achievement has increased by the equivalent of a full year’s progress.

In high school, students’ NAPLAN and TIMSS results have stayed largely the same over the same time span.

Helen’s research

Helen’s research explores the assumption that there is a real and significant decline in Australian students’ achievement in science. It looks at assessments of students’ science literacy, including PISA, TIMSS, NAP-SL and VALID.

NAP-SL has no historical data, but among the other three assessments, only PISA shows a decline.

For both TIMSS and VALID, average scores remain stable, and TIMSS even shows improvements over the period in which PISA scores appreciably decline. Analysis of PISA scores for NSW public school students also reveals no decline.

What does this mean?

So when we talk about a “decline” in Australian results, we are really just talking about a decline in PISA results. While these do indeed show a decline, there are other important factors to consider.

First, PISA is one of many assessments taken by Australian students, each providing important but different information about achievement. As 2023 research also shows, PISA receives a lot more attention than other international tests. While there is no definitive reason for this, researchers suggest

the OECD purposefully set out to [give it more attention], branding and marketing the study in such a way to maximise media, public and policy attention.

A 2020 paper also noted the “growing body” of criticism around PISA.

This includes doubts over whether PISA actually measures the quality of education systems and learning, or if it measures something distinct from existing tests.

Comparing scores and ranks is also highly problematic because countries’ scores are not exact. For example, in 2018, Australia’s reading literacy score (503) was considered “not statistically different” from ten other countries, meaning its rank (16th) could potentially be as high as 11 or as low as 21.


Read more: Problems with PISA: Why Canadians should be skeptical of the global test


Why we should be cautious

Australia needs to be cautious about an over-reliance on PISA results.

For example, last month a widely publicised report from educational consultancy Learning First called for an overhaul of Australia’s science curriculum. In part, it based its argument on “deeply disturbing trends” of “sliding performance”, pointing to declining PISA results.

So we need to be careful about what these results are used for and how they may be used to justify big changes to policy.

Perhaps most importantly, the decline narrative diminishes and minimises the difficult and amazing work teachers do. While improvement should always be on the agenda, we should also celebrate our wins whenever we can.


Read more: The Australian Curriculum is copping fresh criticism – what is it supposed to do?



Helen Georgiou currently receives funding from the NSW Department of Education and the Australian Government (Department of Industry, Science, Energy and Resources).

Sally Larsen does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

This article was originally published on The Conversation. Read the original article.
