OECD Says 3 in 4 Australian Students Do Not Try on PISA Tests

Dec 22, 2019

The hand-wringing over the continuing decline in Australia’s PISA results misses the issue of whether students try their best on the tests. The OECD’s report on PISA 2018 shows that about three in four Australian students and two-thirds of students in OECD countries did not try their hardest on the tests. There are also wide differences between countries. These findings have potentially explosive implications for the validity of international comparisons of student achievement based on PISA.

The report shows that 68% of students across the OECD did not fully try. In Australia, 73% of students did not make full effort. This was the 14th highest proportion out of 36 OECD countries. The report also shows large variation in student effort across countries. Around 80% of students in Germany, Denmark and Canada did not fully try compared to 60% in Japan and 46% in Korea.

The report adds to extensive research evidence on student effort in standardised tests. Many overseas studies over the past 20 years have found that students make less effort in tests that have no or few consequences for them. For example, a study published last year by the US National Bureau of Economic Research (NBER) based on data from PISA 2015 found that a relatively high proportion of students in Australia and other countries did not take the tests seriously.

Less effort in tests leads to lower results. As the OECD report on PISA 2018 states: “differences in countries’ and economies’ mean scores in PISA may reflect differences not only in what students know and can do, but also in their motivation to do their best” [p.198].

There is no direct evidence of declining student effort in PISA as a factor behind the large decline in Australia’s PISA results since 2000. However, there is evidence of increasing student dissatisfaction at school which might show up in reduced effort and lower results.

Successive PISA reports show that student dissatisfaction at school amongst 15-year-olds in Australia has increased significantly since 2003. The proportion of students who feel disconnected from school increased by 24 percentage points, from 8% to 32%, between PISA 2003 and PISA 2018. This was the 3rd largest increase in the OECD, behind France and the Slovak Republic. In PISA 2018, Australia had the equal 4th largest proportion of students in the OECD who feel disconnected from school.

The large increase in student dissatisfaction at school in Australia may have led to lower motivation and effort in PISA over time. The OECD says that the relationship between a feeling of belonging at school and performance in PISA is strong for those students with the least sense of belonging [OECD 2016, p. 122]. Students who feel they do not belong at school have significantly lower levels of achievement in PISA than those who do feel they belong.

An issue with the OECD data on student effort in PISA is that it is based on student self-reporting. There are well-known problems with self-reporting, such as how truthful students are about their effort and the extent to which answers given on subjective response scales can be compared across students and across countries.

The OECD report also draws on other methods of measuring student effort based on student behaviour in the computer-based tests, including measures of “response-time effort”, “test endurance” and the proportion of test items not reached. For example, the response-time effort measure suggests that students made more effort on PISA than their self-reports indicated.

However, this measure could under-estimate the extent to which students do not fully try because it makes the arbitrary assumption that students who spend more than five seconds on a test item are making a genuine effort. This is not necessarily the case as students who spend more time on an item could be just killing time or “switching off” rather than trying to answer.
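To make this concrete, here is a minimal sketch of how a threshold-based response-time effort measure of this kind works. The five-second cut-off is the one described above; the function name, data layout and example figures are illustrative assumptions, not the OECD’s actual implementation.

```python
# Minimal sketch of a threshold-based response-time effort (RTE) measure.
# The 5-second cut-off reflects the rule discussed above; the function
# name, data layout and example data are hypothetical.

THRESHOLD_SECONDS = 5.0  # responses faster than this count as rapid guesses


def response_time_effort(item_times):
    """Return the share of items (0 to 1) on which a student spent
    at least THRESHOLD_SECONDS. item_times is a list of seconds per item."""
    if not item_times:
        return 0.0
    engaged = sum(1 for t in item_times if t >= THRESHOLD_SECONDS)
    return engaged / len(item_times)


# A student who rushes 3 of 10 items scores 0.7 on this measure, even if
# some "slow" responses were disengaged time-filling -- the weakness noted above.
times = [2, 3, 4, 12, 30, 45, 8, 60, 15, 20]
print(response_time_effort(times))  # 0.7
```

As the example shows, the measure can only flag fast responses as disengaged; slow but disengaged responses pass the threshold unnoticed.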

The OECD report is inconclusive about the extent of the effect of student effort on test results and country rankings. However, it does acknowledge that differences in student effort across countries will affect country results and rankings and this is supported by other recent research evidence.

The NBER study noted above also used response time as its measure of student effort and found much lower proportions of students not fully trying in PISA 2015 than the self-reported figures for PISA 2018. It found that the proportion of non-serious students varies enormously by country, from 14% in Korea to 67% in Brazil. It rated 23% of Australian students as non-serious, compared to 14-18% in high achieving countries such as Finland (15%), Japan (18%), Korea (14%) and Singapore (17%).

Even these lower proportions of students not fully trying in PISA 2015 had a large impact on the rankings of several countries. For example, the study estimated that Portugal’s ranking in science in PISA 2015 would have improved by 15 places, from 31st to 16th, if students had fully tried. Sweden’s ranking would have improved 11 places, from 33rd to 22nd, and Australia’s by four places, from 16th to 12th.
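The broad logic of such an effort-adjusted re-ranking can be sketched in a few lines. The NBER paper’s actual method is considerably more sophisticated (PISA scoring uses survey weights and plausible values); this toy version simply drops students flagged as non-serious, recomputes country means and re-ranks, with all data invented for illustration.

```python
# Toy sketch of effort-adjusted country rankings: exclude students
# flagged as non-serious, recompute country means, and re-rank.
# All names and figures are invented; real PISA scaling is far more complex.

def adjusted_rankings(students):
    """students: list of (country, score, serious) tuples.
    Returns country names ranked by mean score over serious students only."""
    totals = {}
    for country, score, serious in students:
        if serious:
            total, count = totals.get(country, (0.0, 0))
            totals[country] = (total + score, count + 1)
    means = {c: total / count for c, (total, count) in totals.items()}
    return sorted(means, key=means.get, reverse=True)


# Country B's raw mean (457) trails Country A's (500) because of one rapid
# guesser; once that student is excluded, B (535) overtakes A.
data = [
    ("A", 520, True), ("A", 480, True), ("A", 500, True),
    ("B", 540, True), ("B", 530, True), ("B", 300, False),
]
print(adjusted_rankings(data))  # ['B', 'A']
```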

There are massive implications for the reliability of PISA if the proportions of students not fully trying are higher than the NBER report indicates and closer to those self-reported to PISA. Actual scores are likely to be significantly under-estimated, declines in scores over-estimated and country rankings severely distorted. This calls into question the validity of PISA as an accurate representation of student achievement within and across countries.

The possibility that student effort on PISA has declined helps explain the contradiction between Australia’s PISA and Year 12 results. Some 75-80% of Australian students participating in PISA are in Year 10. It is perplexing that the PISA results for these students have declined since 2000 while results for students two years later, in Year 12, have improved significantly.

The percentage of the estimated Year 12 population that completed Year 12 increased from 68% in 2001 to 79% in 2017 [Report on Government Services 2007 & 2018]. Also, a larger proportion of disadvantaged students now complete Year 12. For example, the percentage of low socio-economic status students who completed Year 12 increased from 64% in 2001 to 76% in 2017.

OECD data also shows that Australia had one of the larger increases in the OECD in the proportion of 25-34 year-olds who attained an upper secondary education. It increased by 18 percentage points from 71% in 2001 to 89% in 2018 compared to the OECD average increase of 11 percentage points [Education at a Glance 2002 & 2019].

These are indicators of an improving education system, not a deteriorating one. Part of the explanation for the differing results in PISA and Year 12 is that PISA has no consequences for students or their schools – students don’t even get their individual results. In contrast, even disconnected students have a greater incentive to try hard in Year 12 because it has a major influence on future careers and lives.

Thus, there is credible evidence that at least a significant proportion, if not a large one, of Australian students did not fully try on PISA 2018. This is likely to be one factor, among others, contributing to Australia’s relatively poor performance compared to other high achieving countries. It is also possible that increasing student dissatisfaction since 2003 has led to a growing proportion of students not fully trying, thereby contributing to the decline in Australia’s results. This suggests that the doom and gloom about Australia’s latest PISA results is misplaced and that any policy responses should be based on a more comprehensive analysis of the factors behind these results.

Trevor Cobbold is National Convenor of Save Our Schools.
