Chris Bonnor, Bernie Shepherd. School equity since Gonski: how bad became worse.

Sep 18, 2014

This is a shorter version, prepared for Pearls and Irritations, of a paper reported in the Sun Herald on September 14 (http://www.smh.com.au/national/education/schools-worse-now-than-when-gonski-wrote-report-20140913-10gepz.html). A longer version, including graphics, is available at https://drive.google.com/file/d/0BxK25rJrOw-eVU4zM2p2UTF5ZkE/edit?usp=sharing

 

The story of the Review of Funding for Schooling, otherwise known as the Gonski review, is well known. The Review began in 2010 and its report, with its significant findings and recommendations, was handed to the Gillard Government at the end of 2011.

Over the last three years we have seen the report and its promised pathway to greater equity and achievement in Australia’s schools fall victim to a combination of timidity, inaction, distortion, self-interest and partisan politics. The funding loading for low-SES students is currently being “reviewed” in what seems to be yet another step in watering down the equity intent of Gonski’s recommendations.

Something else happened in 2010: the My School website was launched – and the data underpinning the site tells a compelling story, not so much about individual schools, but collectively about our framework of schools – what it delivers and more importantly, what it doesn’t.

In effect, the most substantial review of schooling ever conducted in Australia was accompanied by this gold mine of information which, over time, would show whether the reviewers had got it right and identified real problems and solutions. Have the problems revealed by Gonski diminished or gone away in just three years?

They didn’t go away. Many just got worse.

What Gonski found

Two simple statements sum up the purpose and challenge of the Gonski Review. They are expressed in the first and second findings of the review. In effect, we need to lift performance and facilitate this by directing resources to where they are needed.

The Gonski panel reached this conclusion after considering data from the Programme for International Student Assessment (PISA), the Trends in International Mathematics and Science Study (TIMSS) and, to some extent, NAPLAN. But in 2010 NAPLAN data had only been around for a couple of years; it could not yet show longer-term trends in results.

It does now, and what it shows is very revealing.

  1. Student performance

My School reports student performance in the four NAPLAN aspects: reading, writing, language conventions and numeracy. Student test scores in these domains have to be interpreted cautiously. The nature of the NAPLAN writing test, for example, has changed over this period. Even at a national level, changes in test scores from year to year might have many explanations.

Details of changes in test scores, along with some of our cautions, are included in the longer version of this paper. Our conclusion, on examining NAPLAN results, is that performance overall has been more inclined to stagnate or fall than to improve, with trends in Year 9 of particular concern. What is certain is that there is no substantial evidence of any lift in performance: the Gonski Review panel had every right to be concerned.

But Gonski was concerned about equity as well as overall student achievement, hence the need to dig deeper to see if trends in student achievement have varied according to the SES of the school (represented on the My School website by school ICSEA values).

To find out more we investigated changes in reading and numeracy scores between 2009 and 2013 for schools:

  1. in a high ICSEA range, between 1150 and 1250
  2. in a middle ICSEA range, between 950 and 1050
  3. in a low ICSEA range, up to 750
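The comparison itself is straightforward. As a minimal sketch, assuming a hypothetical extract of My School data (the file name and column names below are illustrative assumptions, not the actual My School download format), the band averages could be computed like this:

```python
import pandas as pd

# Hypothetical extract of My School data: one row per school per year,
# with the school's ICSEA value and its mean Year 9 NAPLAN scores.
# File and column names are illustrative assumptions.
schools = pd.read_csv("myschool_naplan.csv")

def band_mean(df, year, lo, hi, score_col):
    """Mean score across schools whose ICSEA lies in [lo, hi] for one year."""
    band = df[(df["year"] == year) & df["icsea"].between(lo, hi)]
    return band[score_col].mean()

# Change in mean Year 9 reading score for each ICSEA band, 2009 to 2013
for label, lo, hi in [("high", 1150, 1250), ("middle", 950, 1050), ("low", 0, 750)]:
    before = band_mean(schools, 2009, lo, hi, "year9_reading")
    after = band_mean(schools, 2013, lo, hi, "year9_reading")
    print(f"{label} ICSEA band: {before:.0f} -> {after:.0f}")
```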

A couple of examples illustrate the most noticeable trends.

Year 9 reading scores in the high ICSEA schools increased from 625 to 641. They remained largely unchanged for the middle ICSEA schools, but fell from 453 to 446 in the lower ICSEA schools.

Year 9 numeracy scores in the high ICSEA schools increased noticeably from 649 to 688. They drifted down in the middle ICSEA schools, but again fell (from 484 to 445) in the lower ICSEA schools.

Some results fluctuate from year to year, and the issue of statistical significance is pertinent. But some trends are reasonably consistent: test scores for high ICSEA schools have trended upwards, while those for middle and low ICSEA schools have remained static or trended downwards. It seems the Gonski Review had good reason to be concerned about Year 9.

At the very least, the equity implications of the trends indicated by changing NAPLAN scores need further and urgent investigation, particularly in the light of current and controversial moves to change the basis on which needs-based funding is allocated[i].

  2. Worsening equity

A widening achievement gap, between those already advantaged and those not, strongly suggests a serious and worsening equity problem. Is there any other data available in My School to support this assumption?

Gonski explored the influence of student background on educational outcomes, as seen in what is known as “social gradient” measures. These can be derived by measuring the slope of a graph of educational outcomes against some social or socio-economic indicator. The Gonski final report included a graph[ii] showing how Australia has a steeper social gradient compared with many other countries. A steeper slope indicates a greater impact of social factors – as distinct from school factors – on student achievement.

The Gonski panel concluded that achieving greater equity and improvements in student outcomes required effort to reduce the influence of student background on achievement, in effect to reduce the social gradient. The My School data provides the opportunity to examine a kind of social gradient within NAPLAN performance if we plot schools’ average NAPLAN scores against the schools’ ICSEA values. Since ICSEA is a socio-educational advantage measure, we might call it a socio-educational gradient (SEG). Typical values for the slope of these NAPLAN/ICSEA plots are around 0.35, or 35%.
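For readers who want to reproduce the calculation, here is a minimal sketch: the SEG is simply the slope of a least-squares line fitted to school mean NAPLAN scores against school ICSEA values (the data points below are invented purely for illustration).

```python
import numpy as np

# Socio-educational gradient (SEG): the slope of a least-squares fit of
# school mean NAPLAN scores against school ICSEA values.
# These data points are invented purely for illustration.
icsea = np.array([850, 920, 1000, 1060, 1130, 1210])
naplan = np.array([452, 478, 500, 521, 548, 578])

slope, intercept = np.polyfit(icsea, naplan, 1)  # degree-1 fit returns slope first
print(f"SEG = {slope:.2f}, i.e. about {slope * 100:.0f}%")
```

Calculated this way, a steeper slope means a school’s socio-educational profile predicts more of its NAPLAN result.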

By calculating SEGs for various groups of schools we are able to compare the equity of schooling in different places and for different levels of schooling.

Can socio-educational gradients indicate changes in socio-educational impact over a period of time? Here are the gradients for various groups of schools in Australia for 2010, 2011, 2012 and 2013.

Group of schools          2010   2011   2012   2013
All schools                32%    33%    35%    37%
Primary schools            28%    30%    33%    35%
Combined schools           36%    35%    37%    38%
Secondary schools          40%    43%    44%    43%
Metropolitan schools       31%    34%    36%    38%
Provincial schools         27%    27%    31%    33%
Remote schools             29%    30%    33%    35%
Very Remote schools        38%    35%    39%    37%

The data for all schools shows that Australia’s socio-educational gradient has steepened from 32% in 2010 to 37% in 2013 – socio-educational advantage has had an increasing impact on student achievement in just four years.

As the data indicates, gradients are very much steeper (40%-44%) among secondary schools than among either combined or primary schools. In addition, gradients are higher among metropolitan schools generally – and the change over time is greater – than among non-metropolitan schools.

These are not changes measured across decades; they are measured across the very same years in which the Gonski review proceeded, reported, was variously ignored, cherry-picked, partially implemented and then, in relative terms, largely abandoned – with significant state exceptions. The problems highlighted by Gonski didn’t go away. We haven’t lifted performance, the gap between our advantaged and disadvantaged students has widened, and we have increased the impact of differences in wealth, income, power or possessions on the opportunities open to our students.

  3. Where the money goes

During the years up to and including 2013 the funding of schools continued under a funding system described by the review as lacking logic, consistency and transparency. Has the distribution of resources changed since Gonski reported and, if so, in what ways? One way to find out is to analyse the funding received by students in three distinct ICSEA groups of schools:

  1. Schools at ICSEA 900 (+/- 10)
  2. Schools at ICSEA 1000 (+/- 10)
  3. Schools at ICSEA 1162 (+/- 10)

Students in schools around the 900 ICSEA level are certainly disadvantaged. Their NAPLAN scores are generally quite low at an average of 460 on an aggregated measure. Again on average, $13 870 is spent on each student. Almost all this money, regardless of school sector, comes from governments.

In schools around ICSEA 1000 the aggregated NAPLAN scores are considerably higher at 495. On average $11 265 is spent on each student – less than that available to the more disadvantaged students. Most of this funding, between 84% and 100%, is also provided by governments.

Considering just these two examples, we can say that more resources are being directed to where the need is greater. The resources may or may not be sufficient to substantially improve student outcomes, but for the moment that’s another story.

The third group, around ICSEA 1162, is chosen for an interesting reason: this is the point at which students, who at these levels are relatively advantaged, are funded at higher levels ($14 263 per student) than the disadvantaged students in the ICSEA 900 schools. Between 65% (Independent schools) and 82% (Catholic schools) of their funding still comes from government sources. Students in non-government schools above ICSEA 1162 are funded (in total) at even higher levels, with the funding increasingly coming from school fees in addition to funding from governments. At ICSEA 1200, governments still fund Catholic schools at 83% and Independent schools at 47%.
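These funding profiles can be assembled from My School finance data, which reports per-student recurrent income by source. A minimal sketch follows; the file and column names are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical My School finance extract: one row per school, with ICSEA
# and per-student recurrent income split into government and private
# (fees and other) sources. Column names are illustrative assumptions.
finance = pd.read_csv("myschool_finance.csv")

def funding_profile(df, centre, width=10):
    """Mean total funding per student, and the government share,
    for schools with ICSEA in [centre - width, centre + width]."""
    band = df[df["icsea"].between(centre - width, centre + width)]
    total = band["govt_per_student"] + band["private_per_student"]
    govt_share = band["govt_per_student"].sum() / total.sum()
    return total.mean(), govt_share

for centre in (900, 1000, 1162):
    total, share = funding_profile(finance, centre)
    print(f"ICSEA {centre}: ${total:,.0f} per student, {share:.0%} from governments")
```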

Gonski accepted the arguments for some public funding for students regardless of their levels of advantage. However, if all sources of funding are considered – as was Gonski’s brief – we certainly don’t always direct our public and private resources towards the greatest need.

The problem doesn’t strictly lie with the distribution of government funding (combined state and federal), which in general does favour our most disadvantaged students in all sectors. The problem lies in the amounts and distribution of the total mix of funding in the case of higher ICSEA schools. This is a uniquely Australian problem, which sees non-government schools funded, in often quite complex and poorly co-ordinated ways, from a variety of sources. The result is that the country as a whole, including governments, often invests more in the most advantaged students than in the needy.

It is hardly surprising that this situation opened a long-overdue debate, skilfully handled by the Gonski review, about how to create greater equity across all schools, while taking all sources of funding into account.

Conclusion 

The Gonski review found that Australia needed to lift the performance of students at all levels of achievement, particularly the lowest performers. That wasn’t happening up to 2010 when the panel began its deliberations. It wasn’t happening when it handed down its report and it seems it is still not happening.

The review found that Australia needed a funding model that adequately reflects the different needs of students to enable resources to be directed to where they are needed most. It still does; resources are distributed less equitably than in 2010 – redirecting resources to where they are needed most is an even more urgent priority.

What Gonski found to be bad, we find to be worse. Can the scaled-back implementation of Gonski – now just beginning and stretching a much lower investment over just four years – really make the much needed difference?

Chris Bonnor and Bernie Shepherd are retired secondary school principals. Bernie was formerly Principal of St Mary’s High School. Chris was President of the NSW Secondary Principals’ Council.

[i] http://www.theaustralian.com.au/national-affairs/education/review-of-extra-funding-for-lowses-students-rigged/story-fn59nlz9-1227051910709#

[ii] Review of Funding for Schooling: Final Report, December 2011, p. 7.
