‘In-school variation’: Governors and Data

30th September 2012

Tomorrow I will be doing something I think I should do more often: talking to school governors about data and helping them understand ‘in-school variation’.
My research on school governance showed that being able to look at data critically is one of the things that sets good governors apart, and one of the things they most struggle to do. It’s not surprising. Plenty of Heads and teachers find getting to grips with RAISEonline tough, so how hawk-eyed can we really expect overstretched volunteers to be?
For the school I’ll be visiting, the motivating factor behind asking me to do this is the increasing focus on in-school variation brought about by changes to the DfE Performance Tables and the new Ofsted framework. For me, there is a different reason.
The school I’ll be visiting has made huge and rapid improvements, going from 17% 5A*-C including English and Maths in 2006 to almost 80% this year. For a school like this, maintaining these improvements is a real challenge. They need to identify and address the needs of their "tail" – those pupils who are still lagging behind. Understanding who is in the tail is therefore crucial. At this school, every group (apart from one) is achieving above the national average, and because each cohort is only around 100 pupils, the small size of sub-groups makes it hard to spot trends; indeed, even a close reading of RAISEonline does not always reveal what is going on. What governors therefore need is:
– Information presented in clear, visual ways, revealing the issues without the need for specialist knowledge or careful analysis.
– A link between attainment and progress. There are always groups that come in at a lower level; if they then progress faster, the school is closing gaps and doing right by its pupils. If, however, pupils in low-performing groups are also progressing more slowly, the gap is widening. Governors need to know how a group’s attainment compares to its progress, and hence whether the school is closing the gap or not.
– A sense of whether they are looking at a handful of pupils in a single year or at a recurring trend over time.
I’ll address these issues tomorrow by:
1. Displaying groups’ attainment relative to the school average. This will shift governors’ attention to in-school variation and away from older, more deterministic CVA measures. I include a range of measures to show that there is consistency across them, rather than because governors necessarily need to consider each indicator. (A rough sketch of how a chart like this can be put together follows the list below.)
[Figure: group attainment relative to the school average, across a range of measures]
2. I then replicate the graph with progress data – any recurrence of a particular group at the lower end shows there is an issue to be addressed.
[Figure: group progress relative to the school average]
3. Finally, given that we are dealing with small group sizes, I want to show whether there are recurring trends over time. I’m all too aware of ‘comedy’ Ofsted moments when inspectors ask "Why did 25% of your white boys do/not do x?" when 25% of your white boys equals one boy who was in hospital at the time. I’ll therefore use what I call a "gap map", which traffic-lights a range of measures (including progress and attainment) over several years so that governors can see which issues are recurring. The beauty of this is that they don’t need to go into the detail of what each measure is – they can just look at where the red and orange squares are concentrated. (Again, there’s a sketch of the idea below.)
[Figure: example gap map]
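By way of illustration, a chart like the one in step 1 can be knocked together in a few lines of Python (other tools work just as well); the groups and figures below are invented for the sake of example, not the school’s actual data:

```python
# Illustrative sketch only: group names and percentages are invented.
# The idea is to plot each group as a gap (in percentage points) from the
# whole-school average, so that in-school variation stands out at a glance.
# Feeding in a progress measure instead gives the step 2 version of the chart.
import matplotlib.pyplot as plt

school_average = 78.0  # hypothetical whole-school % 5A*-C incl. English and Maths

group_results = {
    "Boys": 74.0,
    "Girls": 82.0,
    "Free school meals": 69.0,
    "SEN": 61.0,
    "EAL": 80.0,
}

# Express each group as percentage points above/below the school average.
gaps = {name: score - school_average for name, score in group_results.items()}
names = list(gaps)
values = [gaps[n] for n in names]
colours = ["tab:red" if v < 0 else "tab:green" for v in values]

fig, ax = plt.subplots(figsize=(7, 4))
ax.barh(names, values, color=colours)
ax.axvline(0, color="black", linewidth=1)  # the zero line is the school average
ax.set_xlabel("Percentage points above/below school average")
ax.set_title("Group attainment relative to school average (illustrative)")
plt.tight_layout()
plt.show()
```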
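The gap map in step 3 can be sketched in much the same way; again, the measures, years, values and red/amber/green thresholds below are made up purely to show the idea:

```python
# Illustrative sketch only: measures, years, values and thresholds are invented.
# A gap map traffic-lights a grid of measures over several years so that
# recurring issues show up as clusters of red/amber cells, without governors
# needing to read each underlying figure.
import matplotlib.pyplot as plt
from matplotlib.colors import BoundaryNorm, ListedColormap

measures = ["Attainment gap", "Progress gap", "Attendance gap", "Exclusions"]
years = ["2009", "2010", "2011", "2012"]

# One row per measure, one column per year (units depend on the measure).
gap_values = [
    [12, 10, 11, 9],   # Attainment gap
    [4, 3, 5, 6],      # Progress gap
    [2, 1, 1, 0],      # Attendance gap
    [3, 2, 0, 1],      # Exclusions
]

def rag(value, amber=3, red=8):
    """Code a gap as 0 (green), 1 (amber) or 2 (red) using simple thresholds."""
    if value >= red:
        return 2
    if value >= amber:
        return 1
    return 0

coded = [[rag(v) for v in row] for row in gap_values]

cmap = ListedColormap(["tab:green", "orange", "tab:red"])
norm = BoundaryNorm([-0.5, 0.5, 1.5, 2.5], cmap.N)

fig, ax = plt.subplots(figsize=(6, 3))
ax.imshow(coded, cmap=cmap, norm=norm, aspect="auto")
ax.set_xticks(range(len(years)))
ax.set_xticklabels(years)
ax.set_yticks(range(len(measures)))
ax.set_yticklabels(measures)
ax.set_title("Gap map (illustrative)")
plt.tight_layout()
plt.show()
```

In practice each measure would have its own sensible thresholds; the point is simply that governors only need to look at where the colours cluster.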
I’ve also run some analyses of tracking data to highlight any possible trends lower down the school.
Given the complicated legwork that has to be done first, it is hardly surprising that governors can find it hard to get to grips with the information they need. Armed with this type of information, however, it is much easier for them to ask critical questions and get into strategic discussions about what the school’s priorities need to be. Of course, some schools have fantastic data managers or Heads of Teaching and Learning who do this work for them; that is harder in smaller schools. There is also an obvious tension in relying on leadership teams to supply the very people who hold them to account with the information needed to do so. It’s an issue that comes up several times in my report. It was previously (at least partly) addressed by School Improvement Partners, but this is no longer the case. Yet, ultimately, the quality of governance cannot exceed the quality of the information on which governors base their questions.
By the by, something else came out of my analysis: 77% of the pupils at this school who did not get 5 A*-C including English and Maths missed out only because of English. No prizes for guessing whether they were entered for January or June…
You can read a summary of our 2011 report for Teach First on what makes a good governor below. The full report is available here.