Measuring student progress judiciously
Last week students across the School participated in our annual NAPLAN testing. I have previously used this column to discuss the merits and drawbacks of NAPLAN and to explain how we utilise NAPLAN data as one of many indicators of student and cohort progress.
Using data-based evidence to inform changes in teaching is an increasingly common practice across the education sector. I believe this is an overwhelmingly positive development: analysing data allows us to examine our approaches more objectively and to appraise our strategies and their efficacy honestly, rather than relying on pre-determined assumptions.
When used well, student outcome data can help us home in on an area requiring further development, identify a gap in a teaching program, or prompt deeper diagnostic analysis when a result surprises us.
A challenge with a data-driven approach, however, is that it is only as good as the data we have access to. Mark Twain recognised how data can be misused to fit a particular narrative when he popularised the famous words he attributed to Benjamin Disraeli: “There are three kinds of lies: lies, damned lies, and statistics.”
In my view, a serious problem across global education is that we frequently rely on data sets drawn from tests that are poorly designed to inform what we are examining. For instance, studies analysing the success of various teaching pedagogies often rely on the sorts of standardised tests that prioritise skill sets those pedagogies do not emphasise. How instructive is it to measure a student who has been taught problem solving, collaboration and creativity through closed-option, multiple-choice tests?
This is especially problematic because we tend to venerate information that can be traced back to a data source, particularly in societies where standardised testing carries significant political power. In certain states in the USA, school funding is closely linked to high-stakes tests, so schools are inclined to avoid any program, however meritorious for a child’s holistic development, that is not directly linked to achieving higher marks on the test.
Thankfully, as education systems embrace teaching broader skill sets, test design is being adjusted to assess more than numeracy, literacy, comprehension and recall. An example is found in adaptations to the Programme for International Student Assessment (PISA). In 2022, for the first time, PISA included a creative thinking test alongside its mathematics test, and it has advised participating countries that the 2025 cycle will incorporate Learning in the Digital World, which “aims to measure students’ ability to engage in self-regulated learning while using digital tools.”
At King David we use data judiciously to help guide our decision making. We critically analyse data sources to ensure the veracity of the information we gather. By drawing on a range of assessment types and teacher observations, we gain a snapshot of each student’s and each cohort’s performance. In this way, our collation and analysis of student data enables us to adjust our learning design to best meet the needs of our students.
Shabbat Shalom,
Marc Light
Principal