Louise Courtney

Senior Research Fellow - Australian Council for Educational Research

Biography

Louise Courtney, Senior Research Fellow at the Australian Council for Educational Research, has worked for ten years in the area of Language and Literacy within the Department of Assessment and Psychometric Research. Recent presentations include EALTA in Dublin (May 2019) and the Inclusion, Mobility and Multilingual Education Conference in Bangkok (September 2019).

Co-Author: Nathanael Reinertsen

Not a selfie; a big picture. An evidence-based snapshot of literacy levels across Southeast Asia.

Governments and aid organisations join forces to measure student reading and writing levels across Southeast Asia. The Southeast Asian Primary Learning Metric (SEA-PLM) is the first assessment to measure literacy in a variety of languages used in the region. The most exciting result is the possibility of comparing student writing in different languages on the same scale.

Not a selfie; a big picture.
Assessing literacy across multilingual contexts in Southeast Asia
Louise Courtney & Nathanael Reinertsen, Australian Council for Educational Research (ACER)
After field trials between 2014 and 2018, the Southeast Asian Primary Learning Metric (SEA-PLM) assessment program completed its main survey administration in 2019, in eight languages across six countries in Southeast Asia. SEA-PLM is the first regional assessment of its kind, designed to reflect Southeast Asian students’ contexts.
The assessment measured Year 5 students’ ability in reading, writing, mathematics and global citizenship in each participating country. This presentation describes the scale of the assessment, how its results can inform teaching and learning across the region, and the development process behind this unique assessment, which has the potential to unite countries in their quest for improved literacy.
Reading tasks ranging from very simple to more complex were presented, each with an authentic literacy focus that reflects student needs and is tailored to suit local contexts.
The Writing instrument, in particular, is entirely novel: no cross-language Writing instrument has previously been administered across the Southeast Asian region, and very few international assessments have attempted to directly compare Writing Literacy across multiple languages.
This paper hypothesises that some features of Writing are comparable across languages, while others are language-specific. If supported by data from the 2019 main study, this hypothesis provides the foundation for an argument that cross-language comparisons of Writing Literacy are possible using generic criteria for scaling.
This would mean that the SEA-PLM Writing instrument can solve a key challenge in assessing Writing Literacy in a multilingual context: how to place students’ writing in different languages on the same scale.