Analysing and using data

Analysing data

Methodical analysis of assessment data provides the evidence a practitioner needs to improve teaching and learning for the group and for individuals within it. Accurate interpretation of the data analysis enables the practitioner to understand where learners are in their learning, to set goals and learning intentions for the next steps in the learning process, and to plan the learning program.

Analysing assessment data

No single assessment should be used in isolation to determine a learner's level of achievement. A range of assessments will provide more reliable evidence of learning from which the practitioner can make an on-balance judgement about learner achievement.

Including learners in the analysis of their assessment results encourages them to take ownership of their learning and motivates them to set their own learning intentions.

Sources of data

In the classroom

Most assessment data is collected by practitioners as part of their regular practice through questioning, observation, discussion, tests, and projects.

On the large scale

Large-scale assessments, such as the National Assessment Program – Literacy and Numeracy (NAPLAN), the Victorian Certificate of Education (VCE) and the Australian Early Development Index (AEDI), provide a broader context for practitioners and leaders to analyse what they know about their learners. Practitioners and leaders use large-scale assessment data to improve their practices and to review and enhance programs within their communities.

Knowing cohorts, knowing learners

Analysis of assessment data generally occurs at a cohort or individual level. Practitioners who understand their cohorts of learners and each individual learner develop effective learning programs for the whole group and individual learners.

Analysis of responses to each item, whether in a practitioner-created assessment task or in standardised assessments such as NAPLAN and VCE, is helpful for identifying misunderstandings or knowledge gaps experienced by individual learners and cohorts.

Data from standardised assessments allow practitioners to analyse and compare the performance of learners in their community with learners in similar communities. Identifying what is expected more broadly of learners in their cohort creates opportunities for improving the learning progress of their own group.

Looking for patterns

Ongoing assessment allows practitioners to look for patterns of misunderstandings before making an overall judgement. Patterns of misunderstandings and misconceptions across a class inform planning of the next step in the learning process for the class. Patterns of misunderstanding and misconceptions by individual learners inform the next steps in developing an individual learning plan.
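
As a minimal illustration of how item-level responses might be tallied to reveal such patterns, the short Python sketch below counts how many learners missed each item; the learner names, item identifiers and scores are invented for illustration.

```python
from collections import Counter

# Hypothetical item-level results: 1 = correct, 0 = incorrect.
# Learner names, item identifiers and scores are invented for illustration.
class_results = {
    "Learner A": {"fractions_q1": 1, "fractions_q2": 0, "decimals_q1": 1},
    "Learner B": {"fractions_q1": 0, "fractions_q2": 0, "decimals_q1": 1},
    "Learner C": {"fractions_q1": 1, "fractions_q2": 0, "decimals_q1": 1},
}

# Count how many learners answered each item incorrectly.
incorrect_counts = Counter()
for items in class_results.values():
    for item, score in items.items():
        if score == 0:
            incorrect_counts[item] += 1

# Items missed by half or more of the class suggest a shared misconception
# worth addressing with the whole group; isolated errors point to individual needs.
threshold = len(class_results) / 2
for item, count in incorrect_counts.most_common():
    flag = "whole-class follow-up" if count >= threshold else "individual follow-up"
    print(f"{item}: {count} incorrect ({flag})")
```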

Measuring learning over time

Measuring learning involves comparing assessment results across two or more points in time. Learning becomes evident when a learner's achievement increases over time, demonstrating that they are making progress along the learning continuum.

Curriculum as a common reference

A common reference is required if assessment results are to be compared to measure learning. AusVELS is the Foundation to Year 10 curriculum that provides a single, coherent and comprehensive set of prescribed content and common achievement standards, which schools use to plan student learning programs, assess student progress and report to parents. The eleven-level F-10 structure of AusVELS outlines what is essential for all Victorian students to learn and is intended to better cater for personalised learning.

Measuring learner understanding

To measure learning, practitioners can create their own assessment tools or use instruments developed by others. Accurate records of learner performance at each assessment need to be maintained to enable analysis of learning gains and identification of learning needs. These records also provide the evidence needed to give feedback to learners and parents.

Measuring learning involves conducting an appropriate assessment and analysing the data to identify strengths and learning needs. It is not sufficient to look only at the total number of correct responses or observations (sometimes called percentage correct) for each assessment. Simply comparing percentage correct can conceal important aspects of learning. For example, a learner with an average percentage correct may excel at certain aspects of the curriculum and perform quite poorly on others. Such learning gaps need to be addressed to help the learner make progress in their learning.
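
A small worked sketch (with invented strand names and scores) shows how an overall percentage correct can conceal strand-level gaps:

```python
# Hypothetical strand-level results for one learner; strand names and scores
# are invented for illustration. Each strand records (correct, total) items.
strand_results = {
    "Number":      (9, 10),   # strong
    "Measurement": (8, 10),   # strong
    "Statistics":  (3, 10),   # a learning gap hidden by the overall figure
}

total_correct = sum(correct for correct, _ in strand_results.values())
total_items = sum(total for _, total in strand_results.values())
print(f"Overall: {total_correct / total_items:.0%} correct")  # 67%, looks 'average'

# The strand-level breakdown reveals where the learner actually needs support.
for strand, (correct, total) in strand_results.items():
    print(f"{strand}: {correct / total:.0%} correct")
```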

Measuring relative growth in learning

Relative growth is a measure of how much a learner's achievement improves over a specified period of time. This growth can vary depending on factors including the age of the learner, the level of achievement at the start of the period being considered, and the aspect of the curriculum being assessed. For example, learning can be fast for some aspects of the curriculum during the early stages of schooling compared to the later stages. Further, learners starting with a high achievement level may learn at a slower rate than those with a lower achievement level, or their learning may progress at a consistent rate or even accelerate.
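
The sketch below illustrates the general idea of relative growth by comparing a learner's gain with the gains of peers who started at a similar achievement level. It is an illustrative simplification, not the VCAA's published NAPLAN method, and the learner identifiers and scale scores are invented.

```python
# Hypothetical (learner, start_score, end_score) records on a common scale.
cohort = [
    ("A", 420, 470), ("B", 420, 455), ("C", 425, 500),
    ("D", 500, 520), ("E", 505, 560), ("F", 510, 545),
]

def relative_growth(cohort, learner_id, band_width=20):
    """Compare a learner's gain with peers who had a similar starting score.

    This is an illustrative simplification, not the published NAPLAN method:
    peers are taken as anyone whose start score falls within `band_width`
    points of the learner's, and the learner's gain is reported alongside
    the median gain of that peer group.
    """
    start, end = next((s, e) for lid, s, e in cohort if lid == learner_id)
    gain = end - start
    peer_gains = sorted(e - s for lid, s, e in cohort
                        if abs(s - start) <= band_width and lid != learner_id)
    median_peer_gain = peer_gains[len(peer_gains) // 2] if peer_gains else None
    return gain, median_peer_gain

gain, peer_median = relative_growth(cohort, "B")
print(f"Learner B gained {gain} points; peers with a similar start gained about {peer_median}.")
```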

For NAPLAN, the Victorian Curriculum and Assessment Authority uses a concept called Relative Growth to analyse and compare the learning of individuals. Learn more about measuring relative growth with this online tutorial.

Large-scale assessments

Large-scale assessment programs collect large amounts of data about learning. These programs generally reference curriculum, but because they operate across many jurisdictions, each with its own curriculum, they usually develop their own scale for reporting achievement.

Australia's National Assessment Program includes the population-based National Assessment Program – Literacy and Numeracy (NAPLAN). It also includes a number of sample-based assessments that are conducted every three years in the areas of Science Literacy, Civics and Citizenship, and Information and Communication Technology (ICT) Literacy.

Population-based assessments

NAPLAN is a large-scale assessment program that is population-based and able to report at the student, school, state and national level. NAPLAN has its own scale against which student results are reported. NAPLAN also uses the concepts of Bands and Minimum Standards for measuring learning. Visit the Using Assessment Data website for more information about the NAPLAN scale.

The Australian Early Development Index (AEDI) is another assessment program that involves a full census of the population of learners in their first year of formal full-time schooling. The AEDI provides a comprehensive map of early developmental outcomes across Australia. Visit the AEDI website for more information. 

Sample-based assessments

Australia has three national sample-based assessment programs that conduct assessments every three years:

  • Science Literacy, which focuses on Australian students in Year 6
  • Civics and Citizenship, which focuses on Australian students in Year 6 and Year 10
  • Information and Communication Technology (ICT) Literacy, which focuses on Australian students in Year 6 and Year 10

Visit the NAPLAN website for more information about NAP sample-based assessments.

International sample-based assessments

Australia also participates in three international sample-based assessment programs:

  • The Programme for International Student Assessment (PISA), which is conducted every three years by the Organisation for Economic Co-operation and Development (OECD) and samples 15-year-old school students.
  • The Trends in International Mathematics and Science Study (TIMSS), which is conducted every four years by the International Association for the Evaluation of Educational Achievement (IEA) and samples students in Years 4 and 8.
  • The Progress in International Reading Literacy Study (PIRLS), which is conducted every five years by the IEA and samples students in their fourth year of schooling.

International sample-based assessment programs develop their own framework documents and do not reference a particular curriculum, because the assessments are conducted across many jurisdictions. The assessment frameworks detail what knowledge is assessed by each program, and they are used to design and develop the assessment instruments as well as to report results. As the underlying assessment frameworks vary between programs, the reported measures also vary: an assessment of Reading in PISA is different from an assessment of Reading in PIRLS, and an assessment of Mathematics in TIMSS is different from an assessment of Mathematics in PISA.

Sample-based assessments generally cannot support accurate inferences about individual learners. Instead, they focus on making inferences about their target populations.

Using assessment data

Using assessment data with cohorts

Practitioners need to know and understand how their learners are performing as a group to effectively plan learning activities for that group. For some cohorts, performance may be relatively even, while for others it may vary widely or be bunched at different levels along the learning continuum. Valid assessment data can provide the insights required to plan effectively for a cohort.
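
As a rough sketch of how a cohort's spread might be summarised before planning (the scores and the 40-point band width are invented for illustration):

```python
import statistics

# Hypothetical scores for one cohort on a common scale; values are invented.
cohort_scores = [388, 392, 401, 405, 455, 460, 468, 472, 475, 480]

mean = statistics.mean(cohort_scores)
spread = statistics.stdev(cohort_scores)
print(f"Mean {mean:.0f}, standard deviation {spread:.0f}, "
      f"range {min(cohort_scores)} to {max(cohort_scores)}")

# A quick tally by score band shows whether results are spread evenly or
# bunched at different levels along the continuum.
band_width = 40
bands = {}
for score in cohort_scores:
    band_start = (score // band_width) * band_width
    bands[band_start] = bands.get(band_start, 0) + 1
for band_start in sorted(bands):
    print(f"{band_start} to {band_start + band_width - 1}: {'*' * bands[band_start]}")
```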

Assessment data collected for one purpose can also provide insights for another. For example, VCE final exam data is made available to schools by the Victorian Curriculum and Assessment Authority (VCAA) in a variety of report formats to allow schools and teachers to gain insights into the strengths and weaknesses of their programs. While no longer relevant to the cohort for whom the assessment data was collected, VCE assessment data can provide schools with insights to inform planning for the next cohort of VCE students.

Using assessment data with individual learners

Most learners progress through learning as members of their cohort. When an individual learner encounters a difficulty, carries a misunderstanding or finds the work insufficiently challenging, the effective practitioner addresses these particular learning needs as soon as possible. Sometimes a practitioner may need to refer a learner to another practitioner with specialised skills for further assessment and evaluation.

Using assessment data resources

The NAPLAN online tutorial site contains a series of interactive tutorials for improving the use of NAPLAN assessment data.