If you have worked through Levels 3 and 4, you will understand your school’s current provision of support programs and interventions, and you will have developed a priority list of support needs that you wish to address as a matter of urgency.
Focusing on impact rather than activity
Schools do not have the financial or human resources to do everything they would like to do. It is vital that the resources you do have are directed at effective solutions.
Feeling the weight of staff or family expectations, or conscious of Regional Office scrutiny, school leaders can feel rushed to ‘do something’. They might introduce a program, enter into a partnership with an external agency, or arrange a referral. With their huge workloads, it is not unusual for schools to then consider that task ‘done’ and move on to the next.
Stopping to consider the impact of these actions can be seen as ‘time-consuming’ within an already busy workload. But what if the staff time, financial resources and administrative workload associated with one of these programs are not achieving anything? An investment of time in review and evaluation can lead to a dramatic return – less to do, and more of it proving effective.
With this in mind, you can use the Tool to evaluate the impact of programs or interventions if they relate to the risk factors it contains. The most basic example of this is to look at the students who have undertaken a program, and compare this list with the outcome the program was designed to achieve. Some sample questions might include:
- Of our students who undertook Reading Recovery in their early school years, what proportion are achieving at age-appropriate levels by the end of Grade 3?
- Of the students involved in our hands-on, action learning program, how many have continued to be poor attenders or have been suspended this year?
If the program is having a strong impact on outcomes, you can continue with it – or possibly expand it – with confidence. If it is not having an impact, you can explore why. Perhaps certain key success factors are missing in your context: communication, teaching quality, parent engagement, program leadership, etc. You can choose to review and improve the program, or use the resources to implement a better solution.
A program is not a success because ‘the students like it’, or ‘we don’t have to pay for it’, or ‘it shows that we’re serious about this issue’. These might be bonus outcomes of an effective program, but a program is only effective if it ameliorates or resolves the original problem.
There are three ways to use the Tool to monitor change over time:
- Using the data you have entered on the right-hand side
- Taking time-based snapshots
- Storing data on the right-hand side for future reference
Using the data you have entered on the right-hand side
Let’s take the sample question above to illustrate this approach. “Of our students who undertook Reading Recovery in their early school years, what proportion are achieving at age-appropriate levels by the end of Grade 3?”
All students engaged in the Reading Recovery program should have had this recorded in the right-hand side of the Tool. Perhaps your school placed the letters RR next to their name in the ‘Literacy Support’ column. Remember that this information will stay on the Tool unless you choose to remove it.
To identify the proportion that are achieving at age-appropriate levels at the end of Grade 3:
1. Use the filter in the ‘Year Level’ column to select all students in Grade 3.
2. Use the filter in the ‘Literacy Support’ column to select all options containing the letters RR.
3. You now have a list of all students in Grade 3 who were engaged in Reading Recovery. Check the bottom left-hand corner of the screen to see how many there are.
4. Use the filter in the ‘Reading ENG REA…’ column to filter by colour, and select the white box. (You will only be able to do this if you have identified under-achieving students in yellow using conditional formatting.)
5. You now have a list of all students who were engaged in Reading Recovery and are now achieving at age-appropriate levels.
6. Divide the count at step 5 by the count at step 3 to discover the percentage.
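The filter steps above amount to a simple ratio of two counts. The same logic can be sketched outside Excel – here in Python, with invented student records whose fields loosely mirror the Tool’s columns:

```python
# Hypothetical student records mirroring the Tool's columns (illustrative data only).
students = [
    {"name": "A", "year": 3, "literacy_support": "RR", "reading_at_level": True},
    {"name": "B", "year": 3, "literacy_support": "RR", "reading_at_level": False},
    {"name": "C", "year": 3, "literacy_support": "",   "reading_at_level": True},
    {"name": "D", "year": 3, "literacy_support": "RR", "reading_at_level": True},
]

# Steps 1-3: Grade 3 students whose 'Literacy Support' entry contains RR.
rr_grade3 = [s for s in students if s["year"] == 3 and "RR" in s["literacy_support"]]

# Steps 4-5: of those, the students now reading at age-appropriate levels.
at_level = [s for s in rr_grade3 if s["reading_at_level"]]

# Step 6: divide one count by the other for the percentage.
pct = 100 * len(at_level) / len(rr_grade3)
print(f"{pct:.0f}% of Reading Recovery students are at level")  # → 67%
```

With these sample records, two of the three Reading Recovery students in Grade 3 are at level, giving 67%.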
Wherever you have identified students involved in a program or intervention on the right-hand side, you can later consider the attendance, suspensions, or learning outcomes of that particular group.
Taking time-based snapshots
Every time you save the Student Mapping Tool, it will create a dated copy in the folder you have created. If your school refreshes the Tool frequently (to monitor attendance, for example), this folder will soon be full of versions of the Tool.
For the purposes of time-based comparison, the following approach is recommended.
- Select and schedule two times when you will refresh the Tool each year. It is probably most useful to do this early in Term 3, after the mid-year academic results have been entered, and in December when academic results are in CASES21 and attendance/suspensions data has not yet been cleared for the start of the new year.
- Ask all staff to ensure the columns on the right-hand side are up to date before you refresh at these two regular times.
- Apply the full conditional-formatting colour ‘map’ only after these two refreshes.
- Save these two versions of the Tool each year in a separate folder – perhaps called ‘Tool Archives’.
Now you can use all the skills you learned in Level 2 to compare groups of students across time. This will be effective for an individual student (how have they progressed in a particular area over the last two years?) or a program (the targeted students began our new maths program with an average VELS assessment of x; after a full year, their average VELS assessment is y).
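The program-level comparison can be sketched as follows – assuming the relevant column has been read out of each archived snapshot into a simple name-to-score table (all names and figures here are invented for illustration):

```python
# Illustrative VELS assessments for the targeted group, taken from two
# archived versions of the Tool (hypothetical snapshot data).
term3_snapshot = {"A": 2.0, "B": 1.75, "C": 2.25}     # early Term 3 refresh
december_snapshot = {"A": 2.5, "B": 2.25, "C": 2.5}   # December refresh

# Average assessment for the group at each snapshot.
start_avg = sum(term3_snapshot.values()) / len(term3_snapshot)
end_avg = sum(december_snapshot.values()) / len(december_snapshot)

print(f"Average VELS: {start_avg:.2f} at baseline, {end_avg:.2f} at review")
```

The two averages are the x and y of the example above: the group entered the program averaging 2.00 and finished the year averaging roughly 2.42.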
Storing data on the right-hand side for future reference
If you wanted to monitor the progress of a group of students involved in a maths support program, for example, there is another way to do so. When the students first commence the program, capture their last recorded VELS results.
- Enter the code for the support program in the ‘Maths Programs’ column. Let’s call it MC for Maths Champions.
- Create a new column on the right-hand side titled something like ‘Maths Champions baseline Feb 08’.
- Use the filter on the ‘Year Level’ column to select the Year Level(s) at which students are able to participate in the program.
- Place the cursor in the first cell in the ‘Number MAT’ column and hold down ‘control’, ‘shift’ and ‘down arrow’ together. This will select all of the populated cells in the column below the cursor.
- Right-click on the selected cells and choose ‘copy’.
- Scroll across to the right-hand side, past all of the columns already in use, and place the cursor in the first cell below your new column heading.
- Right-click on the cell and select ‘paste’.
You now have this baseline data stored for future use. In this instance, you stored the data for the whole year level so that you can easily compare the average VELS progress of those in the program with the average VELS progress of those not in the program. In February the following year, say, you can compare the baseline data with the refreshed data on the Tool.
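The comparison that the stored baseline makes possible can be sketched like this – a hypothetical example in which the ‘MC’ code and every figure are invented for illustration:

```python
# Illustrative records: baseline VELS captured when the program began, plus
# the refreshed result a year later, with 'MC' marking Maths Champions students.
students = [
    {"maths_program": "MC", "baseline": 2.0,  "current": 2.75},
    {"maths_program": "MC", "baseline": 2.25, "current": 2.75},
    {"maths_program": "",   "baseline": 2.0,  "current": 2.25},
    {"maths_program": "",   "baseline": 2.25, "current": 2.5},
]

def avg_progress(group):
    """Mean VELS gain from the stored baseline to the current refresh."""
    return sum(s["current"] - s["baseline"] for s in group) / len(group)

# Because the baseline was stored for the whole year level, both groups
# can be compared against the same starting point.
in_program = [s for s in students if s["maths_program"] == "MC"]
not_in_program = [s for s in students if s["maths_program"] != "MC"]

print(f"Maths Champions gain: {avg_progress(in_program):.2f}")
print(f"Other students gain:  {avg_progress(not_in_program):.2f}")
```

In this made-up data the program group gained 0.62 of a VELS level on average against 0.25 for the rest of the year level – exactly the contrast the stored baseline lets you test with real figures.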
If you decide to use this approach, you may find the Excel ‘freeze panes’ function useful.
- Place the cursor in the first cell of the column TO THE RIGHT OF the one in which you are interested. If you are interested in the ‘Number MAT’ column, you would place the cursor in the first cell of the ‘Progress ‘at risk’ summary’ column.
- Click on the ‘View’ tab at the top of the screen.
- From the ‘Window’ box, select ‘Freeze panes’.
- Select the ‘Freeze panes’ option.
As you scroll across, you will find that everything to the left of the column you selected remains on screen while everything to the right scrolls past. You can now position your ‘baseline’ column right beside the ‘Number MAT’ column on screen.
It is strongly recommended that you copy the NAPLAN data for each year level one at a time and, using the techniques above, paste it on the right-hand side in a column labelled 'Past NAPLAN'. In this way, past data can be saved from year to year to compare the progress of each student over time.
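Sketched the same way (with hypothetical student IDs and band scores, invented for illustration), the stored ‘Past NAPLAN’ column enables a simple per-student comparison:

```python
# Illustrative NAPLAN reading bands stored year on year, keyed by student ID.
past_naplan = {"S01": 4, "S02": 5, "S03": 3}      # saved 'Past NAPLAN' column
current_naplan = {"S01": 6, "S02": 5, "S03": 5}   # this year's refreshed data

# Progress per student: current band minus the stored band.
for student_id, past_band in past_naplan.items():
    gain = current_naplan[student_id] - past_band
    print(f"{student_id}: moved {gain} band(s)")
```

Students who have not moved (or moved backwards) stand out immediately, which is the kind of question the saved column is meant to answer.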
Remember you can hide these additional columns containing baseline data until you wish to review them.
Reporting your findings
Every school has a ‘story’ – if you ask the staff, the students and the parents, they will tell you the ‘story’ of the school. Their versions might not all be the same, but there will be some common elements – the school is strong in certain areas, faces particular challenges, is clearly trying to improve in some areas, and is ‘in denial’ about others.
Data is one of the few ways you can change the ‘story’ – the way people perceive your school and its challenges. If you share what you learn through the Tool, you can show that some perceived issues are illusory, some are being successfully addressed, and others are expanding and need urgent attention.
For most audiences, it is important that you do more than quote the statistics. They will need you to include the statistics in a story about the school and its direction. The information will also be more readily absorbed if it is released in small, themed portions.
If you have worked through all the levels of engagement to this point, you are now able to extract an enormous amount of information easily and, as you become familiar with the Tool, rapidly.