We believe in NAPLAN… and we don’t

David Wright, CEO

At Edrolo we believe in NAPLAN. And we don't.

NAPLAN's assessment approach is based on well-regarded international tests such as the MAP test administered by NWEA. The same method of assessment underpins international assessments such as PISA, which measure growth in the competencies generally accepted as critical to the future success of our students.

NAPLAN is Australia's attempt to measure how well we, as a national education system, are setting our students up for future success in the competencies of literacy and numeracy.

Australia has done poorly in these international assessments over the past decade, consistently underperforming its peers. And so we should care a lot about NAPLAN.

Why we believe in NAPLAN (At least in principle...) 

At Edrolo, we care a lot about skills and competencies:

1. Boring, maybe?… Certainly not to us! Our schools must trust that our system meets and exceeds the requirements of the relevant curriculum. Almost all curricula have evolved to require at least a minimal, and sometimes more significant, development of skills and competencies.

2. We know from our research into how to achieve the best possible results that skills and competencies - including literacy, numeracy, problem solving and understanding - are essential to performing well in standardised Year 12 testing.

In a recent research study we performed (and shared with our schools), we were able to demonstrate that one of the leading causes of lost marks in Year 12 HSC exams was weakness in non-subject-specific skills such as representing data - in one specific example, graphing in Science subjects.

In exam preparation and practice, the capability to handle unfamiliar situations is not only a major contributor to success; all curriculum assessment bodies in Australia are also moving towards a greater proportion of unfamiliar questions.

3. Parents care and employers care. Results matter, but they are a means to a more important end: the most successful future for our students. Recruiters put a much lower premium on recollection of knowledge, given that access to knowledge is already ubiquitous and mostly free, and AI is just the latest accelerant of that shift.

What employers value in employees is knowing what questions to ask, problem-solving, resolving conflicting information, understanding what the problem actually is, and being able to communicate and collaborate. Employers care a lot about these skills and have been raising the red flag on the failure to address shortfalls in these areas for at least 20 years.

Parents are mostly employees and employers themselves - so they know how important this is. They don't know if their child will go to university or not, or what career path they might take, but they do know any path will be improved with these skills and competencies, and harder without them. When selecting schools, NAPLAN results are one of the most widely used comparators.

And now some universities are rising to this challenge. As a former vice president of a university, I saw what every employer sees: students coming into university and struggling because of a lack of these skills and competencies. These struggles continue well into their graduate careers. Universities are now both working with schools to develop these skills earlier so students don't fall behind, and seeking out talented students based on skills and competencies.

We believe addressing skills and competencies, including numeracy and literacy, is vital to the success of our schools and students, and we believe that you can't sustainably improve a system without measuring it.

Investing in skills and competencies

As a company we have invested heavily in skills and competencies, and have built them into the core fabric of our learning and teaching system.

Every subject has clear and significant learning, teaching and practice activities to grow students' skills and competencies.

This is not easy to do properly - it is easy to copy a question from previous exams or other materials. It is much harder to design the question, answer and explanation to develop multiple outcomes, and harder again to scaffold them in a way that builds towards, and tests, what will be needed at some point in the future, like a Year 12 exam. Some of our subjects can take over a year of effort just to achieve this.

For a teacher or a student to sustainably build a skill or competency (what we call 'making learning stick'), exposure alone is not enough; the skill needs to be practised and applied often. That requires what schools don't have a lot of: time!

So you need to create lots of opportunities to practise, and make those activities progressive so that skills are enhanced - not seen once and forgotten. But without additional time, these opportunities have to be embedded in current activities.

This is the challenge we have taken on. Most of our users don't even realise the amount of skill and competency development in our product, though we have tried to highlight where it sits in many of our new releases.

For example, in our Year 7-10 Maths resources, this is best seen in our problem-solving and reasoning sections. Our 'Remember this' sections (built to enable spaced repetition) partly use NAPLAN questions as antecedents.

Several teachers have picked up on this, especially those in schools that are focusing on NAPLAN, and they use those questions a lot.
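For readers curious about the mechanics, spaced repetition is simple to sketch: the gap between reviews of a question grows each time a student answers it correctly, and resets when they slip. The Python below is a minimal illustrative scheduler only - the doubling rule and the names (ReviewItem, schedule_next_review) are assumptions made for this sketch, not our actual implementation.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class ReviewItem:
    """One practice question in a student's review queue."""
    question_id: str
    interval_days: int = 1                      # gap until the next review
    due: date = field(default_factory=date.today)

def schedule_next_review(item: ReviewItem, answered_correctly: bool) -> ReviewItem:
    """Expanding-interval rule: double the gap after a correct answer,
    reset to one day after a mistake."""
    if answered_correctly:
        item.interval_days *= 2                 # 1 -> 2 -> 4 -> 8 days ...
    else:
        item.interval_days = 1                  # start the sequence again
    item.due = date.today() + timedelta(days=item.interval_days)
    return item

# A student answers a 'Remember this'-style question correctly twice:
item = ReviewItem("numeracy_q17")               # hypothetical question id
item = schedule_next_review(item, True)         # due in 2 days
item = schedule_next_review(item, True)         # due in 4 days
print(item.due, item.interval_days)
```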

In Science, skills development is perhaps best seen in our SHE sections (Science as a Human Endeavour - weird acronym though, right?), and significant foundational skill elements are incorporated throughout every set of questions or activities.

This investment never stops. Daily, we are learning what works and what does not, both optimising the system and feeding information back to our schools as input to their decisions and strategies - this is our 'data-driven learning loop'. With over a million students and millions of learning loop actions, we have a lot of data to draw on to continually improve the system.
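To make the shape of that loop concrete, here is a minimal sketch of its aggregation step: collect question attempts, roll them up per skill, and flag the skills where a cohort is struggling so that information can flow back to teachers. It is illustrative only - the record format, the weak_skills name and the 70% threshold are assumptions made for this sketch, not our actual data model or criteria.

```python
from collections import defaultdict

# Hypothetical attempt records: (student_id, skill, answered_correctly).
attempts = [
    ("s1", "graphing", True),
    ("s2", "graphing", False),
    ("s3", "graphing", False),
    ("s1", "fractions", True),
    ("s2", "fractions", True),
]

def weak_skills(attempts, threshold=0.7):
    """Aggregate attempts per skill and flag skills whose cohort-wide
    success rate falls below the threshold - the feedback half of a
    data-driven learning loop."""
    totals = defaultdict(int)
    correct = defaultdict(int)
    for _, skill, ok in attempts:
        totals[skill] += 1
        correct[skill] += ok
    rates = {skill: correct[skill] / totals[skill] for skill in totals}
    return {skill: rate for skill, rate in rates.items() if rate < threshold}

print(weak_skills(attempts))    # {'graphing': 0.333...} -> flag for follow-up
```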

Why NAPLAN isn’t working

Despite the good intentions of governments in implementing NAPLAN, in the public arena it has become a blame game.

Schools, and everyone in them, are working as hard as they can to educate every student. They want these outcomes for students but also need to be confident that the solutions proposed are worth time that is already stretched to breaking point or beyond. NAPLAN is a significant effort, but the benefits to students are at best highly uncertain and at worst, as the published results suggest, are taking our students backwards.

Like our schools, governments want to see a measurable improvement in skills and competencies for our students, and are trying to implement a critical plank for the success of any change - a measurement framework. They have also increasingly provided more useful information with the results.

There is a saying (I am not sure who the author is): 'Life (experience) is a hard taskmaster, she gives you the exam first and the lesson later!' NAPLAN tests something that isn't actively developed to the level needed to get the results governments are looking for, and assumes the lesson will be delivered later. The lesson should be provided first - not just knowledge delivery, but learning, teaching and practice that is measured and continuously improved.

Lagging data

NAPLAN assessments are done every 2 years, starting in Year 3 and finishing in Year 9. A 2-year gap means that too much time has gone by to look at cause and effect - students have experienced many different subjects, teachers and even schools in that time.

This is lagging data that needs to be leading data. The best similar systems in the world assess at least twice a year, so schools can adjust within the current class, with the same teachers and the same students. Some schools even practise and assess skills weekly. We think this should be daily, but twice a year would be a major improvement and could be done efficiently with current technology like ours.

The results used to arrive 3 months late; now preliminary results are supposed to arrive in 4 weeks. This is a significant improvement - previously the year was more than half gone before the results arrived.

However, the results point to deficiencies only once every 2 years, and cannot be correlated with a measurement framework based on the actions that are happening all the time. So even though the turnaround is one month, that month is not that relevant.

Optimising a system once every 2 years against a result from 2 years ago makes little sense. A school getting results in May can optimise for that point in time, but this is the middle of the school year, when planning is already done and large shifts in strategy are hard - and those strategies and circumstances will change at the end of the year. Results would perhaps be better timed to line up with planning cycles, but even then, doing this every second year would have limited value.

A much better outcome would be for schools to calibrate regularly against ongoing measures of growth, so that minor, less time-draining adjustments can be made regularly, driven by cause-and-effect analysis. Asking which of the activities over the past two years contributed in what measurable way to a result doesn't work; asking which activity over the last week, or even term, made a difference does.

More granular data

The data provided in the results is also not granular enough to use effectively to help a student, their parents or carers, their teacher or their school. Data in school and private reports ranks students against others in similar years on a limited set of aggregated dimensions.

This is of limited use because it is not part of a coordinated system that measures ongoing progress against the skills and competencies that make up those aggregated dimensions.

To make change, you need granular data to inform our professional experts - our teachers - so that they can use their skills to improve results. You need to give them the tools, training and development to help deliver those results, and you need easy-to-execute activities that can be actioned immediately, measured, and fed back to show what worked and what did not.

Education is a team sport, so all this needs to be brought together in a truly collaborative way - one that provides the help first and the measurement later.
