Research, Innovation and Enhancements

NCEA Online delivers digital exams to complement the long-standing paper-based approach. We are now looking to identify and take up new assessment opportunities offered by digitally enabled teaching and learning.

This work will help us consider how we might deliver future digital exams. Twenty-first-century assessment can contribute to the equity of NCEA outcomes by engaging students in new ways, stimulating changes in teaching and learning, and providing new ways for students to demonstrate their knowledge.

NZQA’s vision (depicted below) for equitable, innovative, credible and robust digital assessment involves us working with education leaders, teachers, students and education agencies to co-design the approach.

The following innovation and research work may result in further research, trials, pilots or enhancements to NCEA Online and the way we manage it over the next few years.  We will update this section as we progress this work.


[Infographic: Qualify Future World navigation]

Innovation Trials in Assessment Practice

Te Reo Rangatira

What is involved: Selected schools and kura that offer Te Reo Rangatira will trial a modified digital assessment so students can listen to the instructions, questions and resources as well as read them.

Why we are doing this work: Māori students have told us they would like to hear as well as read the content of an exam. We are using Te Reo Rangatira to first test this idea in the context of te ao Mātauranga Māori. This may enable Māori students to have a richer and more equitable digital exam experience.

Te ao Māori | Assessment integrity | Adaptability

Digital Statistics and Maths (Calculus) Assessment

What is involved: For subjects such as Statistics and Calculus, which are hard to assess digitally, we are trialling third-party software already used by some schools. The trial involves the current assessment platform and assessment methods.

Why we are doing this: We want to provide digital assessment across all subjects and a good user experience for subjects that are difficult to assess digitally. If students can use familiar software (software they have already been using for their learning), it could increase their engagement and attract a broader range of candidates.

Progress update: We have completed and analysed the two maths innovation trials carried out in 2020. More information is available in the Digital Mathematics report (PDF, 851KB) and Digital Statistics report (PDF, 912KB).

Digital first | Accessibility and usability | Assessment integrity

Squirm Germ Gamified Assessment

What is involved: Biology Achievement Standard 90927 has been developed into a game-based assessment. In 2020 selected schools will participate in a trial in which students attempt the assessment and record their responses.

Why we are doing this work: We want to find out whether game-based assessments could be more widely used in future, for example to enable candidates to better show what they know by being more engaging and authentic.

Progress update: The innovation trial of the game-based biology assessment met the requirements of the standard, and there was a high level of interest and engagement from students and teachers. More information is available in the summary report (PDF, 287KB).

Digital first | Assessment integrity | Accessibility and usability

Research into Assessment Practice

Te Reo Māori (TRM) text-to-speech and spellcheck

What is involved: We are working with the University of Waikato to look at how we could offer, in te reo Māori, the spellcheck capability present in our English-medium exams. This work will also look at providing text-to-speech functions in te reo Māori.

Why we are doing this work: Providing spellcheck and text-to-speech functionality in te reo Māori can improve equity of access and outcomes for te reo Māori speakers, better reflecting how they learn.

Te ao Māori | Assessment integrity | Accessibility and usability

Special Assessment Conditions software stocktake

What is involved: This research will identify technology being used in New Zealand to support students with Special Assessment Conditions (SAC) so we can plan for innovation trial(s) to test the compatibility of widely used applications with the exam platform.

Why we are doing this work: Allowing students in digital exams to use the assistive technology they already use to support their learning will increase equitable access and outcomes.

Progress update: We have completed a stocktake of assistive technology tools (covering each tool's application, purpose, supplier and platform), who is using them, and what further research and analysis is needed for NZQA to consider approving the use of assistive technology in digital assessments. More information is available in the summary report (PDF, 462KB).

Accessibility and usability | Adaptability

More than one assessment opportunity a year

What is involved: We are researching how we could use the exam platform so that NZQA and schools could offer digital exams at various times during the year.

Why we are doing this work: There is strong demand from schools to offer digital assessments, and there is potential for NZQA to stage exams at different times of the year so students can be assessed closer to the time they undertake their learning.

Progress update: In 2020 NZQA worked with the University of Waikato to complete a general literature review and synthesis of key findings from research studies. More information is available in the summary (PDF, 346KB) and full report (PDF, 1MB).

Assessment integrity | Digital first | Adaptability

Remote supervision tools

What is involved: This research aims to identify remote supervision tools suitable for NCEA digital exams and assessments. During 2020 we will identify the tools available, their capabilities and how they are used, to see what opportunities these could offer for students and NZQA.

Why we are doing this work: This could lead to greater flexibility and resilience of exam and assessment delivery, increasing the types of places and geographical locations where students can be digitally assessed, and supporting human supervision with new tools.

Accessibility and usability | Adaptability | Assessment integrity

Automated marking

What is involved: We are exploring how well suited automated marking of essay questions is to the NCEA context: what is involved in set-up, and how marking software works in practice. During 2020 the University of Alberta's Centre for Research in Applied Measurement and Evaluation (CRAME) will use some past NCEA digital essay responses to complete automated marking, with the results analysed by NCEA markers.

Why we are doing this work: This trial will help us understand if automated marking could enable student responses to be marked in real time and for different questions to be presented to students based on their previous response. This form of adaptive testing allows the software to choose which question to present next.

Progress update: We have worked with NCEA exam markers to review and analyse the results of CRAME’s automated essay scoring system. We are now planning a further trial of automated marking using a larger number of NCEA exam scripts and one or more different vendors. More information is available in the summary (PDF, 330KB) and full report (PDF, 1.5MB).

Assessment integrity | Digital first | Adaptability

Scoping new services

Alternative delivery methods

What is involved: We are working with schools in New Zealand, Niue and the Cook Islands that lack access to reliable internet connections, a barrier to participating in digital exams. Once we understand the environment, we will trial alternative methods of delivering a digital exam that best fit the schools’ needs and integrate with the digital exams. The trials will be run outside the end-of-year exam period.

Why we are doing this work: This could enable schools without access to reliable internet connections to participate in digital exams and may also add resilience into the system by supporting the delivery of digital exams in the event of major incidents or natural disasters.

Progress update: This work has been paused due to COVID-19 and will be progressed in 2021 if the situation allows.

Digital first | Accessibility and usability

Externally sourced assessment – digital practice exams delivered on NZQA’s online exam platform

What is involved: We are exploring how NZQA’s online exam platform could be used in schools so teachers can develop and deliver practice exams and activities.

Why we are doing this work: If the online exam platform could be used in schools, it would help teachers and students become more familiar with digital assessment and the platform.

Progress update: In 2020 NZQA worked with three subject associations so that schools could offer digital practice exams for English, Agricultural and Horticultural Science, and Classical Studies at all levels using NZQA’s online exam platform. Digital practice exams in Te Reo Māori were also offered at all levels. This was the first opportunity to make digital practice exams available to schools in a limited way.

We have evaluated feedback from the formal surveys sent to schools and students about their experience of participating in digital practice exams. More information is available in the summary (PDF, 344KB), full report (PDF, 1.4MB) and summary dashboard (PDF, 242KB).

Data as an asset | Digital first | Accessibility and usability

Submitted subjects

What is involved: This work is looking at using NZQA’s online platform to mark externally assessed work that is not done by examination. We will research what solution is needed for submitting work for NCEA external standards where assessment is not administered as an exam.

Why we are doing this work: Students are starting to generate portfolios digitally. We want to provide a way for teachers to upload school-based content and submit it for assessment by NZQA markers, removing the need for printed materials. If the trial meets requirements, a controlled pilot could be held in 2021.

Progress update: The initial stage of the project is nearing completion.  NZQA has been talking to teachers to understand their needs with submitting student evidence and has been compiling information about subject specific requirements.  This information will be used to select and design the best solution.

Digital first | Adaptability | Accessibility and usability

Scholarship online

What is involved: We are undertaking preliminary research to establish the case for a controlled pilot of a digital Scholarship exam.

Why we are doing this work: Increasing numbers of students have completed their NCEA external exams and course work digitally and may expect to complete their Scholarship exam in the same way. If this initial research suggests there is value, a pilot could look at the impact on students, marking and results of candidates being able to complete their Scholarship exams digitally, in the same way as NCEA exams.

Progress update: Research completed in 2020 supported NZQA moving to provide digital Scholarship exams, and planning for this is underway, beginning with a single text-based subject. More information is available in the summary report (PDF, 145KB).

Digital first | Adaptability | Assessment integrity

Digitally delivered Common Assessment Tasks

What is involved: We are investigating the feasibility of offering one or more of the Digital Technology Common Assessment Tasks (DCAT) on the NCEA Online platform in future years.

Why we are doing this work: Increasing numbers of students have completed their NCEA external exams digitally and may expect to complete other kinds of assessment in the same way.

Digital first | Adaptability | Accessibility and usability

Te Waharoa Ākonga | Student Services

What is involved: This work will tailor the look and feel of the student landing page to improve the experience of students when they log in to the NZQA website with their username and password.

Why we are doing this work: This will give students direct access to personalised information about the digital exams they are entered in, reduce the time taken to log in and access the correct exam, and reduce the need for assistance from supervisors and Exam Centre Managers (ECMs) at the beginning of a digital exam session.

Accessibility and usability | Digital first

Designing new functionality

Scanned paper responses

What is involved: This trial will scan a selection of completed past exam paper booklets and process them through the digital marking platform, allowing marking to be completed digitally.

Why we are doing this work: We are aiming to bring paper and digital marking together and reduce paper handling within the exam cycle. If this trial leads to scanning all paper responses, they will be marked and returned to students using the digital systems. This would eliminate the current two-week lag in returning paper responses to students after the release of results, and reduce the need for separate quality assurance mechanisms for paper and digital responses.

Progress update: The trial gave NZQA confidence that scanned responses can be marked in the digital marking application RM Assessor3 and that the process is fit for purpose. Recommendations from the trial include preparing a brief for the next phase of this work and further scoping the requirements. This will be done along with benchmarking and check marking during 2021. More information is available in the summary report (PDF, 137KB).

Accessibility and usability | Digital first | Adaptability | Data as an asset

Enhancing benchmarking and check marking

What is involved: NCEA digital exams are marked online using the RM Assessor e-marking tool. We are using 2019 digital responses to test the benchmarking and check marking features in the marking application of the digital exam platform. This will enable all marking processes (for both digital and scanned paper responses) to be performed in one digital environment. This trial may result in piloting wider use of the digital marking tool.

Why we are doing this work: Any resulting improvements to marking information accuracy will help sustain confidence in the assessment system. Benefits would come from more streamlined delivery of candidate responses to markers and new ways of doing quality assurance.

Progress update: Panel leaders and check markers have completed their trial benchmarking activities using RM Assessor3, finding that, overall, the functionality trialled is useful for carrying out benchmarking and quality assurance processes digitally. Recommendations from the trial include progressing digital benchmarking and check marking alongside scanning during 2021. More information is available in the summary report (PDF, 110KB).

Assessment integrity | Data as an asset | Digital first

Assessment Master (AM) Planner

What is involved: This trial will assess some of the features of the digital assessment platform that support the planning and provision of digital exam centres. It will also assess the reporting capability for digital exam management, and the attendance, notification and messaging features.

Why we are doing this work: Testing this new system will allow us to decide if it is ready for use in the digital NCEA exams. This will improve efficiency and the exam experience for Exam Centre Managers (ECMs) and students.

Progress update: The trial showed that none of the basic functionality tested entirely met the needs of NZQA or exam centre staff. Next steps involve better understanding the needs of the groups who would benefit from the platform features trialled, and analysing how the AM Planner functions match the needs of potential user groups. More information is available in the summary report (PDF, 111KB).

Accessibility and usability | Digital first | Data as an asset
