Evaluation of NCEA Online
We’ve been giving students a chance to do some of their NCEA exams online since 2015 (through trials, pilots, practice exams, and end-of-year exams) and thousands have taken part. Students tell us that they prefer it.
Digital learning and assessment are now well established, with increasing numbers of schools and students taking part. Find out what they have to say.
2021 NCEA Online evaluation
Students who sat 2021 NCEA Online exams were asked to complete an online survey after each end-of-year digital exam, measuring their self-reported satisfaction and experience. Our evaluation combines key insights from this student feedback with data on student and school participation rates. NZQA uses this feedback and take-up information to plan ongoing offerings of digital assessment, in collaboration with schools and the wider education sector.
NCEA Online 2021 Summary Dashboard (PDF, 474KB)
2021 Digital Practice Exams reports
A key element of the NCEA Online Programme, and a way that NZQA can support the continuing growth of online assessment, is providing schools with the opportunity to offer digital practice exams using NZQA’s online exam platform.
We would like all kura / schools to have offered ākonga / students the opportunity to do at least one NCEA Online exam by the end of 2022.
To support this journey, all schools had the chance to offer digital practice exams using our online platform from 2 August to 31 October 2021. This window let schools pick a time in Term 3 through to early Term 4 that best suited them.
A total of 11,782 students from 152 schools participated, with 857 students doing more than one session. Participation levels were reduced due to the impact of COVID-19 on schools and students but still showed a significant increase from 2020, when 6,236 students from 96 schools participated in at least one digital practice exam.
We asked participating schools and students for feedback about their experience. Find out what they had to say in the summary report (PDF, 525KB) and summary dashboard (PDF, 587KB).
2020 NCEA Online evaluation
NZQA’s findings from the 2020 NCEA Online evaluation show that the programme is well supported by schools and students, who continue to strongly enjoy the digital exam experience.
Students were asked to participate in an online survey after each digital exam to measure their self-reported satisfaction and experience. This feedback has been valuable and is already informing the next steps NZQA is taking with digital assessment, in collaboration with schools and the wider education sector.
Psychometric and statistical analysis of results, conducted for English at Levels 1, 2 and 3, History at Level 1 and Business Studies at Level 2, showed no evidence of disadvantage for students sitting a digital examination compared with the paper equivalent.
NCEA Online 2020 Summary Dashboard (PDF, 12MB)
Student Experience Report (PDF, 676KB)
2020 Psychometric Analysis Report
Statistical analyses compared 2020 student performance on the digital and the paper examination formats.
There was no conclusive evidence of a difference in achievement between these two groups of students that could be attributed to the examination format (digital or paper-based).
These findings are consistent with previous years, but for a larger group of students. Examinations covered by the analyses in 2020 included English Levels 1, 2 and 3, History Level 1 and Business Studies Level 2.
The same analyses were conducted for Māori students who participated in the Level 1 English digital examination. Their achievement was compared to a ‘matched ability’ sample of Māori students who did the paper-based Level 1 English examination. There was no evidence of a difference in achievement between these two groups of students either.
For further information, please email PRS@nzqa.govt.nz
Matched data analysis methodology detail
The matched data analyses are described in detail below, followed by an illustrative code sketch of the procedure.
A. Identify matching sets of students
1. Identify all digital-format students with at least one paper-format student with identical results in internal achievement standards at the same level.
2. Identify all paper-format students that are eligible for matching to students identified in Step 1, i.e. paper-format students with at least one digital-format student with identical results in internal achievement standards at the same level.
B. Generate 100 resamples and perform analysis
3. For each digital-format student identified in Step 1:
a. Identify all paper-format students with identical results in internal achievement standards at the same level.
b. From the list of paper-format students identified in Step 3a, randomly sample one student.
4. Calculate the grade distribution for externally assessed standards of matched paper-format students from Step 3.
5. Conduct Rasch analysis to estimate the difficulty parameters for externally assessed standards of matched paper-format students from Step 3.
6. Repeat Steps 3 to 5 a total of 100 times.
C. Compile summary of results
7. For each external standard-grade combination, calculate the average of the 100 percentages generated in Step 6.
8. For each external standard-difficulty combination, calculate the mode (most commonly occurring value) of the 100 difficulty parameter estimates generated in Step 6.
9. Compare the results of Steps 7 and 8 with the corresponding values from matched digital-format students.
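To make the structure of Steps A to C concrete, the sketch below shows one way the matching and 100-resample loop could be implemented in Python. It is illustrative only: the student record layout, the internal_signature helper and the grade-proportion summary are assumptions made for this example, and the Rasch difficulty estimation (Step 5) is noted but not implemented, since in practice it would be run with a dedicated psychometric tool.

```python
import random
from collections import Counter, defaultdict

# Illustrative sketch only. The record layout (dicts with "internal_results"
# and "external_results") and helper names are assumptions for this example,
# not NZQA's published implementation.

GRADES = ["Not Achieved", "Achieved", "Merit", "Excellence"]

def internal_signature(student):
    """Key a student by their internal achievement standard results at the level."""
    return tuple(sorted(student["internal_results"].items()))

def matched_resample(digital_students, paper_students, n_resamples=100, seed=0):
    rng = random.Random(seed)

    # Step A: group paper-format students by identical internal results and
    # keep only the digital-format students with at least one possible match.
    paper_by_signature = defaultdict(list)
    for p in paper_students:
        paper_by_signature[internal_signature(p)].append(p)
    matchable_digital = [d for d in digital_students
                         if internal_signature(d) in paper_by_signature]

    # Step B: repeat the random one-to-one matching and, for each resample,
    # record the matched paper sample's grade distribution per external standard.
    distributions = defaultdict(list)  # standard -> one distribution per resample
    for _ in range(n_resamples):
        sample = [rng.choice(paper_by_signature[internal_signature(d)])
                  for d in matchable_digital]
        counts_by_standard = defaultdict(Counter)
        for s in sample:
            for standard, grade in s["external_results"].items():
                counts_by_standard[standard][grade] += 1
        for standard, counts in counts_by_standard.items():
            total = sum(counts.values())
            distributions[standard].append({g: counts[g] / total for g in GRADES})
        # Step 5 (Rasch difficulty estimation on the matched sample) is omitted
        # here; it would be performed with a dedicated psychometric package.

    # Step C: average the per-resample grade distributions for each external standard.
    return {standard: {g: sum(d[g] for d in dists) / len(dists) for g in GRADES}
            for standard, dists in distributions.items()}
```

The averaged distributions returned by matched_resample would then be compared with the corresponding grade distributions of the digital-format students, as in Step 9.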
2020 Digital Practice Exam reports
In 2020, schools told us that the impacts of COVID-19 would make it especially useful to have some of their practice exams on the same platform as the end-of-year NCEA exams. If conducted under exam conditions, the results could be used for derived grades or unexpected event grades. So, we brought our plans forward and in May 2020 began preparing to offer this first opportunity for a limited range of digital practice exams.
We worked with 3 subject associations so that schools could offer digital practice exams for English, Agriculture and Horticulture Science, and Classical Studies at all levels. Digital practice exams in Te Reo Māori were also offered at all levels.
Digital practice exams were available from 31 August to 25 September 2020, with schools identifying the week that they intended to participate so that our business teams could plan the support we were going to provide. 6,236 students from 96 schools participated, with 264 students doing more than one session. Given the short preparation time for this initiative, participation levels were greater than we expected.
Several schools gave us anecdotal feedback about their experience with the digital practice exams. This included comments about how well the exams went and that the supporting guides were useful, along with constructive feedback on areas for improvement.
We also asked for formal feedback from all participating schools and students through surveys of students, school exam administrators, supervisors and markers.
More information is available in the summary (PDF, 344KB), full report (PDF, 1.4MB) and summary dashboard (PDF, 242KB).
2019 NCEA Online evaluation
NZQA’s findings from the 2019 NCEA Online evaluation show that the programme is making good progress and students continue to strongly enjoy the digital exam experience.
Students were asked to participate in an online survey after each digital exam to measure their self-reported satisfaction and experience. This feedback has been valuable and is already informing the next steps NZQA is taking with digital assessment, in collaboration with schools and the wider education sector.
Psychometric and statistical analysis of results, conducted for English at Levels 1, 2 and 3 and History at Level 1, showed no evidence of disadvantage for students sitting a digital examination compared with the paper equivalent.
2019 Psychometric Analysis Report
Statistical analyses compared 2019 student performance on the digital and the paper examination formats.
There was no conclusive evidence of a difference in achievement between these two groups of students that could be attributed to the examination format (digital or paper-based).
This finding is similar to findings in 2017 (PDF, 1.1MB) and 2018 (PDF, 674KB), but for a larger group of students. Examinations covered by the analyses in 2019 included English Levels 1, 2 and 3 and History Level 1.
The same analyses were conducted for Māori students who participated in the Level 1 English digital examination. Their achievement was compared to a ‘matched ability’ sample of Māori students who did the paper-based Level 1 English examination. There was no evidence of a difference in achievement between these two groups of students either.
Read the full Psychometric Analysis Report (PDF, 2MB)
Trials and pilots 2015-2018
NZQA completed a range of trials and pilots between 2015 and 2018 to test our processes, build on our learnings and enable schools to evaluate their readiness to manage digital assessments. The findings of each of these are included in the following:
2018
- The 2018 Evaluation Reports are summarised in a single A3 page (PDF, 767KB)
- Student experience pilot (PDF, 177KB)
- Exam Centre Manager/Supervisor report (PDF, 132KB)
- Marker report (PDF, 115KB)
- Student pilots report (English, Classical Studies, Media Studies) (PDF, 177KB)
- Science student trial report (PDF, 171KB)
- Student trials report (PDF, 177KB)
- Teacher and Principal Nominee experience evaluation report (PDF, 291KB)
- Psychometric Analysis Report (PDF, 674KB)
2017
Detailed Reports
- Teacher Survey - Digital Trials 2017 (PDF, 387KB)
- Student Survey - Level 1 Digital Trial examinations 2017 (PDF, 398KB)
- Exam Centre Manager/Supervisor Survey - Digital Pilots 2017 (PDF, 308KB)
- Student Survey - Level 1 and 2 Digital Pilot examinations 2017 (PDF, 318KB)
- Marker Survey - Digital Pilots 2017 (PDF, 265KB)
- Principal's Nominee Survey - Digital Trials and Pilots 2017 (PDF, 165KB)
- Psychometric Analysis Report 2017 (PDF, 1.1MB)
2016
Appendices
- Appendix A: Psychometric and statistical analysis, English Level 1 externally-assessed achievement standards using digital medium (PDF, 327KB)
- Appendix B: Student survey – English Level 1 Digital Pilot 2016 (PDF, 128KB)
- Appendix C: Student survey – Media Studies and Classical Studies Level 1 Digital Pilots 2016 (PDF, 124KB)
- Appendix D: Student survey – Level 1 Digital Trials 2016 (PDF, 127KB)
- Appendix E: Teacher survey – Digital Trials 2016 (PDF, 166KB)
- Appendix F: Marker survey – Digital Pilots 2016 (PDF, 147KB)
- Appendix G: Examination Centre Manager/Supervisor survey 2016 (PDF, 189KB)
2015
- 2015 Digital External Assessment Prototypes Project (DEAP): key learnings (PDF, 105KB)
- 2015 Digital External Assessment Prototypes Project (DEAP): summary report (PDF, 582KB)
- 2015 Special Assessment Conditions Accessible Technology (SACAT): key learnings (PDF, 121KB)
- 2015 Submitted Subjects: key learnings (PDF, 117KB)