Teacher and Principal Nominee experience evaluation report 2018

2018 Digital Trials (Science, English, Classical Studies, Media Studies)

This report covers teacher and Principal Nominee self-reported experience of managing 2018 Digital Trial examinations in their schools.

The Digital Trial Examinations for Levels 1, 2, and 3 English, Classical Studies, and Media Studies were made available to schools from 11–28 September and from 15–26 October 2018. These nine examinations required essay-type answers.

The Level 1 Digital Only Science Trial was also available from 18 June to 6 July for its mid-year delivery, and from 11–28 September and 15–26 October for its end-of-year delivery alongside the other Trials. This Trial was designed to test new digital features and new ways of asking questions and providing responses.

The setup, invigilation and marking of the Digital Trials was managed and run by schools. In total, 129 schools participated in Digital Trials in 2018. The Digital Trials could be used as a practice examination completed in one sitting, as a classroom activity completed over multiple sessions, or as a revision exercise.

Documentation on managing and administering the Trials, including guidance on logging in and on using the marking tool, was provided to schools and teachers. NZQA offered further advice through email and telephone support. Familiarisation exercises were made available so that key elements of the digital assessment could be experienced before undertaking the Trials.

1.1 Information on the surveys and respondents

Teacher survey

Note # 1

Some teachers and Principal Nominees may have administered more than one Trial examination. Their response to the single survey on their experience may therefore include some overlap in terms of examination sessions. We have treated each survey response as representative of the teacher’s/Principal Nominee’s overall examination experience, whether that was of a single examination session or multiple sessions.

The teacher surveys (see Note # 1) were designed to measure teachers’ self-reported experience of managing Digital Trial examinations in their schools. We do not have a conclusive response rate for this survey.

20 teachers who managed the mid-year Science Trial examinations completed the survey. The survey consisted of 18 questions requiring 27 responses, of which seven were open-ended questions. See Appendix One for a list of the teacher survey questions. See Appendix Two for summary tables of responses to the closed-ended questions.

70 teachers who managed at least one of the ten end-of-year Trial examinations completed the survey. The survey consisted of 16 questions requiring 17 responses, of which two were open-ended questions. See Appendix Three for a list of the teacher survey questions. See Appendix Four for summary tables of responses to the closed-ended questions.

The surveys sent to teachers during the mid-year and end-of-year deliveries of the Science Trial contain different questions (e.g. questions relating to the timing of the mid-year Trial which do not apply to the end-of-year delivery). Where a question was asked in both deliveries, the responses are reported together unless there is a significant difference.

Principal Nominee survey

The Principal Nominee survey was designed to measure Principal Nominee self-reported experience of managing Trial examinations in their schools.

Of the 47 schools that participated in the mid-year Science Trial examinations, 17 Principal Nominees completed the survey – a 36% response rate. The survey consisted of 7 questions requiring 10 responses, of which one was an open-ended question. See Appendix Five for a list of the Principal Nominee survey questions. See Appendix Six for summary tables of responses to the closed-ended questions.

Of the 100 schools that participated in one of the ten end-of-year Trial examinations, 18 Principal Nominees completed the survey – an 18% response rate. The survey consisted of 4 questions requiring 6 responses, with no open-ended questions. See Appendix Seven for a list of the Principal Nominee survey questions. See Appendix Eight for summary tables of responses to the closed-ended questions.

The surveys sent to Principal Nominees during the mid-year Science Trial and end-of-year Trials contain different questions (e.g. questions relating to the timing of the mid-year Science Trial which do not apply to the end-of-year Trials). Responses were reported together where the question was asked in surveys for both mid-year and end-of-year Trial examinations.

1.2 FINDINGS: TEACHERS

Administration and support

Most of the teacher respondents overall (92%, 83 of 90) agreed or strongly agreed that they were provided with enough information to administer the Trial examinations. Of the teachers who responded to the survey following the mid-year delivery of the Digital Only Level 1 Science Trial, a similar percentage ‒ 90% (18 of 20) ‒ thought that the familiarisation activities prepared their students for the examination.

Of the teacher respondents who set up a digital examination room during the end-of-year Trials (54 of the 70 respondents), 78% (42 of 54) agreed or strongly agreed that it was easy to set up a digital examination room. This question was not asked in the survey for the mid-year delivery of the Digital Only Level 1 Science Trial.

40% (8 of 20) of the teacher survey respondents during the mid-year Science Trial agreed that administering the Trial examination was easy, with no one strongly agreeing. The percentage was greater during the end-of-year Trials. Of the teacher respondents who administered the end-of-year Trial examinations (56 of 69), 79% (44 of 56) agreed or strongly agreed that administering the Trial examination was easy.

75% (15 of 20) of teacher respondents during the mid-year Science Trial reported that technical problems or other issues arose during the administration of the assessments. The percentage was lower for the end-of-year Trials, where 54% of teacher respondents (36 of 67) reported such issues.

All of the teacher respondents who contacted NZQA for assistance during the mid-year Science Trial (10 of 20) reported receiving useful help and support from NZQA. Of the teacher respondents who contacted NZQA for assistance during the end-of-year Trials (33 of 69), 97% (32 of 33) agreed or strongly agreed that they received useful help and support from NZQA.

Timing of the mid-year assessment

With respect to the timing of the mid-year Science Trial examination, 55% (11 of 20) of respondents to that survey disagreed or strongly disagreed that the timing was beneficial for students, and 60% (12 of 20) disagreed or strongly disagreed that it suited their school’s assessment schedule.

The digital trial examination

Most of the teacher respondents (80%, 71 of 89) agreed or strongly agreed the familiarisation activities prepared their students for the Trial examination, and 82% (73 of 89) set aside up to two hours for familiarising their students with the materials NZQA provided to help students prepare for the Digital Trial examination. 

For the mid-year delivery of the Science Trial, most of the teacher survey respondents (95%, 19 of 20) disagreed or strongly disagreed that their students were more engaged during the Science Trial examination compared to paper-based examinations.

The students who sat the mid-year delivery of the Science Trial were more positive in their response on this topic than their teacher respondents ‒ about half of these student survey respondents (48%, 239 of 501) agreed or strongly agreed with the statement ‘I liked doing this examination better than an examination on paper.’ Māori and Pasifika student survey respondents answered similarly, with 55% (41 of 74) and 44% (15 of 34) respectively agreeing or strongly agreeing with the statement.

Teacher survey respondents during the end-of-year Trials had a more mixed view than the teacher respondents whose students had sat the mid-year delivery of the Science Trial, with 50% (34 of 68) disagreeing or strongly disagreeing that students were more engaged throughout the Digital Trial examination compared to paper-based examinations.

Marking experience

For the mid-year Science Trial, teacher survey respondents were asked about their marking experience through open-ended questions. The end-of-year Trials survey contained no open-ended questions about the digital marking experience, but respondents took the opportunity to mention it in the question inviting further comments or suggestions about the Trials. For the most part, the marking experience was not received favourably. Commonly mentioned issues were:

Note # 2

As per NZQA policy, the marking solution currently available is calibrated for Pilot examinations, which offer no facility for giving formative feedback or recording achievement evidence.

  • Inability to provide formative feedback or comments to students (see Note # 2)
  • Not being able to record achievement evidence for each part of a question
  • A lack of transparency in the decision to award marks
  • Issues with navigating around the exam (multiple scrolling bars in the same window; difficulty seeing the entirety of a student’s answer to long-response questions)
  • Not being able to mark by question across papers (which would have reduced inconsistency)
  • Wanting the marking workload to be smaller
  • The examination marking scheme

Almost all teachers reported more than one problem experienced while marking. Positive comments about the marking process were mainly that it was faster once they got used to it, compact, did not involve handling paper scripts, and was straightforward to use.

For the end-of-year Trials, most of the teacher survey respondents (82%, 55 of 67) agreed or strongly agreed that the marking instructions were easy to follow, and 69% (46 of 67) agreed or strongly agreed that the marking process produced fair outcomes for students. However, they were divided about whether marking online took less time than marking on paper, with 46% (31 of 67) agreeing or strongly agreeing with the statement.

Perception of digital assessment

Teacher respondents to the survey on the mid-year Digital Only Science Trial generally had a negative perception of the digital assessment, with only 21% (4 of 19) agreeing or strongly agreeing that they would encourage their students to sit the Digital Trial and/or Pilot examinations in 2019, and 10% (2 of 20) agreeing that their experience of the mid-year Science Trial had encouraged them to use more digital tools in their teaching.

Teacher survey respondents for the end-of-year Trials generally had a more positive perception of the digital assessment, with 64% of respondents (44 of 69) agreeing or strongly agreeing that they would encourage their students to sit the Digital examinations in 2019, and 38% (26 of 69) agreeing that their experience of the Digital Trial examinations had encouraged them to use more digital tools in their teaching. Also, 72% (50 of 69) agreed or strongly agreed that sitting the Digital Trials was good preparation for NCEA examinations. This question was not asked in the mid-year Science Trial survey.

However, a significant proportion of teacher survey respondents for the end-of-year Trials (32%, 22 of 69) agreed or strongly agreed that the students’ results were negatively affected by their computer literacy. This question was not asked in the mid-year Science Trial survey.

Teachers who administered the mid-year Science Trial were given the opportunity to provide feedback on the digital features of the examination. Science teacher survey respondents agreed or strongly agreed that Punnett tables (75%, 6 of 8) and video and interactive animation resources (65%, 13 of 20) were helpful in answering the questions. Compared with student respondents, Science teacher survey respondents had a more negative perception of the helpfulness of the formula editor and graphing tool, and of the usefulness of the examination’s digital features overall.

25% (4 of 16) of teacher survey respondents agreed that the mid-year Science Trial examination offered useful opportunities to Māori and Pasifika students in their class.

Suggestions for improvement/feedback

Teacher respondents were asked to provide any further comments or suggestions. The most common suggestion was to provide the ability to give formative feedback to students (see Note # 2), which schools require at that stage of the year, and to make it easier for students to access their work after marking so they could use it for revision. The next most common comment was that the marking interface made marking difficult and should be improved.

“…Default font and line spacing is very small and difficult to read, and a number of our staff experienced eye strain and headaches while marking. Please increase the default font to 12 pt and the line spacing to 1.5 in the interests of marker health and safety.”

Another recurring suggestion was to make the process of setting up the examination more straightforward.

“The issue with the one class not being properly set up also was a significant issue for us.”

“The initial spreadsheet required by NZQA is incredibly time consuming and required a lot of checks.”

The teacher respondents also provided some feedback on behalf of their students about their Trials experience.

“Most found it MORE difficult to write equations (physics and chemistry) than if on paper.”

“The students found it hard and annoying to constantly have to scroll up and down to find relevant information to answer the questions.”

“Some students also complained of headaches.”

For the mid-year Science Trial survey, responses to the question ‘Imagine you had never seen this or any other Level 1 Science examination. How do you think these standards could be assessed differently, either using digital technologies or not?’ indicated either that digital examinations would need improvement before they could work well, or that the respondent did not think digital technologies could be useful. One teacher responded that “Not all questions lend themselves to this type of digital assessing.” A repeated suggestion, here and in other questions, was to “Get a digital pen for the kids.” One teacher indicated that they thought multi-choice was an option, and another suggested a digital test with more intuitive instructions.

Other responses to the question above suggested more practical assessment, or portfolios of evidence as assessment.

“A portfolio of evidence submitted by student looking at real life contexts that interest them showing their understanding of the concepts assessed within the standard.”

1.3 FINDINGS: PRINCIPAL NOMINEES

Contact with NZQA

Principal Nominee survey respondents were generally positive about their school’s participation in Digital Trials.

For the mid-year Digital Only Science Trial, most of the Principal Nominees surveyed agreed or strongly agreed that the communications and instructions received about the mid-year Science Trial examination provided sufficient information (82%, 14 of 17) and were well timed (87%, 13 of 15). Fewer felt that they were easy to understand or that they influenced their school’s decision to participate, with 56% (9 of 16) and 44% (7 of 16) respectively agreeing or strongly agreeing.

For the end-of-year Trials, most of the Principal Nominee survey respondents agreed or strongly agreed that the communications and instructions received about the Digital Trial examination provided sufficient information (89%, 16 of 18), were easy to understand (82%, 14 of 17), and were well timed (82%, 14 of 17).

Across both surveys, 86% (30 of 35) of Principal Nominee respondents agreed or strongly agreed that they were able to confidently support the teachers and students in their school with the information provided by NZQA. Also, 97% (30 of 31) of the Principal Nominee survey respondents who contacted NZQA for support in managing the Digital Trials agreed or strongly agreed that they received useful help and support.

The digital trial examination

In the mid-year Science Trial survey, Principal Nominees were asked to identify the factors contributing to their school’s decision to participate. One Principal Nominee stated that their school had been approached by NZQA to provide Māori students at a Pūhoro STEM Academy Wānanga. The responses below are representative of the main themes.

“Obliging and keen Science teacher. Wanting to be part of the process to see where we as a school could improve. Wanted to gauge the students' reactions, to find out whether it benefits them and whether they prefer it to writing.”

“School aim is to be future ready. Hoped that the trial would provide marking assistance and help teach workload. It did not. Online offered no benefit to students or staff.”

“This system of assessment is very likely to be the direction of future assessment, so it's much better to take the opportunity to engage with the trial at this development stage rather than when it's too late.”

31% (5 of 16) agreed or strongly agreed that NZQA’s approach to their school influenced the participation of Māori and Pasifika students in the mid-year Science Trial examination.

In regard to the end-of-year Trials, 83% (15 of 18) of survey respondents agreed or strongly agreed that they would support their school participating in the Digital examinations in 2019, while in regard to the mid-year Science Trial, 76% (13 of 17) of survey respondents agreed or strongly agreed that they would support their school participating in future Digital Trial and/or Pilot examinations in 2019.

 

For more information, download the full report (PDF, 291KB)

 