Student Pilot Survey Report

Evaluation Report 2018 (English, Classical Studies, Media Studies)

INTRODUCTION

Nine Digital Pilot examinations were offered in 2018 – Levels 1, 2, and 3 for the subjects English, Media Studies, and Classical Studies. All nine of the Digital Pilot examinations required essay-type answers. 6,697 students from 53 schools participated in at least one of the Digital Pilot examinations, across 52 examination centres.
5,402 of the students who participated in at least one of the Digital Pilot examinations also participated in a Digital Trial examination held in September and October. 45 of the schools that participated in the Digital Pilots also participated in the Digital Trials.
2,214 of the 6,697 students who participated in at least one of the Digital Pilot examinations in 2018 had participated in at least one of the Digital Pilot examinations in 2017.

1.1 Information on the survey

The student survey was designed to measure students' self-reported satisfaction and experience of the Digital Pilot examinations administered by NZQA during November 2018. The survey included establishing whether the students regularly used electronic devices at home and at school.

Note # 1

Some students took part in more than one Pilot examination and thus may have responded to more than one survey, so some overlap in responses is possible. We have treated each survey response as representative of its examination experience.

NZQA received 1,468 survey responses from 1,422 students. Some students sat more than one digital examination and therefore may have answered the survey more than once (see Note # 1). Students logged into 7,148 exam sessions, giving a survey response rate of 21% (1,468 of 7,148).
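The response rates quoted here and in section 1.2 are simple ratios of survey responses to exam sessions, rounded to whole percentages. As an illustrative check (a minimal sketch; the figures come from the report, the function name is ours):

```python
def response_rate(responses: int, sessions: int) -> int:
    """Return the survey response rate as a whole-number percentage."""
    return round(100 * responses / sessions)

# 2018: 1,468 survey responses across 7,148 exam sessions
print(response_rate(1468, 7148))  # 21

# 2017 (see section 1.2): 1,068 responses across 4,498 exam sessions
print(response_rate(1068, 4498))  # 24
```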

The table below shows the number of responses received, by Pilot examination.

Table 1. Number of responses received, by Pilot examination

Digital Pilot Exam Session    Count    Percentage
L1 Classical Studies 0 0%
L1 English 619 42%
L1 Media Studies 124 8%
L2 Classical Studies 24 2%
L2 English 436 30%
L2 Media Studies 81 6%
L3 Classical Studies 17 1%
L3 English 125 9%
L3 Media Studies 42 3%
Total 1,468  

The survey results for each subject are presented together. Where the responses differed by subject and level, these differences are discussed below. They largely reflect the technical issues students experienced with the Level 1 English digital examination, and, for a small number of students, with the Level 2 English digital examination; these issues produced differences in the responses to a few of the questions.

The survey was made available to students within the SoNET system, directly after they submitted their Digital Pilot examination. The survey consisted of 12 questions, three of which were open-ended – see Appendix 1 for a list of the survey questions. The survey was designed to take approximately five minutes to complete. Not all students answered all the questions in the survey. See Appendix 2 for summary tables of responses to the closed questions.

As the survey respondents were self-selected, care must be taken when applying the findings to all the participants in the Pilots.

Text mining tools were used to identify the most common types of responses to the open-ended questions; however, these tools are much more reliable with larger datasets. The outputs of this process for the student survey have been included in the appendix where they gave reasonably reliable answers, and omitted where they were not useful and there were insufficient commonalities for the tools to identify trends.
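The report does not name the text mining tools used. As a hedged illustration only, the simplest form of this analysis is a frequency count of terms across the open-ended responses; the sample responses and stopword list below are invented for illustration:

```python
from collections import Counter

# Invented sample responses, for illustration only
responses = [
    "typing is faster than writing",
    "my hand does not get sore from writing",
    "typing is easier and editing is tidier",
]

# A tiny illustrative stopword list; real tools use much larger ones
STOPWORDS = {"is", "and", "the", "my", "not", "from", "does", "get", "than"}

def common_terms(texts, n=3):
    """Count non-stopword terms across responses, returning the n most common."""
    counts = Counter(
        word
        for text in texts
        for word in text.lower().split()
        if word not in STOPWORDS
    )
    return counts.most_common(n)

print(common_terms(responses))
```

With a dataset of only a handful of responses, such counts are dominated by chance wording, which is why the report notes the outputs were only included where they were reasonably reliable.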

1.2 Comparison with previous years

There were six Digital Pilot examinations offered in 2017 (Levels 1 and 2 for the subjects English, Media Studies, and Classical Studies), and three offered in 2016 (Level 1 for the subjects English, Media Studies, and Classical Studies).

4,226 students participated in at least one of the Digital Pilot examinations in 2017, from 54 schools, across 4,498 exam sessions. NZQA received 1,068 survey responses from 1,047 students (response rate of 24%).

2. Survey findings

2.1 Overall satisfaction

As stated in section 1.1, this survey had a response rate of 21%. Respondents were positive about completing a Digital Pilot examination, with 97% (1,316 of 1,361) agreeing or strongly agreeing it was a positive experience. This was similar to 2017 when 98% (967 of 990) had the same sentiment.

Responses were analysed separately by Digital Pilot subject and level to determine whether satisfaction differed depending on the type of subject being examined digitally. For every exam session except Level 1 English, a higher proportion of respondents strongly agreed with the statement than agreed; for Level 1 English, more respondents agreed than strongly agreed.

Table 2. Responses to the question: "Overall, I found completing this exam digitally was a positive experience", by exam session.

                            English                        Media Studies                  Classical Studies
                            Level 1    Level 2    Level 3  Level 1   Level 2   Level 3    Level 2   Level 3   Total
Strongly Agree              215 (38%)  237 (59%)  79 (68%) 59 (49%)  49 (62%)  23 (55%)   11 (58%)  8 (50%)   681 (50%)
Agree                       320 (56%)  158 (39%)  38 (32%) 59 (49%)  29 (37%)  16 (38%)   7 (37%)   8 (50%)   635 (47%)
Disagree/Strongly Disagree  32 (6%)    6 (1%)     0 (0%)   2 (2%)    1 (1%)    3 (7%)     1 (5%)    0 (0%)    45 (3%)
Total                       567        401        117      120       79        42         19        16        1,361

Respondents also indicated a strong preference for completing the examination digitally rather than on paper with 95% (1,296 of 1,362) agreeing or strongly agreeing that digital was preferable to paper-based. This was similar to 2017 when 95% (941 of 990) had the same sentiment.

In response to the question ‘What did you like most about the digital exam?', survey respondents mentioned that typing is easier and faster than handwriting, and that their hands do not get sore from typing. They also said that editing their essays is easier and tidier when sitting the exam digitally than on paper, and that the spell check was useful.

"I [could] edit what I [wrote] easily and fix up the writing more [comfortably]. My hands [didn't] hurt from writing."

"How I find it easy and faster to type [than] it is to write[,] and that it is easy to go back and add something if I needed."

"I liked how we got to use spell check and I feel that I type faster than I write and that it is easier for me to do so."

"I could write my essay faster, could rearrange lines and paragraphs, and easily add/remove parts to my answers without making it messy as it would've been if it were non-digital. The best part is that I am able to go back to it and change my answers easily."

"It heavily removed the muscle fatigue I [experience] in my hand while writing and allowed me to feel more confident in what I wrote as I [knew] I would be able to alter or remove what I [wrote] easily without wasting space. It felt easier as I spent less time writing things out which provided me with significantly more time to construct ideas."

"The majority of the exam was [laid] out very good with there being no complications. It went very well."

In response to the question ‘What did you dislike about the exam?', survey respondents mentioned network issues or worries about the reliability of the technology. Some mentioned being distracted by peers having issues with the exam even when they did not have the issue themselves. Several aspects of the interface were mentioned, such as the spell check not working well and the red box word count warning. The time taken to set up the exam and the distraction of typing noises were also mentioned. ‘None' or ‘did not dislike anything' were also common responses to this question.

"The server went and stopped working, when I was let back on some of my work had been lost."

"The spell check was nice to have but was not very good. It claimed [a lot] of basic words such as ‘change' were not spelt correctly, even though they clearly were, but other than that it was alright. Did not like how it [needed] to manually be turned on, as I almost missed it."

"The lengthy loading times between questions that ruined my work flow."

"The 550 word count is really annoying as it is hard to write a high level essay in that number, the red box makes you feel not very good."

"It kept coming up with a warning error saying that I kept on clicking out of the exam when I didn't and it did this at least over 20 times during my exam which distracted and worried me."

"Found it easier to write my plan and refer to it in front of me instead of scrolling back up to have a look every time I needed an idea."

"The constant tapping of other key boards. Sometimes I lost my train of thought."

2.2 The digital examination experience

Preparation

NZQA made familiarisation activities available to students who were participating in the Digital Trial and Pilot examinations. The purpose of the familiarisation activities was to provide students with the opportunity to experience the look and feel of a digital examination, including the login and submission process that students would experience, and the different tools that are part of the digital examinations (copy and paste, drag and drop, spell check, examination lockout, and examination preview).

81% of respondents (1,135 of 1,408) agreed or strongly agreed that they found the familiarisation activities useful in their preparation for their digital examination. This is higher than the percentage for 2017, when 74% of respondents (752 of 1,011) had the same sentiment.

14% of respondents (197 of 1,407) said that they did not know the familiarisation activities existed. This is less than the percentage for 2017, when 21% of respondents (213 of 1,011) said they did not know the familiarisation activities existed.

On the day

53% of respondents (359 of 681) reported using their own device to complete their Digital Pilot examination, and 47% of respondents (322 of 681) reported using a school-provided device. These percentages were similar to 2017 when 46% (236 of 516) and 54% (280 of 516) of the respondents used their own device and a school-provided device respectively.

78% of respondents (1,049 of 1,340) reported they used a laptop to complete their Digital Pilot examination, and 21% (287 of 1,340) reported they used a desktop computer. Only four respondents (< 1%) reported they used a tablet. These percentages were similar to 2017 when 72% (677 of 940), 27% (258 of 940), and <1% (5 of 940) reported using a laptop, desktop computer, and tablet respectively.

66% of the respondents (944 of 1,425) reported experiencing no network or device problems when accessing or completing the Digital Pilot examination. This is lower than in 2017 when 83% (853 of 1,022) reported having no network or device problems when accessing or completing the Digital Pilot examination.

Note # 2

Students sitting the L1 English Digital Pilot examination experienced a short-term loss of connectivity at 11.39am. NZQA investigated the cause and made system changes for the next and subsequent digital examination sessions so this issue did not recur.

Of the respondents in 2018 who experienced issues during the examination, 29% (414 of 1,425) reported experiencing network problems and 5% (67 of 1,425) reported experiencing device problems. Of the respondents who experienced network problems, 91% (376 of 414) had sat the Level 1 English exam (See Note # 2).

98% of respondents (1,339 of 1,366) found it very easy or easy to navigate through the Digital Pilot examination, and 98% of respondents (1,336 of 1,361) found entering their responses to the Digital Pilot very easy or easy. These were similar to 2017, when 98% (968 of 992) and 97% (963 of 991) had the same sentiments respectively.

92% of respondents (1,252 of 1,365) agreed or strongly agreed that completing the Digital Pilot examination took less time than they would have expected had it been paper-based, which was similar to 2017 (92%, 914 of 989).

2.3 Digital technology at home and at school

Most of the survey respondents reported having access to laptops and smart phones at home (96%, 1,397 of 1,461 and 90%, 1,316 of 1,461 respectively). About half of the survey respondents reported having access to desktop computers and tablets at home (51%, 852 of 1,461 and 47%, 690 of 1,461 respectively).

30% of the survey respondents (434 of 1,461) reported having access to all the four devices mentioned at home, and 33% (483 of 1,461) reported having access to three of the four devices at home.

To support their learning, student survey respondents reported that digital technology was very often or quite often used in class (90%, 1,307 of 1,457) and for homework (90%, 1,303 of 1,449). Digital technology was used relatively less often for internal assessments: 77% (1,122 of 1,450) reported using it very often or quite often. Very few respondents reported never using digital technology to support their learning: only 2% (32 of 1,450) and 1% (9 of 1,449) reported never using it for internal assessments and homework respectively.

The 2018 findings show that proportionately more respondents used digital technology to support their learning in class, for homework, and for internal assessment than Pilot survey respondents indicated in 2017 or 2016.

2.4 Suggestions for improvement/feedback

At the end of the survey, respondents were asked whether there were any features or functions that they thought future digital exams should include, or whether they had any other feedback.

Common themes from respondents included improving the spell check and word count functionality, adding the ability to highlight text, and fixing scrolling issues. Respondents also mentioned features of other commonly used software, such as auto-capitalisation in Microsoft Word or Google Docs.

"Automatic capital letters at the start of new sentences like word."

"Having a highlight option with the unfamiliar text section so students like me can follow through with what they have highlighted and so they can find what they were doing a lot quicker."

"Make sure it can handle [everyone] in NZ."

"You should make the spell checker automatic. That way we can check as we type."

"Is it possible to make the screen bigger when writing our response. It only fits half the screen, making it harder to see what I'm writing."

Some respondents were unable to locate the spell check function or were not aware that it existed:

"Spellcheck would be nice for typos, I don't know [if] spelling [is] still marked at this level."

"If there was a spellchecker I was unable to locate it, [a] quick and easy spell checker would be good since human error [accounts] for small problems that could make a [marker's] job harder. We aren't marked on spelling, so why not?"

"Not realising that there was a spell check option until after the exam had finished."

The survey responses indicate that students' experiences of the Pilot were very positive. Nonetheless there was valuable feedback from the students that will inform future developments. Electronic devices are widely used by students, and some responses to open-ended questions indicate that digital functionality they are very familiar with elsewhere (such as Word's automatic capitalisation and spell check) is not understood by students to be available, or is actually unavailable, or requires improvement. The survey responses also indicated that some students were not sufficiently familiar with the spell check or word count features of the examination, even though the same instructions were provided as in the corresponding paper examinations and these features were included in the familiarisation exercises made available before the examination.

 

For more information, download the full report (PDF, 177KB).

 