Evaluation Report 2018 Level 1 Digital Only Science Trial

INTRODUCTION

The 2018 Level 1 Digital Only Science Trial was available from 18 June to 6 July for its initial mid-year delivery and then from 11 – 28 September and 15 – 26 October as part of the end-of-year digital Trials. The defining features were:

  1. It was a “digital only” examination in that it could not have been replicated in paper form because of the extent of the digital features; and
  2. There was a mid-year delivery in the last three weeks of Term Two so that we could better understand whether there was any value for schools and students in having an assessment opportunity closer to the time that students may have undertaken their learning.

This was an innovation trial designed to test new digital features and new ways of asking questions and providing responses.

The insights from the 2018 Science Trial are informing the operating model and digital assessment service design for 2020 onwards. With respect to the digital features that extend beyond static resources and text manipulation, we believe it is important to keep working with schools on co-designing engaging and relevant approaches to external assessment. Examinations need to be equitably accessible and usable, to reflect the teaching and learning, and to take advantage of the digital technologies available.

1.1 Evaluation approach for the Level 1 Science Trial

This Trial was set up with the same objectives framework as the 2017 Trials and Pilots. The objectives were to:

  • maintain momentum and further develop engagement in digital assessment
  • deliver strategic value
  • demonstrate moves toward innovation in external digital assessment.

The primary evaluation method was a user survey.

1.2 PARTICIPATION BY DECILE, ETHNICITY, AND GENDER

There was a total of 5,176 recorded student logins (from 5,051 distinct students across 82 schools) (see Note # 1) for the mid-year and end-of-year deliveries of the Digital Only Science Trial. Table 1 shows the participation by Science Trial delivery and Table 2 shows the student logins by school decile (see Note # 2).

Note # 1

Students were able to participate in one or both of the mid-year and end-of-year deliveries of the Science Trial, provided they did not attempt the same standard in both. There were 7,866 registrations overall from 96 schools for both deliveries of the Science Trial.

Note # 2

Note that school roll size varies across decile bands. There are relatively fewer students enrolled at lower decile schools.

Table 1. Science Trial Participation by Delivery

Delivery  Student Logins  Number of Schools
Mid-Year 3,491 47
End-of-Year 1,685 40
Total 5,176 82 (see Note # 3)

Note # 3

Five schools participated in both the mid-year and end-of-year deliveries of the Science Trial.

Table 2. Science Trial Participation by School Decile

School Decile  Student Logins
1 176
2 6
3 301
4 805
5 493
6 925
7 652
8 577
9 377
10 819
99 45
Total 5,176

Table 3 shows student logins by ethnicity. Māori and Pasifika students comprised 23% (1,214 of 5,176) of all the Science Trial participants.

Table 3. Science Trial Participation by Total Response Student Ethnicity (See Note # 4)

Ethnicity  Student Logins  Percentage
Māori 867 17%
Pacific Peoples 425 7%
Asian 877 17%
European 3,598 57%
MELAA (See Note # 5) 106 2%
Other 36 <1%
Unknown 21 <1%
Total 5,176  

Note # 4

Ethnicity is reported using total response methodology, where each student is counted in up to three ethnicities that they identify with. For example, a student who identifies as both Māori and European is counted once in each of those rows. Because of this, the sum of the individual ethnicity counts may be larger than the total number of students.

Note # 5

MELAA stands for Middle Eastern/Latin American/African.

Table 4. Science Trial Participation by Student Gender

Gender  Student Logins  Percentage
Female 2,554 49%
Male 2,608 50%
Unknown 14 <1%
Total 5,176  

2. FINDINGS

2.1.1 OVERALL SATISFACTION

As stated in section 1.4, this survey had a response rate of 20%. Students who responded to the survey expressed mixed satisfaction with the digital examination compared with doing the examination on paper. Regardless of whether they sat the mid-year or end-of-year delivery of the Science Trial, about half of the student survey respondents overall (49%, 437 of 884) agreed or strongly agreed with the statement ‘I liked doing this examination better than an examination on paper.’ Māori and Pasifika student survey respondents answered broadly similarly, with 55% (79 of 144) and 44% (29 of 66) respectively agreeing or strongly agreeing with the statement.

In response to the question ‘What did you like most about completing the exam digitally?’, student survey respondents said that it was easier and faster to type than to handwrite, that answers could be easily reviewed and corrected, and that their hands did not get sore from too much writing. The second most common response to this question was that the videos were helpful in answering questions.

The most common responses to the question ‘What did you dislike the most about completing the exam digitally?’ were that the graphing tool and formula editor were difficult to use compared with writing on paper, and that there were issues with videos and animations when the Wi-Fi or computers were slow. “Nothing” and “everything” also appeared among the responses.

2.1.2 THE DIGITAL EXAMINATION EXPERIENCE

Most of the survey respondents (76%, 755 of 991) agreed or strongly agreed that the familiarisation activities told them what to expect in their digital examination. A greater percentage agreed or strongly agreed at end-of-year (81%, 342 of 423) than at mid-year (73%, 413 of 568). However, 10% (104 of 991) responded that they did not know that the familiarisation activities existed. That said, fewer end-of-year respondents than mid-year respondents said they did not know that the familiarisation activities existed: 8% (34 of 423) compared to 12% (70 of 568).

Table 5 shows that the digital features of the examination were received positively by the survey respondents (see Note # 6). Most of the survey respondents agreed or strongly agreed that punnet tables (82%, 626 of 759), video and interactive animation resources (80%, 720 of 897), the formula editor (73%, 598 of 821), and the graphing tool (61%, 504 of 828) were helpful in answering the questions. The formula editor and graphing tool also drew negative feedback in the open-ended questions. On average, the features received a neutral usefulness rating from student survey respondents.

Note # 6

It was difficult to discern how student survey respondents distinguished between the “helpfulness” and “usefulness” questions regarding interactive resources and tools, as these sets of questions asked for very similar information. Although the design of these questions may have introduced some inconsistency into the data, the responses to both questions are reported separately in Table 5.

Table 5. Summary of student perception of digital features of the examination (excluding Not Applicable responses).

Examination digital feature                 Helpfulness (See Note # 7)                                Usefulness rating (See Note # 8)
                                            Agree and strongly agree   Disagree and strongly disagree  4 & 5 (useful)  3 (neutral)  1 & 2 (not useful)
Punnet tables                               82% (626)                  18% (133)                       40% (283)       31% (219)    29% (207)
Video and interactive animation resources   80% (720)                  20% (177)                       40% (343)       26% (221)    34% (297)
Formula editor                              73% (598)                  27% (223)                       37% (294)       28% (225)    34% (271)
Graphing tool                               61% (504)                  39% (324)                       32% (255)       30% (237)    37% (295)
Total                                       2,448                      857                             1,175           902          1,070

Note # 7

Question 5: I found the following features of the exam helped me to answer the questions.

Note # 8

Question 6: Rate the following features in the exam from not useful (1) to extremely useful (5) or Not applicable.

The response profile for Māori and Pasifika student survey respondents to questions about their perception of digital features was compared to the overall response and was not significantly different.

2.1.3 DIGITAL TECHNOLOGY AT HOME AND AT SCHOOL

Most of the respondents completed the digital examination on a laptop (65%, 640 of 982) or a desktop (35%, 340 of 982). A very small percentage (<1%, 2 of 982) reported using a tablet. Most of the devices were school provided (60%, 534 of 892) while the rest (40%, 358 of 892) were the students’ own.

Respondents indicated that digital technology was used very often or quite often for homework (75%, 736 of 986) and in class (55%, 550 of 998), but not as much for internal assessments, with 42% (412 of 977) indicating that digital technology was used very often or quite often for internal assessments. Students who sat this Trial assessment at its end-of-year delivery were more likely than mid-year delivery students to respond ‘very often’ or ‘quite often’ to the last (internal assessment) part of this question (55%, 231 of 417 compared to 32%, 181 of 560).

See Appendix One: SUMMARISED RESPONSES BY QUESTION for a more detailed breakdown of responses.

2.1.4 SUGGESTIONS FOR IMPROVEMENT/FEEDBACK

In response to the questions ‘Do you have any feedback about this digital exam?’ and ‘Do you have any ideas for how these standards could be assessed differently? How?’, student survey responses ranged from liking or enjoying the digital examination and seeing no need for improvement, to preferring to do the exam on paper.

Suggestions for improvement included making the graphing tool and formula editor easier to use, as well as general improvements to the interface. Some suggested having the flexibility to allow students to answer word questions digitally but to draw graphs and write calculations requiring equations and formulae by hand, and that only those examinations that require minimal equations and formulae should be digital. Other suggestions included using a tablet with a stylus, adding a spell checker, an on-screen calculator, and keyboard shortcuts, and improving the administration of the Trial.


2.1.5 QUOTED RESPONSES BY MAIN THEMES

Student responses to the questions above have been grouped according to their main themes. These responses were selected as a broadly representative sample.

Generally positive comments:

“It was good and really helped and applied to me as my hand writing is terrible and I’m good with computers so it caters to me very well.”

“I very much like this better. No more sore hands!”

“It was a great experience to test the new functions of digital exams.”

“I don't really have any complaints about it.”

“It allowed me to focus more on my answers to the questions, as typing the answers is more efficient than writing them out by hand.”

“Easier to write, with less strain on wrists. Could easily undo errors instead of scribbling.”

Generally neutral comments:

“Needs some touch ups, but not horrible.”

“It was good but there is still lots of work to be done and at this point I would much rather complete a paper exam as it is easier and quicker to write out formulas etc.”

“I think the only exams assessed digitally should be english, history and geography or exams that revolve around essays. I found this science exam hard to do online and mathematics should definitely NOT be assessed online.”

“The digital exam itself was good but it was just problems with Wi-Fi, loading and getting everything to work.”

Generally negative comments:

“I prefer paper exams because of convenience when it comes to writing, it’s just what I’m used to.”

“I personally like writing on paper because it helps me get all my ideas down but I found that digitally, it's harder to get ideas down.”

“I found it so much harder to concentrate because it gave me a headache and my eyes got tired… I also can't type very fast at all so it would be hard to finish it in time.”

“The chance that there always is of your computer causing you problems...”

“I don’t think enough effort has really been put into it to so that it is user friendly.”

“I really disliked the idea of having to click many buttons to make the formula and that typos were frequently being made and made me spend more time on rechecking my explanations.”

Positive feedback on digital features of examination:

“[I liked] the short videos and images to really get a good idea of what the question was asking.”

“The interactive animation resources and videos were helpful because you could see exactly what was happening almost as if you were doing the experiment in real life.”

“The punnet square helped me answer the question because I had an example to use in the answer. I understood it better when I could visually see what the question was talking about during the first video.”

Neutral feedback on digital features of examination:

“Easy to input answers, formulas take a while but not long enough that it hindered in anyway.”

“I thought that the picture of the bun raising in the oven was cool but I'm not sure if it was helpful.”

“It was [boring] and tedious the only thing that is better that pen and paper is the videos but other than those the rest is still inferior to pen and paper in every way.”

“They didn’t, the only thing that helped was the videos and animation, the graphing tool did not work as well and took far too much time as well as the punnet squares and formula editor.”

Negative feedback on digital features of examination:

“They slowed me down and made it harder in general.”

“It took so much time to fill out the equations and having to click on everything [,] when on [paper] it would take less than 5 seconds to [answer] the question. And the graphs [were] so bad I tried my best but I still [didn’t] even know what my own graph [meant].”

“I cannot say that the graphing tool is very good. It needs more development and has to become easier to use. It frustrated not only me, but some other people, who said that after the exam they had to draw their lines manually due to the curve tool being hard to use.”

Suggestions for improvement:

“The formula writing ease of access (shortcuts) should be improved so that writing formulas takes less time.”

“I would like there to be an autocorrect feature for non-english exams as I found myself correcting my typing quite often which side-tracked me from the questions.”

“Maybe make it more straightforward online rather than teachers having to give all these instructions, hand out codes, write links, needs to be a little less complicated.”

See Appendix Two for a more detailed breakdown of the responses to the open-ended questions.

 

For more information, download the full report (PDF, 171KB).

 