
Creating a diagnostic tool


Download the Year 7 Diagnostic test (PDF file, 1.6MB)

Planning a Technology programme for a new intake of students can be problematic: teachers often don't know their students' prior experience and knowledge, and can't assume what level of Technology, if any, they have done previously. Without this information it is difficult to target the level students are working at and identify where they need extra support, and gathering it can be very time-consuming.

Dilworth School, an independent boys' school run by a trust board, takes its students from all around the North Island – such a huge 'zone' means that each new student usually comes from a different primary school. Teacher Sarah Blenkiron points out that she couldn't realistically gather data from all these schools about students' experience in Technology, so she could only identify their experience and knowledge later in the term and then modify her programme to better reflect the level at which they were capable of working.

Students in the junior school start at Year 5 or Year 7. About two-thirds of all Year 7 students are new to the school, so with both intakes Sarah has to determine their understanding of Technology and the level at which they are working. Sarah found anecdotally that a lot of Year 7 students coming into the school:

"Don't have the faintest idea of what Technology is."

Sarah Blenkiron

Creating a diagnostic tool

In 2009, when Sarah was part of the Resource Development and Facilitation: Years 7-10 Technology Project, she decided to create a diagnostic tool that would help her determine the level of understanding in her three Year 7 classes. This would be in the form of a questionnaire for all Year 7 students, regardless of whether they were new or not. Sarah started drafting the tool mid-year and wanted to include the new strands and their components. She worked through several drafts and trialled different versions before she had it ready for her 2010 Year 7 students.

The diagnostic tool was planned for a one-hour lesson and looked at all eight components of the three strands. Most questions were directly linked to the components, mostly at Curriculum Levels 1 and 2, although a few questions were based around Level 3 or Level 4 concepts (such as the manipulation of materials within the Technological Products component). A few other questions were linked to projects that the students would be working on later that term. The accepted Technology terminology was used even though some students wouldn't be aware of its meaning; Sarah notes that this could show what they'd been taught, if anything, or whether they could work out a viable answer.

The questionnaire put slightly more emphasis on the Technological Practice strand, which Sarah introduced by using an existing product with which the students were familiar (a pen tidy). The questions in this section looked at both the overall process of developing a product and some of the different aspects involved in it. Even the boys new to Technology could attempt some of this, although they might not have been able to answer the questions with specific Technology vocabulary such as 'stakeholder' and 'modelling'.

Sarah wanted the diagnostic tool to be more than a pen-and-paper test, so, in order to retain student interest, she incorporated a variety of activities: multiple choice; matching descriptions; rearranging pictures; one-word answers; full-sentence answers; and drawing diagrams.
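
For teachers drafting a similar tool, the mapping Sarah describes (each question tied to a strand, a component, and a target curriculum level) can be captured in a simple data model. The following Python sketch is purely illustrative: the two question texts come from this article, but their strand, component, and level tags are assumptions, not Sarah's actual mapping.

```python
from dataclasses import dataclass

@dataclass
class Question:
    number: int      # position in the questionnaire
    strand: str      # one of the three Technology strands
    component: str   # one of the eight components
    level: int       # target curriculum level (mostly 1 or 2)
    text: str

# Illustrative entries only; the strand/component/level tags are
# assumptions, not taken from Sarah's questionnaire.
questions = [
    Question(2, "Nature of Technology", "Characteristics of Technological Outcomes", 1,
             "Give examples of the street scene that are Technological Outcomes"),
    Question(24, "Technological Practice", "Outcome Development and Evaluation", 2,
             "Design an improved version of a pen tidy"),
]

# Group question numbers by component to check that all eight are covered.
coverage: dict[str, list[int]] = {}
for q in questions:
    coverage.setdefault(q.component, []).append(q.number)
print(coverage)
```

Tagging questions this way makes it easy to confirm, before trialling, that every component is covered and that the balance of levels matches the intended emphasis.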

Implementation

Sarah considered how she could most effectively present and run the diagnostic tool. She decided to give the pre-test (so called because Dilworth students are familiar with this term) in the first Technology lesson, but administered it in the students' own classroom so that items in the Technology workroom wouldn't influence their responses.

Technological outcomes from the street perspective.

She brought a range of resources into the room for the students to look at, touch, or take to their desks for a closer look. Some, for example, took the opportunity to disassemble a torch to examine it. Some questions related to photos, and Sarah says she spent quite a while finding a New Zealand 'street scene' image with lots of different outcomes for the students to identify as 'natural' or 'Technological Outcomes'. She also had larger colour copies of all the photos in the diagnostic tool for the students to look at.

The students were reassured that the pre-test wasn't an exam but a programme-planning exercise, with each question linked to a different strand of the curriculum. Sarah explained that she wouldn't tell them what the terminology meant, and that if they hadn't been taught something or didn't know an answer they could just say they didn't know or leave it blank. Questions were allowed, but anyone asking something like "What is a Technological Outcome?" was told that not knowing was fine and to put that down. As in every class, there were students who finished early and others who ran out of time. Sarah encouraged the latter to leave questions they didn't understand and move on to others, while those who had finished worked on a research task related (although they didn't yet know this) to their first project.

Sarah found the results useful for refining her 2010 programme, noting that they gave her a good understanding of the level at which the boys were working and where they needed more help to get to the next level. On some questions, some students weren't yet at Level 1, although this varied from question to question; for example, most got Question 2 ("Give examples of the street scene that are Technological Outcomes") correct, which, says Sarah, "showed me that even if they didn't understand the terminology they could figure it out and make a really good stab at it, so that was quite encouraging as well".

Test answers were, as expected, quite varied. The question 'What is Technology?' led to replies such as "It's a subject at school", "An electrical device that will help someone", and "It's something that makes things easier for the human race" – the latter answer showing that some students did have a developing idea of Technology.

Technology system of a bicycle.

To establish whether the students had any understanding of Technological Systems, Sarah specifically chose a context that wasn't electronic (bikes). Some questions were to find out "Do they know it or don't they?" and only required a tick. The transformation questions required more understanding and Sarah reports that a lot of students struggled with this part. She added a question on motion because it linked to a project they would undertake in Year 8. The Technological Products section included a question on ergonomics so that Sarah could find out, before the first Year 7 project, if students had any understanding of the concept.

Although some students couldn't answer all the Technological Practice questions because they didn't know the terminology, they all attempted Question 24 in which they had to design an improved version of a pen tidy. This provided Sarah with information about their drawing skills, and overall responses in this section ranged from Levels 1 to 3.

The final section 'Prior knowledge' gave Sarah a good insight into which materials the students hadn't experienced and what she could put into her programme. Many of them had used wood, metal and plastics, but few had worked with fabrics, mechanisms or food. Junior students don't do Food Technology due to the lack of facilities but will do so in Year 9, so Sarah included it to get an idea of their experience.

Modifications

The drafting and trialling of the diagnostic tool was an essential part of the process, and Sarah encourages teachers developing their own diagnostic tool to be aware that it will develop and change. After trialling, Sarah modified some questions because the students' answers showed that what she thought was obvious had been misinterpreted by some of them. This was particularly evident in Question 24: in her trial tool she referred to a 'desk tidy', but despite having seen the photo of a pen tidy on the previous page some students sketched images of a table top instead. Sarah realised that she needed to ensure her wording was consistent, so she changed this question to 'pen tidy' as well.

The systematic functions of a torch.

Sarah had a clear idea of the answer she wanted from the Technological Systems question, in which students had to put pictures in order to show how a torch works, but some decided the batteries were the first part of the system while others put the switch. As the question could be interpreted either way (the physical order of the parts or the order in which they affect each other) to create a logical answer, Sarah decided to mark them both correct.

Other changes related to wording: students had to identify the functional and physical features of an umbrella but were confused by 'white fabric', so Sarah split this into two separate words so that students could classify 'white' as a physical feature and 'fabric' as a physical/functional feature.

Plastic materials used to make a toothbrush.

In the 2009 Technological Products section, Sarah asked the students to identify the performance properties of a toothbrush. She soon realised her mistake: she meant the performance properties of the plastic as a material, not of the toothbrush itself. The performance properties of materials are different from, but closely linked to, the attributes of a product. To fix this she split the 2010 question into two parts: identify the performance properties of plastic, and then identify the appropriate properties of plastic within a toothbrush.

The next step

Sarah says she had underestimated how long it would take to mark such a huge document and that by mid-year she hadn't had time to fully complete her analysis. Some answers could be easily tallied with a tick, but others required more consideration of whether a student had fully or partly understood a particular concept.

Having realised that checking over 80 answer sheets is a hugely time-consuming task, Sarah is looking at how she might use the diagnostic tool in a different way to get the same results. She plans to split it into sections, each with a focus component. Sarah could then give a ten-minute pre-test on just one or two sections in the days or weeks leading up to when she needs that information. If, for example, her programme of learning has Technological Systems as a focus for Term 3, she could do a pre-test of questions 13-20 at the end of Term 2.
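
Splitting the tool this way also lends itself to quick tallying of the tick-style "know it or not" questions. Below is a minimal sketch, assuming correct ticks are recorded as True/False against each question number; the student names and responses are invented for illustration.

```python
from collections import Counter

# Hypothetical responses to a Technological Systems pre-test
# (questions 13-20 in Sarah's tool). True = a correct tick.
responses = {
    "student_a": {13: True, 14: False, 15: True},
    "student_b": {13: True, 14: True, 15: False},
    "student_c": {13: False, 14: False, 15: True},
}

# Tally how many students answered each question correctly.
tally = Counter()
for answers in responses.values():
    for number, correct in answers.items():
        tally[number] += int(correct)

for number in sorted(tally):
    print(f"Q{number}: {tally[number]}/{len(responses)} correct")
```

Open-ended answers, such as the pen tidy design in Question 24, would still need hand marking; a tally like this only speeds up the tick-style questions.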

Sarah also hopes to use the diagnostic tool alongside students' self-assessment and class discussion as further ways of establishing her students' future learning needs.
