
Discovering students' prior technology knowledge


Download the Year 7 Diagnostic test (PDF file, 1.6MB)

Planning a technology programme for a new intake of students can be problematic. Teachers often don't know about their students' prior experience and knowledge, and they can't assume what level of technology education the students may, or may not, have had previously. Without this information it is difficult to target the level at which students are working and to identify where they need extra support, and gathering it can be very time-consuming.

Dilworth School, an independent boys' school run by a trust board, takes its students from all around the North Island. This huge catchment means that each new student usually comes from a different primary school. Teacher Sarah Blenkiron points out that she couldn't realistically gather data from all these schools about students' experience in technology. This meant she could only identify their experience and knowledge later in the term and then modify her programme to better reflect the level at which they were capable of working.

Students in the junior school start at year 5 or year 7. About two-thirds of all year 7 students are new to the school, so with both intakes Sarah has to determine their understanding of technology and the level at which they are working. Sarah found anecdotally that a lot of year 7 students coming into the school "don't have the faintest idea of what technology is".

Creating a diagnostic tool

When Sarah was part of a Resource development and facilitation: Years 7-10 technology project, she decided to create a diagnostic tool that would help her determine the level of understanding in her three year 7 classes. This would be in the form of a questionnaire for all year 7 students, regardless of whether they were new or not.

Sarah started drafting the tool mid-year and wanted to focus on the strands and their components. She worked through several drafts and trialled different versions before she had it ready for her year 7 students.

The diagnostic tool was planned for a one-hour lesson and looked at all eight components of the three strands. Most questions were directly linked to the components, at curriculum levels 1 and 2. A few questions were based around level 3 or level 4 concepts (such as the manipulation of materials within the technological products component). Other questions were linked to projects that the students would be working on later that term.

The accepted technology terminology was used even though some students wouldn't be aware of the meaning. Sarah noted that this could show what they'd been taught, if anything, or whether they could work out a viable answer.

The questionnaire put slightly more emphasis on the technological practice strand. Sarah introduced this by using an existing product with which the students were familiar (a pen tidy). The questions in this section looked at both the overall process of developing a product and some of the different aspects involved in it. Even the boys new to technology could attempt some of this, although they might not be able to answer the questions using specific technology vocabulary such as "stakeholder" and "modelling".

Sarah wanted the diagnostic tool to be more than a pen-and-paper test in order to retain student interest. She incorporated a variety of activities including multiple choice, matching descriptions, rearranging pictures, one-word answers, full-sentence answers, and drawing diagrams.


Sarah considered how she could most effectively present and run the diagnostic tool. She decided to give the pre-test (so-called because Dilworth students are familiar with this term) in the first technology lesson. She took the test to the students' classroom so that items in the technology workroom wouldn't influence the students' responses.

Technological outcomes from the street perspective.

She brought a range of resources into the room for the students to look at, touch, or take to their desks for a closer look. Some, for example, took the opportunity to disassemble a torch to examine it. Some questions related to photographs. Sarah says she found a New Zealand street scene image containing lots of different items for the students to classify as natural or technological outcomes. She also had larger colour copies of all the photos in the diagnostic tool for the students to look at.

The students were reassured that the pre-test wasn't an exam but a tool for programme planning, with each question linked to a different strand of the curriculum. Sarah explained that she wouldn't tell them what the terminology meant, and that if they hadn't been taught something or didn't know an answer they could just say they didn't know or leave it blank.

Questions were allowed, but anyone asking something like "What is a technological outcome?" was told that not knowing was fine and to put that down. As in every class, there were students who finished early and others who ran out of time. Sarah encouraged the latter to leave questions they didn't understand and move on to others, while those who had finished worked on a research task related to their first project.



Sarah found the results useful for refining her programme. She noted that it gave her a good understanding of the level at which the boys were working and where they needed more help to get to the next level. In some questions, some students weren't at level 1, although this varied depending on the question. For example, most got the question "Give examples of the street scene that are technological outcomes" correct. Sarah says this "showed me that even if they didn't understand the terminology they could figure it out and make a really good stab at it, so that was quite encouraging as well".

What is technology?

Test answers were, as expected, quite varied. The question "What is technology?" led to replies such as "It's a subject at school", "An electrical device that will help someone", and "It's something that makes things easier for the human race". The last answer showed that some students did have a developing idea of technology being intervention by design.

Technology system of a bicycle.

Technological systems

To establish whether the students had any understanding of technological systems, Sarah specifically chose a context that wasn't electronic – a bicycle. Some questions were to find out "Do they know it or don't they?" and only required a tick. The transformation questions required more understanding and Sarah reports that a lot of students struggled with this part.

She added a question on motion because it linked to a project they would undertake in year 8.

The technological products section included a question on ergonomics so that Sarah could find out, before the first year 7 project, if students had any understanding of the concept.

Technological practice

Some students couldn't answer all the technological practice questions because they didn't know the terminology. All students attempted question 24 in which they had to design an improved version of a pen tidy. This provided Sarah with information about their drawing skills. Overall responses in this section ranged from levels 1 to 3.


The final section on prior knowledge gave Sarah a good insight into what contexts and materials the students hadn't experienced. Many of them had used wood, metal, and plastics, but few had worked with fabrics, mechanisms, or food. Junior students don't do food technology due to the lack of facilities but will do so in year 9, so Sarah included it to get an idea of their experience.


Consistent wording

The drafting and trialling of the diagnostic tool was an essential part of the process. Sarah encourages teachers developing their own diagnostic tool to be aware that the tool will develop and change. After trialling, Sarah modified some questions because the students' answers showed that what she thought was obvious could be misinterpreted. This was particularly evident in question 24. In the trial tool she referred to a "desk tidy", and despite having seen the photograph of a pen tidy on the previous page, some students sketched images of a table top instead. Sarah realised that she needed to keep her wording consistent, so she changed this question to "pen tidy" as well.

In one of the questions students had to identify functional and physical features of an umbrella but were confused by the option "white fabric". Sarah has since split this into two separate words so that students can classify "white" as a physical feature and "fabric" as a physical/functional feature.

The systematic functions of a torch.

Multiple answers showing knowledge

Sarah had a clear idea of the answer she wanted from the technological systems question, in which students had to put pictures in order to show how a torch works. Some students decided the batteries were the first part of the system while others put the switch.

As the question could be interpreted either way (the physical order of the parts or the order in which they affect each other) to create a logical answer, Sarah decided both were correct.

Breaking questions into parts

Plastic materials used to make a toothbrush.

In the technological products section, Sarah asked the students to identify the performance properties of a toothbrush. She soon realised her mistake: she had meant the performance properties of the plastic as a material, not of the toothbrush itself. The performance properties of materials are different from, but closely linked to, the attributes of a product.

To fix this she split the question into two parts: identify the performance properties of plastic and then identify the appropriate properties for plastic within a toothbrush.

The next step

Sarah says she underestimated how long it would take to analyse such a huge document. By mid-year she hadn't had time to fully complete her analysis. Some answers could be easily tallied with a tick, but others required more consideration of whether a student had fully or partly understood a particular concept.

Sarah looked at how she might use the diagnostic tool in a different way to get the same results. She planned to split it into sections, each with a focus component. Sarah could then give a ten-minute pre-test on just one or two sections in the days or weeks leading up to when she needs that information. If, for example, her programme of learning has technological systems as a focus for Term 3, she could do a pre-test of questions 13-20 at the end of Term 2.

Sarah also hopes to use the diagnostic tool alongside students' self-assessment and class discussion as other ways of establishing her students' future learning needs.
