Te Kete Ipurangi
Ministry of Education

Ask an expert


There are 141 results.

  • Question

    AS 91071 Implement basic procedures to produce a specified digital outcome. The students have created an information outcome in which an Access database is merged in a mail merge. I was wondering what needs to be submitted for moderation purposes? I have the database and the merged document. Is it necessary to have notes and the letter prior to merging? Is it enough to submit the finished database and the merged letter, in this case a pool party invite?


    A copy of the specifications that were to be met and the techniques that were to be applied should be submitted.
    Generally, copies of the students’ outcomes (that is, copies that show the database merged into the mail merge) will allow NZQA to see that appropriate software applications and features were selected, and that file management procedures, design elements, and formatting techniques were applied.

    There may need to be some documentation (from the student or confirmed by the assessor) to show that data integrity and testing procedures were applied to ensure the outcome met specifications, and that legal, ethical, and moral responsibilities were followed.

    For NZQA to be able to validate Merit or Excellence, the assessor will generally need to show how they determined students demonstrated the required accuracy, independence (Merit) and economy of resources (Excellence). Individual and annotated TKI assessment schedules are often used to good effect by assessors.

    If submitting the work in a digital format, provide a digital copy of the database and the merged letter along with the supporting information outlined above.

    For this standard, printouts are suitable but need to include the database and the merged documents along with the other information outlined above.

    Digital submissions can be on CD, USB, or online, and are encouraged.

    Further information is available at NZQA, Preparing digital submissions for moderation.
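    For reference, the kind of outcome being discussed here – database records merged into a letter template – can be sketched in a few lines. This is purely illustrative (hypothetical field names, and Python rather than Access/Word), not something the standard requires:

    ```python
    # Illustrative sketch of a mail merge: each database record fills
    # placeholders in a letter template. Field names are hypothetical.
    from string import Template

    template = Template("Dear $name,\nYou are invited to a pool party on $date!")

    # Records as they might come from the students' database
    records = [
        {"name": "Aroha", "date": "12 March"},
        {"name": "Ben",   "date": "12 March"},
    ]

    # The "merged documents" are one filled-in letter per record
    letters = [template.substitute(record) for record in records]
    print(letters[0])
    ```

    For moderation, both sides of this process would be submitted: the data source (the database) and the merged letters it produced.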

  • Question

    For the Digital Technologies 1.45, 2.45 planning for a program assessments, I have a couple of queries regarding the part of the standard that involves specifying test cases.

    1) In 2.45 the standard says: "specifying a set of expected input cases for testing the program" (and then M/E equivalents). So it says specifying "a" set of expected input cas-"es", and I know usually if something is plural that means at least two. So my question is: in this situation, do they need to specify multiple values for each input, or could they specify one expected value (or one of each type for M/E) for each input, and that would be a set containing multiple values? For example, in a quiz, for Achieved, do they need to specify two values for each input like name, Q1, Q2, etc., or is one value for every input "a set" of input cases?

    2) For the .45s, are they supposed to actually run through their plan (pseudocode/flowchart) with these testing values and record "actual results" for what their program *plan* does, or is this supposed to be just a testing plan with intended values, which would then be implemented in 1.46/2.46 etc. with actual results recorded when they have been entered into the program? If the latter, could students use the same table for 1.45/2.45 and 1.46/2.46 but just add an "actual result" column, or should they redo the table and record it more as a log?

    3) Compared to the 1.45 wording ("specifying a set of test cases with expected inputs for testing the program"), is there actually a difference/step up in this part of the standard between levels 1 and 2? The wording sounds almost identical to my untrained ear, but if it is the same thing, I'm not sure why it would be written differently at all.

    3a) If you give students a testing template, e.g. a table pre-headed with "Description of input", "Intended values to enter", and "Expected result", and they fill it in themselves (at A/M/E level depending on values), would that prevent them from being able to get M/E (at any of levels 1–3) because you've provided a template?

    4) Lastly, one query about how this all ties in to 1.46/2.46 etc.: when the students carry out the actual testing of their program, are they required to record it all as a log with specific values written down, or could they simply run through their code using expected, then boundary, then invalid values, and do something like take screenshots or screencast (video) their run-throughs? I ask because we do a quiz in Python at level 1, and when they test with a table of each input (name, q1, q2, etc.), with a row for each input (name, questions 1–10, playing again, etc.) and six values for each (two expected, two boundary, two invalid), and assuming some have errors and need fixing/retesting, the tables sometimes run to four or more pages from my Excellence students, while the Achieved students get lost documenting each question with multiple values.

    I realise this is long, apologies for that, but I have heard so many different things from the people I have asked, and while that flexibility is one of the advantages of NCEA, it also makes it very tricky to figure out what is required and what is sugar on top. I want to be very clear on the minimum requirements to help get the students at the bottom through to Achieved more easily. Thanks in anticipation!


    1. At least two values for each question (one correct, one incorrect). For example, for Q1 "2 + 3 = ?":
       Case 1: input a = 5, output = "correct"
       Case 2: input a = 6, output = "incorrect"

    2. It is intended that students use the table of values/test cases from 1.45/2.45 for the program testing in 1.46/2.46 and add the actual results (a Word or Excel file is the norm).

    3. The wording is very similar; it appears identical wording was used at both levels.

    3a. At Levels 1–2, a template is fine to use (Dream Pizza Program 2.46 provides sample test cases). At Level 3, I would say a template should not be allowed.

    4. A testing log (e.g. an Excel table of all inputs tested and their outcomes) provides more secure evidence of expected, boundary, and invalid inputs.

    Screenshots are also useful evidence of student documentation for moderation purposes and for illustrating successful student outcomes.
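    As a minimal illustration (the marking function and values are hypothetical, not prescribed by the standard), the combined 1.45/2.45 planning table and 1.46/2.46 testing log described above might look like this for one quiz question:

    ```python
    # Sketch of a test-case table (planning columns) extended into a testing
    # log by adding an "actual result" column. Names and values are made up.

    def mark_answer(answer: str) -> str:
        """Hypothetical marking logic for Q1: 2 + 3 = ?"""
        try:
            value = int(answer)
        except ValueError:
            return "invalid"  # non-numeric input is rejected
        return "correct" if value == 5 else "incorrect"

    # Planning columns (1.45/2.45): description, intended value, expected result
    test_cases = [
        ("expected: right answer", "5", "correct"),
        ("expected: wrong answer", "6", "incorrect"),
        ("invalid: non-numeric",   "five", "invalid"),
    ]

    # Testing log (1.46/2.46): the same rows with actual results recorded
    log = []
    for description, value, expected in test_cases:
        actual = mark_answer(value)
        log.append((description, value, expected, actual,
                    "pass" if actual == expected else "fail"))

    for row in log:
        print(row)
    ```

    The same rows serve both standards: the first three columns are the plan, and the last two are filled in once the program is run.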

  • Question

    2.21 AS91345 Garments with special features. Would "tailored collars and cuffs" count as two special features, or are they counted as one? That is, for the achievement standard, would they be listed as (1) collar, (2) cuffs; or as (1) collar and cuffs, (2) set-in sleeves?


    Collars and cuffs are considered two separate special features. Your first option is correct.

  • Question

    Can you please clarify the intent of AS 91069 and the term "student generated". In particular, note 4, which states: "The organised body of work being promoted must be student-generated in response to a design brief and may also include design work sourced through research." We assume the intent of the standard is the way the students have presented the work and how they have created a composition, not the content being presented – i.e. this does not have to be their own work; it could be another designer's work, as shown in the exemplars online. Are we correct in saying that the intent is on "layout, composition and visual impact" for AS91069, whereas "modes and media" come more into the mix for AS91343?


    The intent of AS 91069 is that students will present aspects of their design ideas/work. This could come from the design ideas or final solution, or from the research that they have done – that is, from anywhere in the design pathway.

    The exemplars are still the students’ own work; they are part of the students’ design portfolio/process, not another designer’s work. The students who created these samples put together the presentations from images and component pieces.

  • Question

    For the Level 2 Planning standard (As 91355) the outcome our students are heading towards is the creation of a multipage website. In the standard it says "Planning tools may include but are not limited to: brainstorms, mind-maps, idea banks, reflective journals and scrapbooks, plans of action, Gantt charts, flow diagrams, graphical organisers, and spreadsheets and databases." Would students using a layout diagram of their website (i.e. what each page will look like / conceptual type diagram) also count as a planning tool? The diagram would show the placement of elements of the webpage such as header, column, images, text, footer, etc. This is what a web designer would do to help plan a website.


    AS 91355 requires students to: "Use selected planning tools to set achievable goals, establishing resources required and determining critical review points." (Explanatory Note 2)

    The layout diagram described is generally used as a functional modelling tool to present designs and, in some instances, to elicit feedback from key stakeholders. If it were to be considered a planning tool, it would need to be seen to be aiding goal setting, and/or the establishment of resources required, and/or the determining of critical review points.

  • Question

    Achievement Standard 91617 (external) Can you clarify the difference between "appraise the design of a tech outcome using contemporary design judgement criteria" and "evaluate the quality of the design of a tech outcome using design judgement criteria", as found in the L3 Common Assessment Guide, Candidate guidance for producing the report. My thesaurus gives evaluate as having a similar meaning to appraise.


    The interpretation of these words is also related to the phrases that follow them in the standard criteria.

    Appraising the design of a technological outcome using design judgement criteria: For this criterion at achieved, an appraisal is considered an opinion related to the design judgement criteria.

    Evaluating the quality of the design of a technological outcome using design judgement criteria: For this criterion at merit, evaluating is considered to be the application of the design judgement criteria to produce a critique that is more "objective", "testable", and possibly enumerated.

    The judgement being made by an assessor is that the quality of the critique has improved as the critique has moved beyond informed opinion. The critique has more depth as a better understanding of "design judgement criteria" and/or a better application of the design judgement criteria inform it.

    Additional information is available in the NZQA 2013 Assessment report for AS 91617 Undertake a critique of a technological outcome's design. 

  • Question

    Does the prototype actually have to "work"? Would it be "fit for purpose" if it didn't ?


    A prototype is a finished outcome ready to be trialled in situ. So the intent should be that it will work. It would have to work "well enough" to be trialled.

    At level 2/3 NCEA, students undertake prototyping to gain evidence of fitness for purpose. The student is required to explain any decisions to accept or modify the prototype.

    The approach to prototyping does differ between industries. For example, in the car industry prototyping occurs quickly (as they need the prototype to test), and there will generally be many subsequent modifications.

    In schools, there are time, budget, and other constraints when prototyping. A student might make the prototype to the best of their understanding of how it should be, and then undertake prototyping. They may then find some aspect doesn't work or isn't quite right as they make a judgement against the brief. The students need to explain any decisions to accept and/or modify the prototype. Responses could include:

    • Although this aspect is not quite right, it is acceptable as is because …
    • While I am not doing it now for time/budget ... reasons, the prototype would need to be modified by doing ... because … (and so on).
    • The student goes on to implement and show the modification.

    It is expected that the student who carefully works through the development stage will more than likely make a prototype that does work. That is because they have undertaken the evaluating, trialling, selecting, and so on of materials, components, tools, equipment, and practical techniques and processes as required by the standard. 

  • Question

    Can you give examples for Tech systems L5 in an electronics context please. I find this really confusing – "Identify subsystems within technological systems and explain their transformation and connective properties".


    In an electronics context, all electronic circuits can be subdivided into subcircuits called "subsystems". A subsystem is a circuit that has a defined input, transformation, and output – that is, a subsystem has a single defined function (for example, voltage transformation, motor driving, sensor input, or amplification).

    Subsystems are connected to each other to build up a full electronic system (or "environment") by designing an interface or link. The interface has to be designed and fine-tuned to allow the subsystems to transfer their output into each other’s input effectively and efficiently. Historically in electronics, subsystems and interfaces are hardware-based, although now with the dominance of embedded software (code programmed into a microcontroller), interfaces can be software-based as well as hardware-based (for example, the RS232 protocol link).
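    The idea can be sketched in software terms as well. In this illustrative Python sketch (all names, thresholds, and gains are invented, not from any standard), each function is a subsystem with a defined input, transformation, and output, and the "interface" is simply passing one subsystem's output into the next one's input:

    ```python
    # Three hypothetical subsystems chained via their inputs and outputs.

    def sensor_input(light_level: float) -> float:
        """Subsystem 1: transform a raw 0-1 light reading into a 0-5 V signal."""
        return max(0.0, min(5.0, light_level * 5.0))

    def amplifier(voltage: float) -> float:
        """Subsystem 2: amplify the interfaced voltage by a fixed gain."""
        GAIN = 2.0
        return voltage * GAIN

    def motor_driver(voltage: float) -> str:
        """Subsystem 3: drive the motor when the amplified voltage crosses 4 V."""
        return "motor on" if voltage > 4.0 else "motor off"

    # The interface: each subsystem's output becomes the next one's input.
    signal = sensor_input(0.5)   # 2.5 V
    signal = amplifier(signal)   # 5.0 V
    print(motor_driver(signal))  # prints "motor on"
    ```

    If the interfacing is wrong – say, the amplifier's output range does not match what the motor driver expects – the full system fails even though each subsystem works in isolation, which is exactly the fine-tuning point made above.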

    A simple example from biology is the human body, which is a biological system. The subsystems can be thought of as the respiratory, circulatory, excretory, and so on subsystems. All the subsystems of the body have to interface effectively and efficiently for the body to work.

    Another example is in the subsystems of the automobile. If you don’t interface these correctly with each other you will have a car that will definitely need to go in for a "tuneup" (a tuneup is an interfacing process).

    Some teachers find that starting with familiar examples like these is a helpful approach to discussing the concepts of subsystems and interfacing. This demonstrates that the ideas are universal and not just "owned" by electronics and technology.

  • Question

    I have a question regarding AS91053. When it comes to this design element standard, the explanatory notes say: "Design elements may include but are not limited to: line, balance, shape, colour, symmetry, strength, contrast, durability, alignment." I appreciate that it says "may include but are not limited to". My students have been studying video game design, so my question is: if we are looking at the design elements of video games, do we still need to consider the traditional CRAP and SCABS of design? Or can we look at those aspects that relate to gaming? Or do we have to do both?

    http://wps.pearsoncustom.com/wps/media/objects/8771/8981685/SG140_Ch01.pdf

    The link given outlines the required elements for game design. They go something like this:

    1. Play: Games arise from the human desire for play and from our capacity to pretend. Play is a wide category of nonessential, and usually recreational, human activities that are often socially significant as well. Pretending is the mental ability to establish a notional reality that the pretender knows is different from the real world and that the pretender can create, abandon, or change at will. Playing and pretending are essential elements of playing games. Both have been studied extensively as cultural and psychological phenomena.
    2. Pretending (see above).
    3. A goal (objective): What do you want your player to achieve? What do you want them to feel or experience? Do you want them to relax, to have their breath taken away, to feel emotional or invincible, to rock the world, to kill everything that moves? How long will people play for – hours or a quick fix? These decisions must be made to help pick the genre of your game as well as your target audience.
    4. Rules: Generally speaking, players expect that the game will have rules that are fair. However, sometimes players may choose to change the rules. (See Changing the Rules, page 10.)
    5. Gameplay: The challenges that a player must face to arrive at the objective of the game, and the actions that the player is permitted to take to address those challenges.
    6. Symmetry and asymmetry.
    7. Competition and cooperation.

    Also, in one of the marking exemplars on TKI it states: "The report describes experiences you would expect to come from a course of instruction derived from the technology learning area in the NZC: a critique of existing products embedded within a student's tech practice; testing and trialling within a modelling process; developing a conceptual design; development and/or refinement of a brief, material selection, and/or construction techniques used; development of a one-off solution or prototype; an evaluation of a student's one-off solution or prototype."

    I am not sure how developing a conceptual design, the modelling process, evaluation of a one-off solution, and so on are included in this standard about demonstrating an understanding of design elements. Or are these mentioned as means through which the student might have acquired knowledge of the required design elements?


    Students are required at achieved to describe the elements that underpin design within a specified context and describe considerations used to determine the quality of a design within a specified context.

    Explanatory note 3 of the standard outlines the definitions of subjective and objective aspects to determine the quality of design. See:

    • Considerations used to determine the quality of a design include subjective and objective aspects.
    • Subjective aspects are those that are based on personal, cultural, and sociological factors (e.g. preference, style, fashion, taste, identity, image, perception).
    • Objective aspects are those that can be established in a quantifiable sense (e.g. ergonomics, anthropometrics, purpose, operation, cost, production) and which are based on physical conditions.

    The elements outlined for game design generally fit into the definition of subjective aspects used to measure the quality of design in the context of gaming. 

    Students are encouraged to demonstrate understanding based on their own technological experiences. Developing a conceptual design, the modelling process, and evaluating a one-off solution may be part of a student’s technological experiences. Technological experiences can also include visiting speakers, a site visit, research of existing products, and knowledge gained from their whānau/family and wider community contacts. Students can demonstrate understanding of design elements and the quality of design by applying or discarding this knowledge within their own technological practice.

  • Question

    What are some examples of cultural appropriateness of trialling processes in a textiles context?


    Trialling in textiles includes the fitting and modelling of garments or textile outcomes. Examples of cultural appropriateness of trialling processes, when addressing the concept of fitness for purpose in its broadest sense, could include:

    • demonstrating a sensitivity to privacy, modesty, and personal space when carrying out fittings in a classroom setting
    • working in timeframes for fittings that suit the wearer
    • not requiring excessive fittings
    • adhering to protocols that may derive from religious beliefs regarding modesty and body exposure 
    • adhering to cultural protocols about the use of classroom fittings (for example, sitting on tables is unacceptable in Māori culture).

    The boundaries between personal, cultural, religious, and ethical beliefs and practices are not always clear, and examples will overlap.
